Subclinical Atrial Fibrillation and the Risk of Stroke
Authors: Jeff S. Healey / Stuart J. Connolly / Michael R. Gold / Carsten W. Israel / Isabelle C. Van Gelder / Alessandro Capucci / C. P. Lau / Eric Fain / Sean Yang / Christophe Bailleul / Carlos A. Morillo / Mark Carlson / Ellison Themeles / Elizabeth S. Kaufman / Stefan H. Hohnloser
One quarter of strokes are of unknown cause, and subclinical atrial fibrillation may be a common etiologic factor. Pacemakers can detect subclinical episodes of rapid atrial rate, which correlate with electrocardiographically documented atrial fibrillation. We evaluated whether subclinical episodes of rapid atrial rate detected by implanted devices were associated with an increased risk of ischemic stroke in patients who did not have other evidence of atrial fibrillation.

Methods: We enrolled 2580 patients, 65 years of age or older, with hypertension and no history of atrial fibrillation, in whom a pacemaker or defibrillator had recently been implanted. We monitored the patients for 3 months to detect subclinical atrial tachyarrhythmias (episodes of atrial rate >190 beats per minute for more than 6 minutes) and followed them for a mean of 2.5 years for the primary outcome of ischemic stroke or systemic embolism. Patients with pacemakers were randomly assigned to receive or not to receive continuous atrial overdrive pacing.

Results: By 3 months, subclinical atrial tachyarrhythmias detected by implanted devices had occurred in 261 patients (10.1%). Subclinical atrial tachyarrhythmias were associated with an increased risk of clinical atrial fibrillation (hazard ratio, 5.56; 95% confidence interval [CI], 3.78 to 8.17; P<0.001) and of ischemic stroke or systemic embolism (hazard ratio, 2.49; 95% CI, 1.28 to 4.85; P=0.007). Of 51 patients who had a primary outcome event, 11 had had subclinical atrial tachyarrhythmias detected by 3 months, and none had had clinical atrial fibrillation by 3 months. The population attributable risk of stroke or systemic embolism associated with subclinical atrial tachyarrhythmias was 13%. Subclinical atrial tachyarrhythmias remained predictive of the primary outcome after adjustment for predictors of stroke (hazard ratio, 2.50; 95% CI, 1.28 to 4.89; P=0.008). Continuous atrial overdrive pacing did not prevent atrial fibrillation.

Conclusions: Subclinical atrial tachyarrhythmias, without clinical atrial fibrillation, occurred frequently in patients with pacemakers and were associated with a significantly increased risk of ischemic stroke or systemic embolism. (Funded by St. Jude Medical; ASSERT ClinicalTrials.gov number, NCT00256152.)
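The reported 13% population attributable risk can be roughly reproduced from two numbers in the abstract, the 10.1% three-month prevalence of subclinical atrial tachyarrhythmias and the adjusted hazard ratio of 2.50, using Levin's formula. This is an illustrative back-of-the-envelope check (with the hazard ratio standing in for the relative risk), not a calculation the authors describe:

\[
\mathrm{PAR} \;=\; \frac{p_e\,(\mathrm{HR}-1)}{1 + p_e\,(\mathrm{HR}-1)}
\;=\; \frac{0.101 \times (2.50 - 1)}{1 + 0.101 \times (2.50 - 1)}
\;\approx\; 0.13 .
\]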
Expanding consensus in portal hypertension Report of the Baveno VI Consensus Workshop: Stratifying risk and individualizing care for portal hypertension
Authors: Roberto de Franchis
Portal hypertension is the haemodynamic abnormality associated with the most severe complications of cirrhosis, including ascites, hepatic encephalopathy and bleeding from gastroesophageal varices. Variceal bleeding is a medical emergency associated with a mortality that, in spite of recent progress, is still in the order of 10–20% at 6 weeks. The evaluation of diagnostic tools and the design and conduct of good clinical trials for the treatment of portal hypertension have always been difficult. Awareness of these difficulties has led to the organisation of a series of consensus meetings. The first one was organised by Andrew Burroughs in Groningen, the Netherlands in 1986 [1]. After Groningen, other meetings followed: in Baveno, Italy in 1990 (Baveno I) [2] and in 1995 (Baveno II) [3,4], in Milan, Italy in 1992 [5], in Reston, U.S.A. in 1996 [6], in Stresa, Italy in 2000 (Baveno III) [7,8], again in Baveno in 2005 (Baveno IV) [9,10], in Atlanta in 2007 [11], and again in Stresa in 2010 (Baveno V) [12,13]. The aims of these meetings were to develop definitions of key events in portal hypertension and variceal bleeding, to review the existing evidence on the natural history, the diagnosis and the therapeutic modalities of portal hypertension, and to issue evidence-based recommendations for the conduct of clinical trials and the management of patients. All these meetings were successful and produced consensus statements on some important points, although several issues remained unsettled. To continue the work of the previous meetings, a Baveno VI workshop was held on April 10–11, 2015. The workshop was attended by many of the experts responsible for most of the major achievements of the last years in this field. Many of them had attended the previous meetings as well. A concept that has gained wide acceptance over the past few years is that patients in different stages of cirrhosis have different risks of developing complications and of dying. Accordingly, the Baveno VI workshop was entitled "Stratifying risk and individualizing care for portal hypertension".
A new Definition of Fractional Derivative without Singular Kernel
Authors: Michele Caputo / Mauro Fabrizio
In this paper, we present a new definition of fractional derivative with a smooth kernel which takes on two different representations for the temporal and spatial variables. The first works on the time variable; thus it is suitable for use with the Laplace transform. The second definition is related to the spatial variables, by a non-local fractional derivative, for which it is more convenient to work with the Fourier transform. The interest in this new approach with a regular kernel arose from the prospect that there is a class of non-local systems, which have the ability to describe the material heterogeneities and the fluctuations of different scales, which cannot be well described by classical local theories or by fractional models with a singular kernel.
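For readers skimming this entry, the time-variable definition introduced in the paper is usually quoted in the following form. This is a sketch of the standard Caputo-Fabrizio expression; the normalization function M(α) and the lower limit a are as commonly cited rather than taken from this abstract:

\[
\mathcal{D}_t^{\alpha} f(t) \;=\; \frac{M(\alpha)}{1-\alpha} \int_a^t f'(\tau)\,
\exp\!\left[-\frac{\alpha\,(t-\tau)}{1-\alpha}\right] \mathrm{d}\tau,
\qquad 0 < \alpha < 1,
\]

where M(α) is a normalization function with M(0) = M(1) = 1. The exponential kernel is non-singular, in contrast with the kernel $(t-\tau)^{-\alpha}$ of the classical Caputo derivative.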
Classification criteria for Sjögren's syndrome: a revised version of the European criteria proposed by the American-European Consensus Group
Authors: C Vitali / S Bombardieri / R Jonsson / H M Moutsopoulos / E L Alexander / S E Carsons / T E Daniels / P C Fox / R I Fox / S S Kassan / S R Pillemer / N Talal / M H Weisman
Classification criteria for Sjogren's syndrome (SS) were developed and validated between 1989 and 1996 by the European Study Group on Classification Criteria for SS, and broadly accepted. These have been re-examined by consensus group members, who have introduced some modifications, more clearly defined the rules for classifying patients with primary or secondary SS, and provided more precise exclusion criteria.
Cost-effective outbreak detection in networks
Authors: Jure Leskovec / Andreas Krause / Carlos Guestrin / Christos Faloutsos / Jeanne VanBriesen / Natalie Glance
Given a water distribution network, where should we place sensors to quickly detect contaminants? Or, which blogs should we read to avoid missing important stories? These seemingly different problems share common structure: outbreak detection can be modeled as selecting nodes (sensor locations, blogs) in a network in order to detect the spreading of a virus or information as quickly as possible. We present a general methodology for near-optimal sensor placement in these and related problems. We demonstrate that many realistic outbreak detection objectives (e.g., detection likelihood, population affected) exhibit the property of "submodularity". We exploit submodularity to develop an efficient algorithm that scales to large problems, achieving near-optimal placements while being 700 times faster than a simple greedy algorithm. We also derive online bounds on the quality of the placements obtained by any algorithm. Our algorithms and bounds also handle cases where nodes (sensor locations, blogs) have different costs. We evaluate our approach on several large real-world problems, including a model of a water distribution network from the EPA, and real blog data. The obtained sensor placements are provably near optimal, providing a constant fraction of the optimal solution. We show that the approach scales, achieving speedups and savings in storage of several orders of magnitude. We also show how the approach leads to deeper insights in both applications, answering multicriteria trade-off, cost-sensitivity and generalization questions.
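The 700-fold speedup over simple greedy comes from exploiting submodularity to evaluate marginal gains lazily (the CELF idea). Below is a minimal Python sketch of that lazy-greedy loop under stated assumptions: the `coverage` objective and the toy `universe` are hypothetical placeholders, not the paper's water-network or blog objectives.

import heapq
import itertools

def lazy_greedy(candidates, gain, budget):
    """Pick up to `budget` items that approximately maximize a monotone
    submodular set function `gain`, using lazy re-evaluation of marginal
    gains: stale upper bounds in the heap are re-checked only when needed."""
    selected, current_value = [], 0.0
    tie = itertools.count()  # tie-breaker so the heap never compares items directly
    # Max-heap of (negated marginal gain, tie, item, round in which the gain was computed).
    heap = [(-gain({c}), next(tie), c, 0) for c in candidates]
    heapq.heapify(heap)

    for round_no in range(1, budget + 1):
        while heap:
            neg_delta, _, item, computed_at = heapq.heappop(heap)
            if computed_at == round_no:
                # Fresh bound: by submodularity this is the best remaining item.
                selected.append(item)
                current_value += -neg_delta
                break
            # Stale bound: recompute the marginal gain and push it back.
            delta = gain(set(selected) | {item}) - current_value
            heapq.heappush(heap, (-delta, next(tie), item, round_no))
    return selected, current_value

# Toy usage with a coverage-style objective (hypothetical data, not the paper's).
universe = {"s1": {1, 2, 3}, "s2": {3, 4}, "s3": {4, 5, 6}}
coverage = lambda chosen: float(len(set().union(*(universe[c] for c in chosen))))
print(lazy_greedy(universe.keys(), coverage, budget=2))  # -> (['s1', 's3'], 6.0)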
APACHE II: a severity of disease classification system.
Authors: William A. Knaus / Elizabeth A. Draper / Douglas P. Wagner / Jack E. Zimmerman
This paper presents the form and validation results of APACHE II, a severity of disease classification system. APACHE II uses a point score based upon initial values of 12 routine physiologic measurements, age, and previous health status to provide a general measure of severity of disease. An increasing score (range 0 to 71) was closely correlated with the subsequent risk of hospital death for 5815 intensive care admissions from 13 hospitals. This relationship was also found for many common diseases. When APACHE II scores are combined with an accurate description of disease, they can prognostically stratify acutely ill patients and assist investigators comparing the success of new or differing forms of therapy. This scoring index can be used to evaluate the use of hospital resources and compare the efficacy of intensive care in different hospitals or over time.
A Survey on Human Activity Recognition using Wearable Sensors
Authors: Oscar D. Lara / Miguel A. Labrador
Providing accurate and opportune information on people's activities and behaviors is one of the most important tasks in pervasive computing. Innumerable applications can be envisioned, for instance, in medical, security, entertainment, and tactical scenarios. Despite human activity recognition (HAR) being an active field for more than a decade, there are still key aspects that, if addressed, would constitute a significant turn in the way people interact with mobile devices. This paper surveys the state of the art in HAR based on wearable sensors. A general architecture is first presented along with a description of the main components of any HAR system. We also propose a two-level taxonomy according to the learning approach (either supervised or semi-supervised) and the response time (either offline or online). Then, the principal issues and challenges are discussed, as well as the main solutions to each of them. Twenty-eight systems are qualitatively evaluated in terms of recognition performance, energy consumption, obtrusiveness, and flexibility, among others. Finally, we present some open problems and ideas that, due to their high relevance, should be addressed in future research.
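The general architecture the survey refers to (sensing, segmentation into windows, feature extraction, and supervised classification) can be summarized in a short sketch. The following Python example runs on synthetic tri-axial accelerometer data and assumes NumPy and scikit-learn are available; the window length, features, and classifier are illustrative choices, not recommendations from the survey.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(window):
    """Simple time-domain features per axis: mean, std, min, max."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

def segment(signal, labels, window_size=128, step=64):
    """Slide fixed-size windows over a (T, 3) accelerometer stream."""
    X, y = [], []
    for start in range(0, len(signal) - window_size + 1, step):
        win = signal[start:start + window_size]
        X.append(extract_features(win))
        # Majority label within the window (assumption: one label per sample).
        y.append(np.bincount(labels[start:start + window_size]).argmax())
    return np.array(X), np.array(y)

# Synthetic stand-in for tri-axial accelerometer data (two fake activities).
rng = np.random.default_rng(0)
still = rng.normal(0.0, 0.1, size=(5000, 3))
walking = rng.normal(0.0, 1.0, size=(5000, 3))
signal = np.vstack([still, walking])
labels = np.array([0] * 5000 + [1] * 5000)

X, y = segment(signal, labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))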
Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses
Authors: Richard R. Hake
A survey of pre/post-test data using the Halloun–Hestenes Mechanics Diagnostic test or more recent Force Concept Inventory is reported for 62 introductory physics courses enrolling a total number of students N=6542. A consistent analysis over diverse student populations in high schools, colleges, and universities is obtained if a rough measure of the average effectiveness of a course in promoting conceptual understanding is taken to be the average normalized gain 〈g〉. The latter is defined as the ratio of the actual average gain (%〈post〉−%〈pre〉) to the maximum possible average gain (100−%〈pre〉). Fourteen “traditional” (T) courses (N=2084) which made little or no use of interactive-engagement (IE) methods achieved an average gain 〈g〉T-ave=0.23±0.04 (std dev). In sharp contrast, 48 courses (N=4458) which made substantial use of IE methods achieved an average gain 〈g〉IE-ave=0.48±0.14 (std dev), almost two standard deviations of 〈g〉IE-ave above that of the traditional courses. Results for 30 (N=3259) of the a...
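Written out, the average normalized gain defined in the abstract is as follows, with hypothetical pre/post scores added purely for illustration:

\[
\langle g \rangle \;=\; \frac{\%\langle \mathrm{post} \rangle - \%\langle \mathrm{pre} \rangle}{100 - \%\langle \mathrm{pre} \rangle},
\qquad \text{e.g.}\quad \frac{70 - 40}{100 - 40} \;=\; 0.50 .
\]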
Principles and practice of structural equation modeling, 3rd ed.
Authors: Rex B. Kline
Parton distributions for the LHC
Authors: AD Martin / William James Stirling / Robert S Thorne / G Watt
We present updated leading-order, next-to-leading order and next-to-next-to-leading order parton distribution functions (“MSTW 2008”) determined from global analysis of hard-scattering data within the standard framework of leading-twist fixed-order collinear factorisation in the $\overline{\mathrm{MS}}$ scheme. These parton distributions supersede the previously available “MRST” sets and should be used for the first LHC data taking and for the associated theoretical calculations. New data sets fitted include CCFR/NuTeV dimuon cross sections, which constrain the strange-quark and -antiquark distributions, and Tevatron Run II data on inclusive jet production, the lepton charge asymmetry from W decays and the Z rapidity distribution. Uncertainties are propagated from the experimental errors on the fitted data points using a new dynamic procedure for each eigenvector of the covariance matrix. We discuss the major changes compared to previous MRST fits, briefly compare to parton distributions obtained by other fitting groups, and give predictions for the W and Z total cross sections at the Tevatron and LHC.
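As a concrete illustration of how such sets are consumed downstream, the short Python sketch below evaluates an MSTW 2008 NLO distribution through the LHAPDF interface. The set name "MSTW2008nlo68cl", the chosen kinematic point, and the presence of a local LHAPDF installation are assumptions made for illustration, not details taken from the paper.

# Assumes the LHAPDF library and the "MSTW2008nlo68cl" grid are installed locally.
import lhapdf

pdf = lhapdf.mkPDF("MSTW2008nlo68cl", 0)   # central member of the NLO 68% CL set

x, Q = 1e-3, 100.0                          # momentum fraction and scale in GeV
gluon = pdf.xfxQ(21, x, Q)                  # x * g(x, Q); 21 is the gluon PDG ID
up = pdf.xfxQ(2, x, Q)                      # x * u(x, Q)
print(f"x*g = {gluon:.3f}, x*u = {up:.3f} at x={x}, Q={Q} GeV")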