ResearchPad - heart-rate https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Aging-associated sinus arrest and sick sinus syndrome in adult zebrafish]]> https://www.researchpad.co/article/elastic_article_13853 Because of its powerful genetics, the adult zebrafish has been increasingly used for studying cardiovascular diseases. Considering its heart rate of ~100 beats per minute at ambient temperature, which is very close to that of humans, we assessed the use of this vertebrate animal for modeling heart rhythm disorders such as sinus arrest (SA) and sick sinus syndrome (SSS). We first optimized a protocol to measure electrocardiograms in adult zebrafish: we determined the location of the probes, implemented an open-chest microsurgery procedure, measured the effects of temperature, and determined the appropriate anesthesia dose and timing. We then proposed a PP interval of more than 1.5 seconds as an arbitrary criterion to define an SA episode in an adult fish at ambient temperature, based on a comparison between the current definition of an SA episode in humans and our studies of candidate SA episodes in aged wild-type fish and Tg(SCN5A-D1275N) fish (a fish model of inherited SSS). With this criterion, a subpopulation of about 5% of wild-type fish can be considered to have SA episodes, and this percentage increases significantly to about 25% in 3-year-old fish. In response to atropine, this subpopulation has both common SSS phenotypic traits shared with the Tg(SCN5A-D1275N) model, such as bradycardia, and unique SSS phenotypic traits, such as an increased QRS/P ratio and chronotropic incompetence. In summary, this study defined baseline SA and SSS in adult zebrafish and underscored the use of the zebrafish as an alternative model for studying aging-associated SSS.
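For readers implementing the criterion, the PP-interval rule above can be sketched in a few lines; the function name and example timings below are illustrative, not from the study.

```python
# Illustrative sketch: flag sinus-arrest (SA) episodes from P-wave times using
# the paper's criterion of a PP interval > 1.5 s at ambient temperature.

def find_sa_episodes(p_wave_times_s, threshold_s=1.5):
    """Return (start, end) times of PP intervals exceeding the SA threshold."""
    episodes = []
    for t0, t1 in zip(p_wave_times_s, p_wave_times_s[1:]):
        if t1 - t0 > threshold_s:
            episodes.append((t0, t1))
    return episodes

# A fish heart at ~100 bpm has PP intervals near 0.6 s; the single 2.0-s gap
# in these made-up timestamps is flagged as a candidate SA episode.
p_times = [0.0, 0.6, 1.2, 1.8, 3.8, 4.4]
print(find_sa_episodes(p_times))  # [(1.8, 3.8)]
```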

]]>
<![CDATA[Right ventricular pressure overload directly affects left ventricular torsion mechanics in patients with precapillary pulmonary hypertension]]> https://www.researchpad.co/article/elastic_article_8470 This study examined the impact of septal flattening on left ventricular (LV) torsion in patients with precapillary pulmonary hypertension (PH). Fifty-two patients with proven precapillary PH and 13 healthy controls were included. Ventricular function was assessed using 4D measurements, tissue velocity imaging, and speckle tracking analysis. Increased eccentricity index (1.39 vs. 1.08, p<0.001), systolic pulmonary artery pressure (64 vs. 29 mmHg, p<0.001), and right ventricular Tei index (0.55 vs. 0.28, p = 0.007), and reduced tricuspid annular plane systolic excursion (19.0 vs. 26.5 mm, p<0.001) were detected in PH patients compared with controls. With increasing eccentricity of the left ventricle, LV torsion was both decreased and delayed. Torsion rate paralleled this pattern of change during systole, but not during diastole. In conclusion, right ventricular pressure overload directly affects LV torsion mechanics. The echocardiographic methodology applied provides novel insights into the interrelation of right and left ventricular function.

]]>
<![CDATA[Model based estimation of QT intervals in non-invasive fetal ECG signals]]> https://www.researchpad.co/article/elastic_article_7659 The end timing of T waves in the fetal electrocardiogram (fECG) is important for the evaluation of ST and QT intervals, which are vital markers for assessing cardiac repolarization patterns. Monitoring malignant fetal arrhythmias in utero is fundamental to care in congenital heart anomalies, preventing perinatal death. Currently, reliable detection of the end of T waves is possible only by using fetal scalp ECG (fsECG) and fetal magnetocardiography (fMCG). fMCG is expensive and less accessible, and fsECG is an invasive technique available only during the intrapartum period. Another safer and more affordable alternative is the non-invasive fECG (nfECG), which can provide an assessment similar to fsECG and fMCG but with less accuracy (not beat by beat). Detection of T waves using nfECG is challenging because of their low amplitude and high noise levels. In this study, a novel model-based method that estimates the end of T waves in nfECG signals is proposed. The repolarization phase has been modeled as the discharging phase of a capacitor. To test the model, fECG signals were collected from 58 pregnant women (age 34 ± 6 years) bearing normal and abnormal fetuses at gestational age (GA) 20–41 weeks. QT and QTc intervals were calculated to test the level of agreement between the model-based and reference values (fsECG and Doppler ultrasound (DUS) signals) in normal subjects. The results showed high agreement between model-based and reference values (difference < 5%), which implies that the proposed model could be an alternative method for detecting the end of T waves in nfECG signals.
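To illustrate the capacitor-discharge idea (not the authors' exact algorithm), the T-wave downslope can be modeled as v(t) = v0·exp(−t/τ); fitting τ and choosing a decay cutoff yields a candidate T-end. The 5% cutoff and synthetic data below are assumptions for illustration only.

```python
import numpy as np

# Sketch: fit tau by log-linear least squares on the T-wave downslope, then
# place the T-end where the fitted exponential decays to a chosen fraction
# of the T peak. Cutoff and data are illustrative, not the authors' values.

def t_end_from_decay(t, v, cutoff=0.05):
    """t, v: samples of the downslope (t = 0 at the T peak, v > 0)."""
    tau = -1.0 / np.polyfit(t, np.log(v), 1)[0]   # slope of ln v(t) is -1/tau
    return tau * np.log(1.0 / cutoff)             # time where v/v0 == cutoff

# Synthetic noiseless downslope with tau = 0.08 s: T-end = 0.08*ln(20) s.
t = np.linspace(0.0, 0.2, 50)
v = np.exp(-t / 0.08)
print(round(t_end_from_decay(t, v), 3))  # 0.24
```

QTc in the abstract presumably refers to a heart-rate-corrected QT (e.g. Bazett's QT/√RR), though the correction formula used is not stated there.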

]]>
<![CDATA[Role of intraoperative oliguria in risk stratification for postoperative acute kidney injury in patients undergoing colorectal surgery with an enhanced recovery protocol: A propensity score matching analysis]]> https://www.researchpad.co/article/N90678846-11a4-456d-84dc-7e3677d2f27e

Background

The enhanced recovery after surgery (ERAS) protocol for colorectal cancer resection recommends balanced perioperative fluid therapy. According to recent guidelines, zero-balance fluid therapy is recommended in low-risk patients, and immediate correction of low urine output during surgery is discouraged. However, several reports have indicated an association of intraoperative oliguria with postoperative acute kidney injury (AKI). We investigated the impact of intraoperative oliguria in the colorectal ERAS setting on the incidence of postoperative AKI.

Patients and methods

From January 2017 to August 2019, a total of 453 patients underwent laparoscopic colorectal cancer resection with the ERAS protocol. Among them, 125 patients met the criteria for oliguria and were propensity score (PS) matched to the 328 patients without intraoperative oliguria. After PS matching, 125 patients from each group were selected, and the incidence of AKI was compared between the two groups. Postoperative kidney function and surgical outcomes were also evaluated.
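For readers unfamiliar with the technique, a generic 1:1 nearest-neighbour propensity-score match can be sketched as below; the covariates, caliper, and software used in the actual study are not given in the abstract, so everything here is illustrative.

```python
import numpy as np

# Hedged sketch of generic propensity-score (PS) matching, the technique named
# in the Methods; not the authors' code, and the model details are assumptions.

def propensity_scores(X, treated, n_iter=2000, lr=0.1):
    """Logistic-regression PS fit by plain gradient ascent on the likelihood."""
    Xb = np.hstack([np.ones((len(X), 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (treated - p) / len(X)
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def match_one_to_one(ps, treated):
    """Greedily pair each treated unit with the nearest unused control."""
    controls = [i for i in range(len(ps)) if not treated[i]]
    pairs = []
    for i in np.where(treated)[0]:
        j = min(controls, key=lambda c: abs(ps[c] - ps[i]))
        controls.remove(j)
        pairs.append((i, j))
    return pairs

# Synthetic cohort: treatment probability depends on the first covariate.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
treated = X[:, 0] + rng.normal(size=40) > 1.0
ps = propensity_scores(X, treated.astype(float))
pairs = match_one_to_one(ps, treated)
```

Greedy nearest-neighbour matching is one of several options; the study could equally have used caliper or optimal matching.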

Results

The incidence of AKI was significantly higher in the intraoperative oliguria group than in the non-intraoperative oliguria group (26.4% vs. 11.2%, respectively, P = 0.002). The eGFR reduction on postoperative day 0 was also significantly greater in the intraoperative oliguria group than in the non-intraoperative oliguria group (−9.02 vs. −1.24 mL/min/1.73 m², respectively, P < 0.001). In addition, the surgical complication rate was higher in the intraoperative oliguria group than in the non-intraoperative oliguria group (18.4% vs. 9.6%, respectively, P = 0.045).

Conclusions

Despite the proven benefits of perioperative care with the ERAS protocol, caution is required in patients with intraoperative oliguria to prevent postoperative AKI. Further studies regarding appropriate management of intraoperative oliguria in association with long-term prognosis are needed in the colorectal ERAS setting.

]]>
<![CDATA[Student engagement, assessed using heart rate, shows no reset following active learning sessions in lectures]]> https://www.researchpad.co/article/Nf5514316-08af-4907-81a8-d6c8d0a0ef3a

Heart rate can be used as a measure of cognitive engagement. We measured average student heart rates during medical school lecture classes using wristwatch-style monitors. Analysis of 42 classes showed a steady decline in heart rate from the beginning to end of a lecture class. Active learning sessions (peer-discussion based problem solving) resulted in a significant uptick in heart rate, but this returned to the average level immediately following the active learning period. This is the first statistically robust assessment of changes in heart rate during the course of college lecture classes and indicates that personal heart rate monitors may be useful tools for assessment of different teaching modalities. The key findings suggest that the value of active learning within the classroom resides in the activity itself and not in an increase in engagement or reset in attention during the didactic period following an active learning session.

]]>
<![CDATA[A comparison study of anxiety in children undergoing brain MRI vs adults undergoing brain MRI vs children undergoing an electroencephalogram]]> https://www.researchpad.co/article/5c9902cbd5eed0c484b985cc

Background

Magnetic resonance imaging (MRI) of the brain in children and adolescents is a well-established method in both clinical practice and neuroscientific research. This practice is sometimes viewed critically, as MRI scans might expose minors (e.g. through scan-associated fears) to more than the legally permissible “minimal burden”. While there is evidence that a significant portion of adults undergoing brain MRI scans experience anxiety, data on anxiety in children and adolescents undergoing brain MRI scans are rare. This study therefore aimed to examine the prevalence and level of anxiety in children and adolescents who had MRI scans of the brain, and to compare the results to adults undergoing brain MRI scans and to children and adolescents undergoing electroencephalography (EEG), which is usually regarded as a “minimal burden”.

Method

Participants were 57 children and adolescents who had a brain MRI scan (MRI-C; mean age 12.9 years), 28 adults who had a brain MRI scan (MRI-A; mean age 43.7 years), and 66 children and adolescents undergoing EEG (EEG-C; mean age 12.9 years). Anxiety was assessed on the subjective (situational anxiety) and on the physiological level (arousal), before and after the respective examination.

Results

More than 98% of children and adolescents reported no or only minimal fear during the MRI scan. Both pre- and post-examination, the MRI-C and MRI-A groups did not differ significantly with respect to situational anxiety (p = 0.262 and p = 0.374, respectively) or physiological arousal (p = 0.050 and p = 0.472). Between the MRI-C and EEG-C groups, there were also no significant differences in situational anxiety (p = 0.525 and p = 0.875) or physiological arousal (p = 0.535 and p = 0.189). Prior MRI experience did not significantly influence subjective or physiological anxiety parameters.

Conclusions

In this study, children and adolescents undergoing a brain MRI scan did not experience significantly more anxiety than those undergoing an EEG, or adults undergoing MRI scanning. Therefore, a general exclusion of minors from MRI research studies does not appear reasonable.

]]>
<![CDATA[The efficacy of stress reappraisal interventions on stress responsivity: A meta-analysis and systematic review of existing evidence]]> https://www.researchpad.co/article/5c803c54d5eed0c484ad86da

Background

The beliefs we hold about stress play an important role in coping with stressors. Various theoretical frameworks of stress point to the efficacy of reframing stress-related information through brief reappraisal interventions in order to promote adaptive coping.

Purpose

The goal of the current meta-analysis and systematic review is to substantiate the efficacy of reappraisal interventions on stress responsivity compared to control conditions. Differences in experimental methodologies (e.g., type of stressor used, timing of reappraisal intervention, and content of intervention instructions) will be examined to further delineate their effects on intervention outcomes.

Methods

The literature searches were conducted on May 16, 2018 using the PsycINFO, ProQuest Dissertations and Theses, and PILOTS databases with no date restriction. The search terms included stress, reframing, reappraisal, mindset and reconceptualising. A total of 14 articles with 36 independent samples were included in the meta-analysis, while 22 articles with 46 independent samples were included in the systematic review. A random-effects model was used to test the null hypothesis using two-tailed significance testing, and Fisher’s Z value was reported for each corresponding test. Heterogeneity was assessed using Cochran’s Q statistic.

Results

Findings from both the meta-analysis and systematic review revealed that, overall, reappraisal interventions are effective in attenuating subjective responsivity to stress. The standardized mean difference across groups was 0.429 (SE = 0.185, 95% CI 0.067 to 0.791; z = 2.320, p = .020). However, reappraisal intervention groups did not outperform control groups on measures of physiological stress, with a standardized mean difference of −0.084 (SE = 0.135, 95% CI −0.349 to 0.180; z = −0.627, p = .531). Moderator analysis revealed heterogeneous effects, suggesting large variability in findings.
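A minimal sketch of the random-effects pooling referred to above is given below; the DerSimonian-Laird between-study variance estimator used here is a common choice, but the abstract names only a random-effects model and Cochran's Q, not the specific estimator, and the study numbers below are invented.

```python
import numpy as np

# Sketch: pool per-study standardized mean differences under a random-effects
# model with the DerSimonian-Laird tau^2 estimator (assumption, see lead-in).

def random_effects_pool(effects, variances):
    """Return (pooled effect, its SE, Cochran's Q) under a random-effects model."""
    effects = np.asarray(effects, float)
    variances = np.asarray(variances, float)
    w = 1.0 / variances                            # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mu_fe) ** 2)         # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = 1.0 / (variances + tau2)
    mu_re = np.sum(w_re * effects) / np.sum(w_re)
    return mu_re, np.sqrt(1.0 / np.sum(w_re)), q

# Three hypothetical studies (illustrative SMDs and variances only):
mu, se, q = random_effects_pool([0.52, 0.31, 0.44], [0.03, 0.05, 0.04])
```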

Conclusions

These findings suggest a promising avenue for the effective management of self-reported stress and the optimization of stress responses. However, more research is needed to better elucidate the effects, if any, of reappraisal interventions on stress physiology. Implications for the use of reappraisal interventions in stress optimization are discussed in the context of theoretical frameworks and considerations for future studies.

]]>
<![CDATA[Frequency-resolved analysis of coherent oscillations of local cerebral blood volume, measured with near-infrared spectroscopy, and systemic arterial pressure in healthy human subjects]]> https://www.researchpad.co/article/5c6c75b8d5eed0c4843d006b

We report a study, in twenty-two healthy human subjects, of the dynamic relationship between cerebral hemoglobin concentration ([HbT]), measured with near-infrared spectroscopy (NIRS) in the prefrontal cortex, and systemic arterial blood pressure (ABP), measured with finger plethysmography. [HbT] is a measure of local cerebral blood volume (CBV). We induced hemodynamic oscillations at discrete frequencies in the range 0.04–0.20 Hz with cyclic inflation and deflation of pneumatic cuffs wrapped around the subject’s thighs. We modeled the transfer function of ABP and [HbT] in terms of effective arterial (K(a)) and venous (K(v)) compliances, and a cerebral autoregulation time constant (τ(AR)). The mean values (± standard errors) of these parameters across the twenty-two subjects were K(a) = 0.01 ± 0.01 μM/mmHg, K(v) = 0.09 ± 0.05 μM/mmHg, and τ(AR) = 2.2 ± 1.3 s. Spatially resolved measurements in a subset of eight subjects revealed a spatial variability of these parameters that may exceed the inter-subject variability at a set location. This study sheds light on the roles that ABP and cerebral blood flow (CBF) play in the dynamics of [HbT] measured with NIRS, and paves the way for new non-invasive optical studies of cerebral blood flow and cerebral autoregulation.

]]>
<![CDATA[Isolated diastolic potentials as predictors of success in ablation of right ventricular outflow tract idiopathic premature ventricular contractions]]> https://www.researchpad.co/article/5c648cdbd5eed0c484c8196e

Background and aims

Discrete potentials and low-voltage, fragmented electrograms have previously been reported at the ablation site in patients with premature ventricular contractions (PVCs) originating in the right ventricular outflow tract (RVOT). The aim of this study was to review the electrograms at the ablation site and assess the presence of diastolic potentials and their association with success.

Methods

We retrospectively reviewed the electrograms obtained at the radiofrequency (RF) delivery sites of 48 patients who underwent ablation of frequent RVOT PVCs. We assessed the duration and amplitude of the local electrogram, the local activation time, and the presence of diastolic potentials and fragmented electrograms.

Results

We reviewed 134 electrograms, median 2 (1–4) per patient. Success was achieved in 40 patients (83%). At successful sites the local activation time was earlier, −54 (−35 to −77) ms vs. −26 (−12 to −35) ms, p<0.0001; the local electrogram had lower amplitude, 1 (0.45–1.15) vs. 1.5 (0.5–2.1) mV, p = 0.006, and longer duration, 106 (80–154) vs. 74 (60–90) ms, p<0.0001. Diastolic potentials and fragmented electrograms were more frequently present at successful sites, 76% vs. 9%, p<0.0001 and 54% vs. 11%, p<0.0001, respectively. In univariable analysis these variables were all associated with success. In multivariable analysis only the presence of diastolic potentials [OR 15.5 (95% CI: 3.92–61.2; p<0.0001)] and the value of the local activation time [OR 1.11 (95% CI: 1.049–1.172; p<0.0001)] were significantly associated with success.

Conclusion

In this group of patients the presence of diastolic potentials at the ablation site was associated with success.

]]>
<![CDATA[The effects of heated humidification to nasopharynx on nasal resistance and breathing pattern]]> https://www.researchpad.co/article/5c648cc9d5eed0c484c817b1

Background

Mouth breathing can induce not only dry throat and, eventually, upper respiratory tract infection, but also snoring and obstructive sleep apnea, whereas nasal breathing is protective against these problems. It is therefore worth exploring approaches to converting habitual mouth breathing into the preferable nasal breathing. The aim of this study was to investigate the physiological effects of our newly developed mask on the facilitation of nasal breathing.

Methods

Thirty-seven healthy male volunteers were enrolled in a double-blind, randomized, placebo-controlled crossover trial. Participants wore a newly developed heated-humidification mask or a non-heated-humidification mask (placebo) for 10 min each. Subjective feelings, including dry nose, dry throat, nasal obstruction, ease of breathing, relaxation, calmness, and good feeling, were assessed before and after wearing each mask. In addition, the effects of the masks on nasal resistance, breathing pattern, and heart rate variability were assessed.

Results

Compared with the placebo mask, the heated humidification mask improved all components of subjective feelings except ease of breathing; moreover, decreased nasal resistance and respiratory frequency were accompanied by a simultaneous increase in a surrogate marker for tidal volume. However, use of the heated humidification mask did not affect heart rate variability.

Conclusion

Adding heated humidification to the nasopharynx could modulate breathing patterns with improvement of subjective experience and objective nasal resistance.

]]>
<![CDATA[Habituation of the electrodermal response – A biological correlate of resilience?]]> https://www.researchpad.co/article/5c57e673d5eed0c484ef3263

Current approaches to quantifying resilience make extensive use of self-reported data. Problematically, such scales are plagued by response distortions, both deliberate and unintentional, particularly in occupational populations. The aim of the current study was to develop an objective index of resilience. The study was conducted in 30 young healthy adults. Following completion of the Connor-Davidson Resilience Scale (CD-RISC) and the Depression/Anxiety/Stress Scale (DASS), participants were subjected to a series of 15 acoustic startle stimuli (95 dB, 50 ms) presented at random intervals, with respiration, skin conductance and ECG recorded. As expected, resilience (CD-RISC) significantly and negatively correlated with all three DASS subscales: Depression (r = -0.66, p<0.0001), Anxiety (r = -0.50, p<0.005) and Stress (r = -0.48, p<0.005). Acoustic stimuli consistently provoked transient skin conductance (SC) responses, with SC slopes indexing response habituation. This slope significantly and positively correlated with DASS-Depression (r = 0.59, p<0.005), DASS-Anxiety (r = 0.35, p<0.05) and DASS-Total (r = 0.50, p<0.005) scores, and negatively with resilience score (r = -0.47; p = 0.006), indicating that high-resilience individuals are characterized by steeper habituation slopes than low-resilience individuals. Our key finding of a connection between habituation of skin conductance responses to repeated acoustic startle stimuli and resilience-related psychometric constructs suggests that the response habituation paradigm has the potential to characterize important attributes of cognitive fitness and well-being, such as depression, anxiety and resilience. With steep negative slopes reflecting faster habituation, lower depression/anxiety and higher resilience, and slower or no habituation characterizing less resilient individuals, this protocol may offer a distortion-free method for objective assessment and monitoring of psychological resilience.
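The SC-slope habituation index described above can be sketched as a least-squares slope of response amplitude over successive startle trials; the function name and example amplitudes below are made up for illustration.

```python
import numpy as np

# Illustrative sketch: a habituation index as the least-squares slope of
# skin-conductance response amplitude across successive startle trials.
# More negative slopes indicate faster habituation.

def habituation_slope(sc_amplitudes):
    """Slope of SC response amplitude vs. trial number (units per trial)."""
    trials = np.arange(1, len(sc_amplitudes) + 1)
    return np.polyfit(trials, sc_amplitudes, 1)[0]

# Made-up amplitude series: a steeply decaying responder vs. a slow one.
fast = [0.9, 0.7, 0.5, 0.35, 0.25, 0.2, 0.15, 0.12, 0.1, 0.1]
slow = [0.9, 0.85, 0.82, 0.8, 0.78, 0.76, 0.75, 0.74, 0.73, 0.72]
print(habituation_slope(fast) < habituation_slope(slow))  # True
```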

]]>
<![CDATA[Effects of Nordic walking training on quality of life, balance and functional mobility in elderly: A randomized clinical trial]]> https://www.researchpad.co/article/5c5b5264d5eed0c4842bc750

Purpose

There is physiological and biomechanical evidence suggesting a possible advantage of using poles in walking training programs. The purpose of this proof-of-concept study was to test the hypothesis that untrained elderly people training Nordic walking for eight weeks would show greater improvements in functional mobility, quality of life and postural balance than those training without poles, with effects more likely to occur in the self-selected walking speed (primary outcome) and the locomotor rehabilitation index than in quality of life, static balance and dynamic stability. It was a two-arm, randomized, sample- and load-controlled study.

Methods

Thirty-three untrained older people were randomly assigned to Nordic walking (n = 16, age: 64.6±4.1 years) and free walking (n = 17, age: 68.6±3.9 years) training groups.

Results

Improvements in the self-selected walking speed (primary outcome, p = 0.011, ES = 0.42, 95% CI −0.31 to 1.16), locomotor rehabilitation index (p = 0.013, ES = 0.36, 95% CI −0.39 to 1.10), quality of life (p<0.05), static balance (p<0.05) and dynamic stability (p<0.05) were found in both groups.

Conclusions

The hypothesis was not supported: our findings indicated that, after 8 weeks, Nordic walking training did not result in greater improvements than free walking training for the primary outcome (self-selected walking speed) or most of the secondary outcomes (including the locomotor rehabilitation index, static balance, dynamic stability, and the psychological and social participation domains of quality of life).

Trial registration

ClinicalTrials.gov NCT03096964.

]]>
<![CDATA[Dose-response relationship between very vigorous physical activity and cardiovascular health assessed by heart rate variability in adults: Cross-sectional results from the EPIMOV study]]> https://www.researchpad.co/article/5c5ca2c1d5eed0c48441ea3d

The minimum amount of physical activity needed to obtain health benefits has been widely determined. In contrast, the impact of extreme amounts of very vigorous physical activity (VVPA, ≥ 8 metabolic equivalents) on the heart remains controversial. We aimed to evaluate the dose-response relationship between VVPA and heart rate variability (HRV) in adults. We selected 1040 asymptomatic individuals (60% women, 42 ± 15 years, 28 ± 6 kg/m2) from the Epidemiology and Human Movement Study (EPIMOV). Participants remained in the supine position for 10 min, and we selected an intermediate 5-min window for HRV analysis. The standard deviation of the RR intervals, the root mean square of successive RR-interval differences, the number of successive RR intervals differing by > 50 ms, the powers of the low- and high-frequency bands, and the Poincaré plot standard deviations were quantified. Participants wore a triaxial accelerometer (Actigraph GT3x+) above the dominant hip for 4–7 consecutive days to quantify their physical activity. We also evaluated the maximum oxygen uptake (V˙O2max) during an exercise test. We stratified participants into five groups according to VVPA in min/week (group 1, ≤ 1.50; 2, 1.51–3.16; 3, 3.17–3.54; 4, 3.55–20.75; and 5, > 20.75). The linear trends of HRV through the quintiles of VVPA were investigated. We used logarithmic transformations to compare the five groups adjusted for age, sex, cardiovascular risk, and V˙O2max. We found better HRV with increased VVPA for all HRV indices studied (p trend < 0.05). However, group 5 did not differ from group 4 (p > 0.05) for any of the indices. We conclude that there is an incremental benefit of VVPA on the HRV of asymptomatic adults. Since we found neither additional benefit nor harmful impact of amounts of VVPA as high as 22 min/week on HRV, our results should not discourage asymptomatic adults from performing VVPA.
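The time-domain HRV indices named above have textbook definitions that can be sketched as follows; this is the standard formulation (SDNN, RMSSD, pNN50, Poincaré SD1/SD2), not the study's own analysis code, and the RR series below is invented.

```python
import numpy as np

# Standard time-domain HRV indices from an RR-interval series in milliseconds.

def hrv_time_domain(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    d = np.diff(rr)                                  # successive differences
    sdnn = rr.std(ddof=1)                            # SD of all RR intervals
    rmssd = np.sqrt(np.mean(d ** 2))                 # RMS of successive diffs
    pnn50 = 100.0 * np.mean(np.abs(d) > 50.0)        # % of diffs > 50 ms
    sd1 = np.sqrt(np.var(d, ddof=1) / 2.0)           # Poincaré short-axis SD
    sd2 = np.sqrt(max(2.0 * rr.var(ddof=1) - sd1 ** 2, 0.0))  # long-axis SD
    return {"SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50, "SD1": sd1, "SD2": sd2}

# Short made-up RR series around 800 ms (~75 bpm):
res = hrv_time_domain([800, 810, 790, 820, 805, 795])
print(round(res["RMSSD"], 2))  # 18.57
```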

]]>
<![CDATA[Predicting ambulatory energy expenditure in lower limb amputees using multi-sensor methods]]> https://www.researchpad.co/article/5c5ca28cd5eed0c48441e5dd

Purpose

To assess the validity of a derived algorithm combining tri-axial accelerometry and heart rate (HR) data, compared to a research-grade multi-sensor physical activity device, for the estimation of ambulatory physical activity energy expenditure (PAEE) in individuals with traumatic lower-limb amputation.

Methods

Twenty-eight participants [unilateral (n = 9) and bilateral (n = 10) lower-limb amputations, and non-injured controls (n = 9)] completed eight activities: rest, ambulating at 5 progressive treadmill velocities (0.48, 0.67, 0.89, 1.12, 1.34 m.s-1), and walking at 2 gradients (3 and 5%) at 0.89 m.s-1. During each task, expired gases were collected for the determination of V˙O2 and subsequent calculation of PAEE. An Actigraph GT3X+ accelerometer was worn on the hip of the shortest residual limb, and a HR monitor and an Actiheart (AHR) device were worn on the chest. Multiple linear regressions were employed to derive population-specific PAEE estimation algorithms using Actigraph GT3X+ outputs and HR signals (GT3X+HR). Mean bias ± 95% limits of agreement (LoA) and error statistics were calculated between criterion PAEE (indirect calorimetry) and PAEE predicted using GT3X+HR and AHR.
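A hedged sketch of the regression step described above (PAEE predicted from accelerometer output and HR via multiple linear regression) might look like this; the predictor set and synthetic numbers are illustrative only, not the study's fitted model.

```python
import numpy as np

# Illustrative least-squares fit of PAEE = b0 + b1*counts + b2*HR. The real
# study's model terms and coefficients are not given in the abstract.

def fit_paee_model(counts, hr, paee):
    """Return [b0, b1, b2] minimizing squared error of b0 + b1*counts + b2*hr."""
    X = np.column_stack([np.ones_like(counts), counts, hr])
    beta, *_ = np.linalg.lstsq(X, paee, rcond=None)
    return beta

def predict_paee(beta, counts, hr):
    return beta[0] + beta[1] * counts + beta[2] * hr

# Recovering known coefficients from noiseless synthetic data:
counts = np.array([120.0, 300.0, 450.0, 600.0, 210.0])
hr = np.array([72.0, 95.0, 110.0, 88.0, 101.0])
beta = fit_paee_model(counts, hr, 1.0 + 0.01 * counts + 0.05 * hr)
```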

Results

Both measurement approaches used to predict PAEE were significantly related (P<0.01) to criterion PAEE. GT3X+HR revealed the strongest association, smallest LoA and least error (GT3X+HR: unilateral r = 0.92, bilateral r = 0.93, control r = 0.91; AHR: unilateral r = 0.86, bilateral r = 0.81, control r = 0.67). Mean ± SD percent errors across all activities were 18±14%, 15±12% and 15±14% for GT3X+HR, and 45±20%, 39±23% and 34±28% for AHR, in the unilateral, bilateral and control groups, respectively.

Conclusions

Statistically derived algorithms (GT3X+HR) provide a more valid estimate of PAEE in individuals with traumatic lower-limb amputation, compared to a proprietary group calibration algorithm (AHR). Outputs from AHR displayed considerable random error when tested in a laboratory setting in individuals with lower-limb amputation.

]]>
<![CDATA[The Copenhagen Triage Algorithm is non-inferior to a traditional triage algorithm: A cluster-randomized study]]> https://www.researchpad.co/article/5c61e8bdd5eed0c48496f08e

Introduction

Triage systems with limited room for clinical judgment are used by emergency departments (EDs) worldwide. The Copenhagen Triage Algorithm (CTA) is a simplified triage system with a clinical assessment.

Methods

The trial was a non-inferiority, two-center, cluster-randomized crossover study in which CTA was compared to a local adaptation of Adaptive Process Triage (ADAPT). CTA involves initial categorization based on vital signs, with a final modification based on clinical assessment by an ED nurse. We used 30-day mortality as the primary endpoint, with a non-inferiority margin of 0.5%. Predictive performance was compared using receiver operating characteristic (ROC) analysis.

Results

We included 45,347 patient visits; 23,158 (51%) and 22,189 (49%) were triaged with CTA and ADAPT, respectively, with 30-day mortality of 3.42% and 3.43% (P = 0.996), a difference of 0.01% (95% CI: −0.34 to 0.33), which met the non-inferiority criterion. Mortality at 48 hours was 0.62% vs. 0.71% (P = 0.26), and 6.38% vs. 6.61% (P = 0.32) at 90 days, for CTA and ADAPT. CTA triaged at a significantly lower urgency level (P<0.001) and was superior in predicting 30-day mortality, with an area under the curve of 0.67 (95% CI 0.65–0.69) vs. 0.64 for ADAPT (95% CI 0.62–0.66) (P = 0.03). There were no significant differences in rate of admission to the intensive care unit, length of stay, waiting time, or rate of readmission within 30 or 90 days.

Conclusion

A novel triage system based on vital signs and a clinical assessment by an ED nurse was non-inferior to a traditional triage algorithm in terms of short-term mortality, and superior in predicting 30-day mortality.

Trial registration

Clinicaltrials.gov NCT02698319

]]>
<![CDATA[Time course of tolerance to the performance benefits of caffeine]]> https://www.researchpad.co/article/5c521817d5eed0c4847972ee

The ergogenic effect of acute caffeine ingestion has been widely investigated; however, scientific information regarding tolerance to the performance benefits of caffeine, when ingested on a day-to-day basis, is scarce. The aim of this investigation was to determine the time course of tolerance to the ergogenic effects of a moderate dose of caffeine. Eleven healthy active participants took part in a cross-over, double-blind, placebo-controlled experiment. In one treatment, they ingested 3 mg/kg/day of caffeine for 20 consecutive days, while in the other they ingested a placebo for 20 days. Each substance was administered daily in an opaque, unidentifiable capsule, and the experimental trials started 45 min after capsule ingestion. Two days before, and three times per week during, each 20-day treatment, aerobic peak power was measured with an incremental test to volitional fatigue (25 W/min) and anaerobic peak power was measured with an adapted version of the Wingate test (15 s). In comparison to the placebo, the ingestion of caffeine increased peak cycling power in the incremental exercise test by ~4.0 ±1.3% for the first 15 days (P<0.05), but then this ergogenic effect lessened. Caffeine also increased peak cycling power during the Wingate test on days 1, 4, 15, and 18 of ingestion by ~4.9 ±0.9% (P<0.05). In both tests, the magnitude of the ergogenic effect of caffeine vs. placebo was highest on the first day of ingestion and then progressively decreased. These results show a continued ergogenic effect of daily caffeine ingestion for 15–18 days; however, the changes in the magnitude of this effect suggest progressive tolerance.

]]>
<![CDATA[Outcomes of cardiac resynchronization therapy in patients with atrial fibrillation accompanied by slow ventricular response]]> https://www.researchpad.co/article/5c424385d5eed0c4845e0487

It remains unclear whether cardiac resynchronization therapy (CRT) is as effective in patients with atrial fibrillation (AF) accompanied by slow ventricular response (AF-SVR, < 60 beats/min) as in those with sinus rhythm (SR). Echocardiographic reverse remodeling was compared between AF-SVR patients (n = 17) and those with SR (n = 88) at six and 12 months after CRT treatment. We also evaluated the changes in QRS duration, New York Heart Association (NYHA) functional class, and long-term composite clinical outcomes, including cardiac death, heart transplantation, and heart failure (HF)-related hospitalization. Left ventricular pacing sites and biventricular pacing percentages were not significantly different between the AF-SVR and SR groups. However, the heart rate increase after CRT was significantly greater in the AF-SVR group than in the SR group (P < 0.001). At six and 12 months after implantation, both groups showed comparable improvements in NYHA class, QRS narrowing, and echocardiographic variables, including left ventricular end-systolic volume, left ventricular ejection fraction, and left atrial volume index. Over a median follow-up of 1.6 (interquartile range: 0.8–2.2) years, no significant between-group differences were observed in the rates of long-term composite clinical events (35% versus 24%; hazard ratio: 1.71; 95% confidence interval: 0.23–12.48; P = 0.60). CRT implantation provided comparable benefits for patients with AF-SVR and those with SR, by correcting electrical dyssynchrony and increasing the biventricular pacing rate, in terms of QRS narrowing, symptom improvement, ventricular reverse remodeling, and long-term clinical outcomes.

]]>
<![CDATA[The value of vital sign trends in predicting and monitoring clinical deterioration: A systematic review]]> https://www.researchpad.co/article/5c478c7ed5eed0c484bd2aa6

Background

Vital signs, i.e. respiratory rate, oxygen saturation, pulse, blood pressure and temperature, are regarded as an essential part of monitoring hospitalized patients. Changes in vital signs prior to clinical deterioration are well documented, and early detection of preventable outcomes is key to timely intervention. Despite their role in clinical practice, it remains unclear how best to monitor and interpret them.

Objective

To evaluate the ability of vital sign trends to predict clinical deterioration in patients hospitalized with acute illness.

Data Sources

PubMed, Embase, Cochrane Library and CINAHL were searched in December 2017.

Study Selection

Studies examining intermittently monitored vital sign trends in acutely ill adult patients on hospital wards and in emergency departments. Outcomes representing clinical deterioration were of interest.

Data Extraction

Performed separately by two authors using a preformed extraction sheet.

Results

Of 7,366 references screened, only two were eligible for inclusion. Both were retrospective cohort studies without controls. One examined the accuracy of different vital sign trend models using discrete-time survival analysis in 269,999 admissions. The other included 44,531 medical admissions and examined trends in Vitalpac Early Warning Score-weighted vital signs. Both reported that vital sign trends increased detection of clinical deterioration. Critical appraisal, performed using established evaluation tools, found a moderate risk of bias and a low certainty of evidence. Additionally, four studies examining trends in early warning scores, otherwise eligible for inclusion, were evaluated.

Conclusions

This review illustrates a lack of research on intermittently monitored vital sign trends. The included studies, although heterogeneous and imprecise, indicate an added value of trend analysis. This highlights the need for well-controlled trials to thoroughly assess the research question.

]]>
<![CDATA[Aging and the relationships between long-axis systolic and early diastolic excursion, isovolumic relaxation time and left ventricular length—Implications for the interpretation of aging effects on e`]]> https://www.researchpad.co/article/5c3d0155d5eed0c48403a5b7

Background

Both the left ventricular (LV) long-axis peak early diastolic lengthening velocity (e`) and long-axis early diastolic excursion (EDExc) decrease with age, but the mechanisms underlying these decreases are not fully understood. The aim of this study was to investigate the relative contributions to the aging-related decreases in e` and EDExc from LV long-axis systolic excursion (SExc), isovolumic relaxation time (IVRT, as a measure of the speed of relaxation) and LV end-diastolic length (LVEDL).

Methods

The study group comprised 50 healthy adult subjects aged 17–75 years with a normal LV ejection fraction. SExc, EDExc, e` and IVRT were measured from pulsed wave tissue Doppler signals acquired from the septal and lateral walls. Multivariate modelling was performed to identify independent predictors of EDExc and e` that were consistent for the septal and lateral walls.
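The multivariate modelling step described above can be sketched as an ordinary least-squares fit. This is a minimal illustration only: the data below are simulated, the coefficients are invented, and nothing here reproduces the study's dataset or results; only the variable names (SExc, IVRT, LVEDL, e`) are taken from the abstract.

```python
# Illustrative OLS sketch of regressing e` on candidate predictors.
# All numbers are simulated; they are not the study's patient data.
import numpy as np

rng = np.random.default_rng(1)
n = 50  # matches the study's sample size

sexc = rng.normal(14, 2, n)    # long-axis systolic excursion (mm), simulated
ivrt = rng.normal(80, 15, n)   # isovolumic relaxation time (ms), simulated
lvedl = rng.normal(85, 6, n)   # LV end-diastolic length (mm), simulated

# Simulated outcome: e` driven mainly by SExc, plus noise (made-up weights).
e_prime = 0.5 * sexc - 0.01 * ivrt + 0.03 * lvedl + rng.normal(0, 0.8, n)

# Design matrix with an intercept column; solve the least-squares problem.
X = np.column_stack([np.ones(n), sexc, ivrt, lvedl])
beta, *_ = np.linalg.lstsq(X, e_prime, rcond=None)

print(dict(zip(["intercept", "SExc", "IVRT", "LVEDL"], beta.round(3))))
```

With data simulated this way, the fitted SExc coefficient dominates, mirroring the kind of "major determinant" conclusion the study draws from its own models.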

Results

EDExc decreased with age, and the major determinant of EDExc was SExc, which also decreased with age. There was also a decrease of e` with age, and the major determinant of e` was EDExc. IVRT decreased with age and, on univariate analysis, was inversely correlated not only with EDExc and e`, but also with SExc. IVRT was only a minor contributor to models of EDExc which included SExc, and was an inconsistent contributor to models of e` which included EDExc. LVEDL decreased with age independent of sex and body size, and was positively correlated with SExc, EDExc and e`.

Conclusion

Major mechanisms underlying the decrease in e` seen during aging are the concomitant decreases in long-axis contraction and early diastolic excursion, which are in turn related in part to long-axis remodelling of the left ventricle. After adjusting for the extent of systolic and early diastolic excursion, slowing of relaxation, as reflected in prolongation of the IVRT, makes no more than a minor contribution to the aging-related decreases in EDExc and e`.

]]>
<![CDATA[Comparing the delay with different anticoagulants before elective electrical cardioversion for atrial fibrillation/flutter]]> https://www.researchpad.co/article/5c37b7a2d5eed0c484490753

Aims

To assess the impact of the introduction of direct oral anticoagulants upon the outcomes from elective electrical cardioversion for atrial fibrillation.

Methods

This is a retrospective comparison of the delay to elective cardioversion with different anticoagulants. The data were gathered from a large regional hospital from January 2013 to September 2017. There were three measured outcomes: 1) the time in weeks from referral to the date of attempted electrical cardioversion; 2) the proportion of patients who were successfully cardioverted; and 3) the proportion of patients who remained in sinus rhythm at the 12-week follow-up. Time-to-cardioversion was not normally distributed, so it was analysed with Kruskal-Wallis and Mann-Whitney U tests. Maintenance of sinus rhythm was analysed using z-testing.

Results

1,374 patients underwent cardioversion, referred either from primary care or by cardiologists. At the time of cardioversion, 789 cases were anticoagulated with warfarin (W), 215 with apixaban (A) and 370 with rivaroxaban (R). The three cohorts were first compared using Kruskal-Wallis testing. This demonstrated a significant difference in the median delay (in weeks) between the A and W groups (A = 7, W = 9, P<0.00001) and between the R and W groups (R = 7, W = 9, P<0.00001), but no difference between R and A (A = 7, R = 7, P = 0.92). As there was no difference between the A and R groups, they were combined into a single AR group. The AR group was compared to the W group using Mann-Whitney U testing, which demonstrated a significantly shorter delay in the AR group (AR = 7, W = 9, P<0.00001). Excluding patients with prior or unknown attempts at cardioversion (n = 791), the W patients (n = 152) were less often successfully restored to sinus rhythm at cardioversion than the AR group (n = 431) (W = 95% vs AR = 99%, P = 0.04). The incidence of sinus rhythm at 12 weeks also differed significantly (W = 40% vs AR = 49%, P = 0.049); these proportions were compared by z-testing. At the 12-week follow-up there was no statistical difference in the rate of adverse events between the AR and W groups, but the event rate was too low to draw further conclusions.
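The two-step analysis described above, a Mann-Whitney U test for the skewed delays and a two-proportion z-test for sinus-rhythm maintenance, can be sketched as follows. The delay samples are simulated and every number is illustrative; this does not use, and will not exactly reproduce, the study's patient-level results.

```python
# Illustrative sketch of the abstract's statistical comparisons.
# All data below are simulated/made up, not the study's records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical delays in weeks: warfarin (W) vs pooled apixaban/rivaroxaban (AR).
delay_w = rng.gamma(shape=9.0, scale=1.0, size=789)   # median near 9 weeks
delay_ar = rng.gamma(shape=7.0, scale=1.0, size=585)  # median near 7 weeks

# Mann-Whitney U test for the non-normally distributed delays.
u_stat, u_p = stats.mannwhitneyu(delay_ar, delay_w, alternative="two-sided")

# Two-proportion z-test for sinus rhythm at 12 weeks, using counts in the
# same ballpark as the abstract's percentages (40% of 152 vs 49% of 431).
x = np.array([61, 211])   # patients in sinus rhythm, W and AR
n = np.array([152, 431])  # group sizes
p_pool = x.sum() / n.sum()
se = np.sqrt(p_pool * (1 - p_pool) * (1 / n[0] + 1 / n[1]))
z = (x[0] / n[0] - x[1] / n[1]) / se
z_p = 2 * stats.norm.sf(abs(z))

print(f"Mann-Whitney U p = {u_p:.3g}; two-proportion z = {z:.2f}, p = {z_p:.3g}")
```

Exact p-values from rounded counts like these can differ slightly from the published figures; the sketch only shows the mechanics of the two tests named in the abstract.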

Conclusion

DOACs appear to significantly shorten the interval between the decision to cardiovert and the cardioversion procedure, by a median of two weeks compared with warfarin in a real-world setting. In this study, patients without a previous cardioversion who were anticoagulated with warfarin had a significantly lower probability of conversion to sinus rhythm, and a significantly lower probability of remaining in sinus rhythm at the 12-week follow-up, compared with the combined apixaban and rivaroxaban group.

]]>