ResearchPad - original-investigation Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Fatty liver index and development of cardiovascular disease in Koreans without pre-existing myocardial infarction and ischemic stroke: a large population-based study]]> Despite the known association between non-alcoholic fatty liver disease (NAFLD) and cardiovascular disease (CVD), whether NAFLD predicts future CVD events, especially CVD mortality, remains uncertain. We evaluated the relationship between the fatty liver index (FLI), a validated marker of NAFLD, and the risk of major adverse cardiac events (MACEs) in a large population-based study. Methods: We identified 3,011,588 subjects in the Korean National Health Insurance System cohort without a history of CVD who underwent health examinations from 2009 to 2011. The primary endpoint was a composite of cardiovascular death, non-fatal myocardial infarction (MI), and ischemic stroke. A Cox proportional hazards regression analysis was performed to assess the association between the FLI and the primary endpoint. Results: During the median follow-up period of 6 years, there were 46,010 MACEs (7148 cardiovascular deaths, 16,574 non-fatal MIs, and 22,288 ischemic strokes). There was a linear association between higher FLI values and higher incidence of the primary endpoint. In multivariable models adjusted for factors such as body weight and cholesterol levels, the hazard ratio for the primary endpoint comparing the highest vs. lowest quartiles of the FLI was 1.99 (95% confidence interval [CI], 1.91–2.07). The corresponding hazard ratios (95% CIs) for cardiovascular death, non-fatal MI, and ischemic stroke were 1.98 (1.90–2.06), 2.16 (2.01–2.31), and 2.01 (1.90–2.13), respectively (p < 0.001).
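For context, the fatty liver index used in this study is conventionally computed from triglycerides, BMI, gamma-glutamyl transferase (GGT), and waist circumference using the Bedogni et al. formula; a minimal sketch (the study's exact implementation is not shown in the abstract):

```python
import math

def fatty_liver_index(tg_mg_dl: float, bmi: float, ggt_u_l: float, waist_cm: float) -> float:
    """Fatty liver index (0-100) per the Bedogni et al. formula.

    Inputs: triglycerides (mg/dL), BMI (kg/m^2), GGT (U/L), waist circumference (cm).
    """
    y = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return 100 * math.exp(y) / (1 + math.exp(y))

# In the original validation, FLI < 30 rules out and FLI >= 60 rules in hepatic steatosis.
example = fatty_liver_index(tg_mg_dl=150, bmi=25, ggt_u_l=30, waist_cm=85)
```

Quartiles of this 0-100 score are what the hazard ratios above compare.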
The results were similar in stratified analyses by age, sex, use of dyslipidemia medication, obesity, diabetes, and hypertension. Conclusions: Our findings indicate that the FLI, a surrogate marker of NAFLD, has prognostic value for identifying individuals at higher risk of cardiovascular events. ]]> <![CDATA[Prognostic importance of visit-to-visit blood pressure variability for micro- and macrovascular outcomes in patients with type 2 diabetes: The Rio de Janeiro Type 2 Diabetes Cohort Study]]> The prognostic importance of increased visit-to-visit blood pressure variability (BP-VVV) for the future development of micro- and macrovascular complications in type 2 diabetes has been scarcely investigated and is largely unsettled. We aimed to evaluate it in a prospective long-term follow-up study of 632 individuals with type 2 diabetes. Methods: BP-VVV parameters (systolic and diastolic standard deviations [SD] and coefficients of variation) were measured during the first 24 months. Multivariate Cox analysis, adjusted for risk factors and mean BP levels, examined the associations between BP-VVV and the occurrence of microvascular complications (retinopathy, microalbuminuria, renal function deterioration, peripheral neuropathy) and macrovascular complications (total cardiovascular events [CVEs], major adverse CVEs [MACE], and cardiovascular and all-cause mortality). Improvement in risk discrimination was assessed with the C-statistic and the integrated discrimination improvement (IDI) index. Results: Over a median follow-up of 11.3 years, 162 patients had a CVE (132 MACE) and 212 patients died (95 from cardiovascular diseases); 153 developed new or worsened diabetic retinopathy, 193 reached the renal composite outcome (121 with newly developed microalbuminuria and 95 with deteriorated renal function), and 171 developed new or worsened peripheral neuropathy.
Systolic BP-VVV was an independent predictor of MACE (hazard ratio: 1.25, 95% CI 1.03–1.51 for a 1-SD increase in the 24-month SD), but not of total CVEs, cardiovascular or all-cause mortality, or any microvascular outcome. However, no BP-VVV parameter significantly improved cardiovascular risk discrimination (increase in C-statistic 0.001, relative IDI 0.9%). Conclusions: Systolic BP-VVV was an independent predictor of MACE, but it did not improve cardiovascular risk stratification. The goal of antihypertensive treatment in patients with type 2 diabetes should remain controlling mean BP levels, not decreasing their visit-to-visit variability. ]]> <![CDATA[Impact of diabetes mellitus on mortality in patients with acute heart failure: a prospective cohort study]]> Although more than one-third of patients with acute heart failure (AHF) have diabetes mellitus (DM), it is unclear whether DM has an adverse impact on clinical outcomes. This study compared outcomes in patients hospitalized for AHF, stratified by DM status and left ventricular ejection fraction (LVEF). Methods: The Korean Acute Heart Failure registry prospectively enrolled and followed 5625 patients from March 2011 to February 2019. The primary endpoints were in-hospital and overall all-cause mortality. We evaluated the impact of DM on these endpoints according to HF subtype and glycemic control. Results: During a median follow-up of 3.5 years, there were 235 (4.4%) in-hospital deaths and 2500 (46.3%) overall deaths. DM was significantly associated with increased overall mortality after adjusting for potential confounders (adjusted hazard ratio [HR] 1.11, 95% confidence interval [CI] 1.03–1.22). In the subgroup analysis, DM was associated with a higher risk of overall mortality only in heart failure with reduced ejection fraction (HFrEF) (adjusted HR 1.14, 95% CI 1.02–1.27).
Inadequate glycemic control (HbA1c ≥ 7.0% within 1 year after discharge) was significantly associated with a higher risk of overall mortality compared with adequate glycemic control (HbA1c < 7.0%) (44.0% vs. 36.8%, log-rank p = 0.016). Conclusions: DM is associated with a higher risk of overall mortality in AHF, especially HFrEF. Well-controlled diabetes (HbA1c < 7.0%) is associated with a lower risk of overall mortality compared with uncontrolled diabetes. Trial registration: NCT01389843. Registered July 6, 2011. ]]> <![CDATA[Use of Deep Learning to Develop and Analyze Computational Hematoxylin and Eosin Staining of Prostate Core Biopsy Images for Tumor Diagnosis]]> Histopathological diagnosis of tumors from tissue biopsies after hematoxylin and eosin (H&E) dye staining is the criterion standard for oncological care, but H&E staining requires trained operators, dyes and reagents, and precious tissue samples that cannot be reused. Objectives: To use deep learning algorithms to develop models that perform accurate computational H&E staining of native nonstained prostate core biopsy images, and to develop methods for interpreting H&E-staining deep learning models and analyzing computationally stained images by computer vision and clinical approaches. Design, Setting, and Participants: This cross-sectional study used hundreds of thousands of native nonstained RGB (red, green, and blue channel) whole slide image (WSI) patches of prostate core tissue biopsies obtained from excess tissue material from prostate core biopsies performed in the course of routine clinical care between January 7, 2014, and January 7, 2017, at Brigham and Women’s Hospital, Boston, Massachusetts. Biopsies were registered with their H&E-stained versions. Conditional generative adversarial neural networks (cGANs) that automate conversion of native nonstained RGB WSIs to computational H&E-stained images were then trained.
Deidentified whole slide images of prostate core biopsy and medical record data were transferred to the Massachusetts Institute of Technology, Cambridge, for computational research. Results were shared with physicians for clinical evaluations. Data were analyzed from July 2018 to February 2019. Main Outcomes and Measures: Methods for detailed computer vision image analytics, visualization of trained cGAN model outputs, and clinical evaluation of virtually stained images were developed. The main outcome was interpretable deep learning models and computational H&E-stained images that achieved high performance in these metrics. Results: Among 38 patients who provided samples, single core biopsy images were extracted from each whole slide, resulting in 102 individual nonstained and H&E dye–stained image pairs that were compared with matched computationally stained and unstained images. Calculations showed high similarity between computationally stained and H&E dye–stained images, with a mean (SD) structural similarity index (SSIM) of 0.902 (0.026), Pearson correlation coefficient (PCC) of 0.962 (0.096), and peak signal-to-noise ratio (PSNR) of 22.821 (1.232) dB. A second cGAN performed accurate computational destaining of H&E-stained images back to their native nonstained form, with a mean (SD) SSIM of 0.900 (0.030), PCC of 0.963 (0.011), and PSNR of 25.646 (1.943) dB compared with native nonstained images. A single-blind prospective study computed approximately 95% pixel-by-pixel overlap between prostate tumor annotations provided by 5 board-certified pathologists on computationally stained images and those on H&E dye–stained images. This study also provided the first visualization and explanation of neural network kernel activation maps during H&E staining and destaining of RGB images by cGANs.
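PSNR, one of the similarity metrics reported above, is a simple function of the mean squared error between two images; a sketch for 8-bit images (illustrative, not the authors' code):

```python
import numpy as np

def psnr(img_a: np.ndarray, img_b: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-shape images."""
    mse = np.mean((img_a.astype(np.float64) - img_b.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images have infinite PSNR
    return 10 * np.log10(max_val ** 2 / mse)

a = np.zeros((4, 4), dtype=np.uint8)
b = a + 10  # a constant pixel error of 10 gives MSE = 100
value = psnr(a, b)
```

Higher PSNR means the computationally stained image is closer, pixel for pixel, to the dye-stained reference; values around 22-26 dB, as reported here, correspond to small but non-zero residual error.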
High similarity between the kernel activation maps of computationally stained and H&E-stained images (mean-squared errors < 0.0005) provides additional mathematical and mechanistic validation of the staining system. Conclusions and Relevance: These findings suggest that computational H&E staining of native unlabeled RGB images of prostate core biopsy could reproduce Gleason-grade tumor signatures that were easily assessed and validated by clinicians. The methods for benchmarking, visualization, and clinical validation of deep learning models and virtually H&E-stained images communicated in this study have wide applications in clinical informatics and oncology research. Clinical researchers may use these systems for early indications of possible abnormalities in native nonstained tissue biopsies prior to histopathological workflows. ]]> <![CDATA[Association of Homocysteine, Methionine, and <i>MTHFR</i> 677C&gt;T Polymorphism With Rate of Cardiovascular Multimorbidity Development in Older Adults in Sweden]]> Strong evidence links high total serum homocysteine (tHcy) and low methionine (Met) levels with a higher risk of ischemic disease, but other cardiovascular (CV) diseases may also be associated with their pleiotropic effects. Objectives: To investigate the association of serum concentrations of tHcy and Met with the rate of CV multimorbidity development in older adults and to explore the role of the methylenetetrahydrofolate reductase (MTHFR) 677C>T polymorphism in this association. Design, Setting, and Participants: The Swedish National Study on Aging and Care in Kungsholmen is a cohort study of randomly selected individuals aged 60 years or older. The present study included data on 1969 individuals with complete information and without CV diseases at baseline, collected from the baseline examination (2001-2004) to the fourth follow-up (2013-2016). Data analysis was conducted from January to May 2019. Exposures: Concentrations of tHcy and Met were measured from nonfasting venous blood samples.
The Met:tHcy ratio was considered a possible indicator of methylation activity. MTHFR status was dichotomized as any T carriers vs noncarriers. Main Outcomes and Measures: The number of CV diseases at each wave was ascertained based on medical interviews and records, laboratory test results, and drug data. Linear mixed models were used to study the association between baseline tHcy and Met levels and the rate of CV multimorbidity development, adjusting for sociodemographic characteristics, CV risk factors, chronic disease burden, and drug use. Results: Of the 1969 participants, most were women (1261 [64.0%]), with a mean (SD) age of 70.9 (9.8) years; 1703 participants (86.6%) had at least a high school education. Baseline measurements of serum tHcy, Met, and the Met:tHcy ratio were associated with the rate of CV disease accumulation (tHcy: β = 0.023 per year; 95% CI, 0.015 to 0.030; P < .001; Met: β = −0.007 per year; 95% CI, −0.013 to −0.001; P = .02; Met:tHcy ratio: β = −0.017 per year; 95% CI, −0.023 to −0.011; P < .001). The association between low Met concentrations and the rate of CV multimorbidity development was restricted to the group with CT/TT alleles of MTHFR (β = 0.023 per year; 95% CI, 0.006 to 0.041; P = .009). Results remained largely significant when individual CV diseases were removed from the total count one at a time (eg, ischemic heart disease, tHcy: β = 0.023 per year; 95% CI, 0.013 to 0.027; P < .001; Met: β = −0.006 per year; 95% CI, −0.011 to −0.0003; P = .04; Met:tHcy ratio: β = −0.015 per year; 95% CI, −0.020 to −0.009; P < .001). Conclusions and Relevance: In this study, high tHcy and low Met levels were associated with faster CV multimorbidity development in older age. The interactive association of Met concentrations and the MTHFR polymorphism, together with the association found for the Met:tHcy ratio, points toward the relevance of impaired methylation in the pathogenesis of CV aging.
]]> <![CDATA[Assessment of a Prediction Model for Antidepressant Treatment Stability Using Supervised Topic Models]]> In the absence of readily assessed and clinically validated predictors of treatment response, pharmacologic management of major depressive disorder often relies on trial and error. Objective: To assess a model using electronic health records to identify predictors of treatment response in patients with major depressive disorder. Design, Setting, and Participants: This retrospective cohort study included data from 81 630 adults with a coded diagnosis of major depressive disorder from 2 academic medical centers in Boston, Massachusetts, including outpatient primary and specialty care clinics, from December 1, 1997, to December 31, 2017. Data were analyzed from January 1, 2018, to March 15, 2020. Exposures: Treatment with at least 1 of 11 standard antidepressants. Main Outcomes and Measures: Stable treatment response, intended as a proxy for treatment effectiveness, defined as continued prescription of an antidepressant for 90 days. Supervised topic models were used to extract 10 interpretable covariates from coded clinical data for stability prediction. Using data from 1 hospital system (site A), generalized linear models and ensembles of decision trees were trained to predict stability outcomes from topic features that summarize patient history. Held-out patients from site A and individuals from a second hospital system (site B) were used for evaluation. Results: Among the 81 630 adults (56 340 women [69%]; mean [SD] age, 48.46 [14.75] years; range, 18.0-80.0 years), 55 303 reached a stable response to their treatment regimen during follow-up. For held-out patients from site A, the mean area under the receiver operating characteristic curve (AUC) for discrimination of the general stability outcome was 0.627 (95% CI, 0.615-0.639) for the supervised topic model with 10 covariates. In the evaluation of site B, the AUC was 0.619 (95% CI, 0.610-0.627).
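The AUC values above measure discrimination: the AUC equals the probability that a randomly chosen patient who reached stability is scored higher by the model than one who did not. A minimal rank-based sketch (hypothetical scores, not study data):

```python
def auc(pos_scores, neg_scores):
    """AUC as the normalized Mann-Whitney U statistic:
    P(score of a positive case > score of a negative case), ties counted as 1/2."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model scores for responders (pos) and non-responders (neg)
example = auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])
```

An AUC of 0.5 is chance-level ranking; the study's values of roughly 0.62-0.66 indicate modest but above-chance discrimination.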
Building models to predict stability specific to a particular drug did not improve prediction of general stability, even when using a harder-to-interpret ensemble classifier and 9256 coded covariates (drug-specific AUC, 0.647; 95% CI, 0.635-0.658; general AUC, 0.661; 95% CI, 0.648-0.672). Topics coherently captured clinical concepts associated with treatment response. Conclusions and Relevance: The findings suggest that coded clinical data available in electronic health records may facilitate prediction of general treatment response but not response to specific medications. Although greater discrimination is likely required for clinical application, the results provide a transparent baseline for such studies. ]]> <![CDATA[Association of Pharmacist Prescription With Dispensed Duration of Hormonal Contraception]]> Since 2016, 11 states have expanded pharmacists’ scope of practice to include direct prescription of hormonal contraception. Dispensing more than a 1-month supply is associated with improved contraceptive continuation rates and fewer breaks in coverage. Scant data exist on the practice of pharmacist prescription of contraception and its outcomes compared with traditional clinic-based prescription. Objective: To compare the amount of hormonal contraceptive supply dispensed under pharmacist and clinic-based prescriptions. Prescribing patterns were assessed by describing prescribing practices for women with contraindications to combined hormonal contraception.
Characteristics of women seeking hormonal contraception directly from pharmacists were also described. Design, Setting, and Participants: This cohort study surveyed women aged 18 to 50 years who presented to pharmacies in California, Colorado, Hawaii, and Oregon for hormonal contraception prescribed by a clinician or a pharmacist between January 30 and November 1, 2019. Exposures: Pharmacist or clinic-based prescription of contraception. Main Outcomes and Measures: Months of contraceptive supply dispensed. Results: Four hundred ten women (mean [SD] age, 27.1 [7.7] years) were recruited who obtained contraception directly from a pharmacist (n = 144) or by traditional clinician prescription (n = 266). Women obtaining contraception from a pharmacist were significantly younger (82 [56.9%] vs 115 [43.2%] participants aged 18-24 years; P = .03), had less education (38 [26.4%] vs 100 [37.6%] with a bachelor’s degree; P = .002), and were more likely to be uninsured (16 [11.1%] vs 8 [3.0%] participants; P = .001) compared with women with a prescription from a clinician. Pharmacists were significantly more likely than clinicians to prescribe a 6-month or greater supply of contraceptives (6.9% vs 1.5%; P < .001) and significantly less likely to prescribe only a 1-month supply (42 [29.2%] vs 118 [44.4%] prescriptions; P < .001). Controlling for all covariates, women seen by pharmacists had higher odds of receiving a 6-month or greater supply of contraceptives compared with those seen by clinicians (odds ratio = 3.55; 95% CI, 1.88-6.70). Pharmacists were as likely as clinicians to prescribe a progestin-only method to women with a potential contraindication to estrogen (n = 60 women; 8 [20.0%] vs 6 [30.0%]; P = .52). Conclusions and Relevance: These findings suggest that pharmacist prescription of contraception may be associated with improved contraceptive continuation by preventing breaks in coverage through the provision of a greater supply of medication.
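An odds ratio with a Wald confidence interval, like the one reported above, can be computed from a 2×2 table; a minimal sketch with hypothetical counts (the study's estimate came from an adjusted model not shown in the abstract):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = exposed with/without the outcome, c/d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not the study's data
or_, lo, hi = odds_ratio_ci(a=20, b=124, c=8, d=258)
```

The interval is symmetric on the log scale, which is why the reported CI (1.88-6.70) is skewed around 3.55 on the natural scale.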
Efforts are needed to educate prescribing providers on the importance of dispensing a 6-month or greater contraceptive supply. ]]> <![CDATA[Dopaminergic contributions to behavioral control under threat of punishment in rats]]> Excessive intake of rewards, such as food and drugs, often has explicit negative consequences, including the development of obesity and addiction, respectively. Thus, choosing not to pursue reward is the result of a cost/benefit decision, proper execution of which requires inhibition of behavior. An extensive body of preclinical and clinical evidence implicates dopamine in certain forms of behavioral inhibition, but it is not fully known how dopamine contributes to behavioral inhibition under threat of explicit punishment. Objectives: To assess the involvement of midbrain dopamine neurons and their corticostriatal output regions, the ventral striatum and prefrontal cortex, in control over behavior under threat of explicit (foot shock) punishment in rats. Methods: We used a recently developed behavioral inhibition task, which assesses the ability of rats to exert behavioral restraint at the mere sight of food reward, under threat of foot shock punishment. Using in vivo fiber photometry, chemogenetics, c-Fos immunohistochemistry, and behavioral pharmacology, we investigated how dopamine (DA) neurons in the ventral tegmental area (VTA), as well as its output areas, the ventral striatum and prefrontal cortex, contribute to behavior in this task. Results: Using this multidisciplinary approach, we found little evidence for a direct involvement of ascending midbrain dopamine neurons in inhibitory control over behavior under threat of punishment. For example, photometry recordings suggested that VTA DA neurons do not directly govern control over behavior in the task, as no differences in neuronal population activity were observed during successful versus unsuccessful behavioral control.
In addition, chemogenetic and pharmacological manipulations of the mesocorticolimbic DA system had little or no effect on the animals’ ability to exert inhibitory control over behavior. Rather, the dopamine system appeared to play a role in the motivational components of reward pursuit. Conclusions: Together, our data provide insight into the mesocorticolimbic mechanisms behind motivated behaviors by showing a modulatory role of dopamine in the expression of cost/benefit decisions. In contrast to our expectations, dopamine did not appear to directly mediate the type of behavioral control that is tested in our task. Electronic supplementary material: The online version of this article (10.1007/s00213-020-05497-w) contains supplementary material, which is available to authorized users. ]]> <![CDATA[Effects of 5-HT<sub>2C</sub>, 5-HT<sub>1A</sub> receptor challenges and modafinil on the initiation and persistence of gambling behaviours]]> Problematic patterns of gambling are characterised by loss of control and persistent gambling, often to recover losses. However, little is known about the mechanisms that mediate initial choices to begin gambling and then to continue gambling in the face of losing outcomes. Objectives: These experiments first assessed gambling and loss-chasing performance under different win/lose probabilities in C57Bl/6 mice, and then investigated the effects of antagonism of the 5-HT2C receptor with SB242084, agonism of the 5-HT1A receptor with 8-OH-DPAT, and modafinil, a putative cognitive enhancer. Results: As seen in humans and other species, mice demonstrated the expected patterns of behaviour as the odds for winning were altered, increasing gambling and loss-chasing when winning was more likely. SB242084 decreased the likelihood of initially gambling, but had no effect on subsequent gambling choices in the face of repeated losses.
In contrast, 8-OH-DPAT had no effect on the choice to gamble in the first place, but once gambling had started, it increased gambling choices in a dose-sensitive manner. Modafinil’s effects differed from those of the serotonergic drugs, decreasing both the propensity to initiate gambling and the chasing of losses. Conclusions: We present evidence for dissociable effects of systemic drug administration on different aspects of gambling behaviour. These data extend and reinforce the importance of serotonergic mechanisms in mediating discrete components of gambling behaviour. They further demonstrate the ability of modafinil to reduce gambling behaviour. Our work using a novel mouse paradigm may be of utility in modelling the complex psychological and neurobiological underpinnings of gambling problems, including the analysis of genetic and environmental factors. Electronic supplementary material: The online version of this article (10.1007/s00213-020-05496-x) contains supplementary material, which is available to authorized users. ]]> <![CDATA[Chronic harmine treatment has a delayed effect on mobility in control and socially defeated rats]]> Depression is characterized by behavioral, cognitive and physiological changes, imposing a major burden on the overall wellbeing of the patient. Some evidence indicates that social stress, changes in growth factors (e.g., brain-derived neurotrophic factor (BDNF)), and neuroinflammation are involved in the development and progression of the disease. The monoamine oxidase A inhibitor harmine has been suggested to have both antidepressant and anti-inflammatory properties and may, therefore, be a potential candidate for the treatment of depression. Aim: The goal of this study was to assess the effects of harmine on behavior, brain BDNF levels, and microglia activation in control rats and in a rat model of social stress. Materials and methods: Rats were submitted to 5 consecutive days of repeated social defeat (RSD) or control conditions.
Animals were treated daily with harmine (15 mg/kg) or vehicle from day 3 until the end of the experiment. To assess the effects of harmine treatment on behavior, the sucrose preference test (SPT) was performed on days 1, 6, and 15, the open field test (OFT) on days 6 and 14, and the novel object recognition test (NOR) on day 16. Brain microgliosis was assessed using [11C]PBR-28 PET on day 17. Animals were terminated on day 17, and BDNF protein concentrations in the hippocampus and frontal cortex were analyzed using ELISA. Results: RSD significantly decreased bodyweight and increased anxiety- and anhedonia-related parameters in the OFT and SPT on day 6, but these behavioral effects were no longer observed on days 14/15. Harmine treatment caused a significant reduction in bodyweight gain in both groups, induced anhedonia in the SPT on day 6, and significantly reduced the mobility and exploratory behavior of the animals in the OFT, mainly on day 14. PET imaging and the NOR test did not show any significant effects on microglia activation and memory, respectively. BDNF protein concentrations in the hippocampus and frontal cortex were not significantly affected by either RSD or harmine treatment. Discussion: Harmine was not able to reverse the acute effects of RSD on anxiety and anhedonia and even aggravated the effect of RSD on bodyweight loss. Moreover, harmine treatment caused unexpected side effects on general locomotion, in both RSD and control animals, but did not influence glial activation status or BDNF concentrations in the brain. In this model, RSD-induced stress was not strong enough to induce long-term effects on behavior, neuroinflammation, or BDNF protein concentrations. Thus, the efficacy of harmine treatment on these delayed parameters needs to be further evaluated in more severe models of chronic stress.
]]> <![CDATA[Pro-cognitive effect of 1MeTIQ on recognition memory in the ketamine model of schizophrenia in rats: the behavioural and neurochemical effects]]> Schizophrenia is a mental illness characterised by positive and negative symptoms and by cognitive impairments. While the major prevailing hypothesis is that altered dopaminergic and/or glutamatergic transmission contributes to this disease, there is evidence that the noradrenergic system also plays a role in its major symptoms. Objectives: In the present paper, we investigated the pro-cognitive effect of 1-methyl-1,2,3,4-tetrahydroisoquinoline (1MeTIQ), an endogenous neuroprotective compound, in ketamine-modelled schizophrenia in rats. Methods: We used an antagonist of NMDA receptors (ketamine) to model memory deficit symptoms in rats. Using the novel object recognition (NOR) test, we investigated the pro-cognitive effect of 1MeTIQ. Additionally, olanzapine, an atypical antipsychotic drug, was used as a standard against which to compare the pro-cognitive effects of the substances. In vivo microdialysis studies allowed us to verify the changes in the release of monoamines and their metabolites in the rat striatum. Results: Our study demonstrated that 1MeTIQ, similarly to olanzapine, exhibits a pro-cognitive effect in the NOR test and enhances memory disturbed by ketamine treatment. Additionally, in vivo microdialysis studies showed that ketamine powerfully increased noradrenaline release in the rat striatum, while 1MeTIQ and olanzapine completely antagonised this neurochemical effect. Conclusions: 1MeTIQ, as a possible pro-cognitive drug, in contrast to olanzapine, expresses beneficial neuroprotective activity in the brain, increasing the concentration of the extraneuronal dopamine metabolite 3-methoxytyramine (3-MT), which plays an important physiological role in the brain as an inhibitory regulator of catecholaminergic activity.
Moreover, we demonstrated for the first time the essential role of noradrenaline release in the memory disturbances observed in the ketamine model of schizophrenia, and its possible participation in the negative symptoms of schizophrenia. ]]> <![CDATA[Interaction between behavioral inhibition and neural alcohol cue-reactivity in ADHD and alcohol use disorder]]> Compared with the general population, adult attention-deficit/hyperactivity disorder (ADHD) is more prevalent in patients with alcohol use disorder (AUD). Impaired behavioral inhibition is a common characteristic of both ADHD and AUD. Relapse risk is increased in patients with AUD and comorbid, untreated ADHD, and in AUD patients with increased neural cue-reactivity. Objectives: In this study, we examined the interaction between neural correlates of behavioral inhibition and alcohol cue-reactivity with a hybrid imaging task. Methods: Out of 69 adult study participants, we included n = 49 in our final analyses: individuals had a diagnosis of either AUD (n = 13), ADHD (n = 14), or both (n = 5), or were healthy controls (HC; n = 17). The functional magnetic resonance imaging paradigm was designed to examine the combined effects of an interference-inhibition task (“Simon task”) and an alcohol cue-reactivity task. Instead of segregating by diagnostic group, we pursued a dimensional approach in which we compared measures of AUD and ADHD severity, as well as their interaction, using multiple regression analyses. Results: The four groups did not differ at the behavioral level on either the inhibition task or the alcohol cue-reactivity task. However, brain activation in frontal control and reward-related regions during completion of the combined tasks was related to ADHD and AUD severity (symptom load).
During presentation of both alcohol cues and the inhibition task, participants with higher AUD and ADHD symptom load exhibited greater BOLD (blood oxygen level-dependent) responses in subcortical reward-related regions. Conclusions: Our findings support the hypothesis that ADHD further diminishes inhibition ability in individuals with AUD. This may increase relapse risk when patients are confronted with alcohol cues. Further, it is crucial in patients with comorbid AUD and ADHD to take into account not only reduced cognitive control over behavioral inhibition but also simultaneously heightened alcohol cue-reactivity. Electronic supplementary material: The online version of this article (10.1007/s00213-020-05492-1) contains supplementary material, which is available to authorized users. ]]> <![CDATA[Relationship between ABO blood groups and cardiovascular disease in type 1 diabetes according to diabetic nephropathy status]]> ABO blood groups have previously been associated with cardiovascular disease (CVD) in the general population. This study aimed to investigate the potential relationship between ABO blood groups and CVD in individuals with type 1 diabetes according to diabetic nephropathy (DN) status. Methods: Adults with type 1 diabetes (4531 individuals) from the FinnDiane Study were evaluated. DN was determined by two out of three measurements of urinary albumin excretion rate, with albuminuria defined as an excretion rate above 20 µg/min. CVD events were identified by linking the data with the Finnish Care Register for Health Care and the Finnish Cause of Death Register. Follow-up extended from the baseline visit until a CVD event, death, or the end of 2017. The impact of ABO blood groups on CVD risk was estimated by multivariable Cox regression analyses adjusted for traditional risk factors. Results: At baseline, the median age was 38.5 (IQR 29.2–47.9) years, 47.5% of participants were female, and the median duration of diabetes was 20.9 (IQR 11.4–30.7) years.
There were 893 incident ischemic heart disease (IHD) events, 301 ischemic strokes (IS), and 415 peripheral artery disease (PAD) events during a median follow-up of 16.5 (IQR 12.8–18.6) years. When microalbuminuria was present, the A blood group showed the highest risk of IHD relative to the O blood group. Comparing the population with microalbuminuria to those with normoalbuminuria, only the A blood group carried an elevated risk of IHD. This increased risk was explained neither by the FUT2 secretor phenotype nor by the A-genotype distribution. The risk of IS or PAD did not differ among the ABO blood groups, regardless of diabetic nephropathy stage. Conclusion: The A blood group is a risk factor for IHD in individuals with type 1 diabetes and microalbuminuria. ]]> <![CDATA[Time-varying and dose-dependent effect of long-term statin use on risk of type 2 diabetes: a retrospective cohort study]]> We evaluated the effect of statin use on new-onset type 2 diabetes among individuals without atherosclerotic cardiovascular disease (ASCVD) using nationally representative South Korean claims data (2002–2013, N = 1,016,820). Methods: A total of 13,698 patients (statin users 5273, non-statin users 5273) aged 40–74 years, newly diagnosed with dyslipidemia but without any history of diabetes or ASCVD, were selected in 2005. We followed the final sample until 2013 and evaluated the cumulative incidence of type 2 diabetes. We used extended Cox regression models to estimate the time-varying adjusted hazard ratios for the association of statin use with new-onset type 2 diabetes. We performed further analyses based on the cumulative defined daily dose (cDDD) of statin received per year to evaluate the degree of risk compared with non-statin users. Results: Over the mean follow-up period of 7.1 years, 3034 patients developed type 2 diabetes; the incidence among statin users exceeded that among non-users, indicating that statin use significantly increased the risk of new-onset type 2 diabetes.
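Hazard ratios like those reported in these studies are exponentiated Cox model coefficients; a minimal sketch of turning a log-hazard coefficient and its standard error into an HR with a Wald 95% CI (illustrative numbers, not from any study's model):

```python
import math

def hazard_ratio_ci(beta: float, se: float, z: float = 1.96):
    """HR = exp(beta) with Wald CI exp(beta ± z*se), where beta is a
    Cox proportional hazards coefficient and se its standard error."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient and standard error
hr, lo, hi = hazard_ratio_ci(beta=0.27, se=0.04)
```

Because the interval is built on the log-hazard scale, an HR's CI excludes 1.0 exactly when the coefficient's CI excludes 0.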
The risk of new-onset type 2 diabetes differed among statin users according to cumulative defined daily dose (cDDD) per year (adjusted HR = 1.31 [95% CI 1.18–1.46] for less than 30 cDDD per year; 1.58 [1.43–1.75] for 30–120 cDDD per year; 1.83 [1.62–2.08] for 120–180 cDDD per year; and 2.83 [2.51–3.19] for more than 180 cDDD per year). The diabetogenic effect of pitavastatin was not statistically significant, whereas the risk was largest for atorvastatin. Long-term exposure (≥ 5 years) to statins was associated with a statistically significant increase in the risk of new-onset type 2 diabetes for all statin subtypes explored, with the highest magnitude for simvastatin (HR = 1.916, 95% CI 1.647–2.228) followed by atorvastatin (HR = 1.830, 95% CI 1.487–2.252).ConclusionsStatin use was significantly associated with an increased risk of new-onset type 2 diabetes. We also found a dose–response relationship in terms of statin use duration and dose maintenance. Periodic screening and monitoring for incident type 2 diabetes may be warranted in long-term statin users. ]]> <![CDATA[Empagliflozin prevents doxorubicin-induced myocardial dysfunction]]> Empagliflozin showed efficacy in controlling glycaemia, leading to reductions in HbA1c levels, body weight, and blood pressure compared to standard treatment. Moreover, the EMPA-REG OUTCOME trial demonstrated a 14% reduction in major adverse cardiovascular events (MACE), a 38% reduction in cardiovascular (CV) death and a 35% reduction in the hospitalization rate for heart failure (HF). These beneficial effects on HF were apparently independent of glucose control. However, no mechanistic in vivo studies are yet available to explain these results. 
We aimed to determine the effect of empagliflozin on left ventricular (LV) function in a mouse model of doxorubicin-induced cardiomyopathy (DOX-HF).MethodsMale C57Bl/6 mice were randomly assigned to the following groups: controls (CTRL, n = 7), doxorubicin (DOX, n = 14), DOX plus empagliflozin (DOX + EMPA, n = 14), or DOX plus furosemide (DOX + FURO, n = 7). DOX was injected intraperitoneally. LV function was evaluated at baseline and after 6 weeks of treatment using high-resolution echocardiography with 2D speckle tracking (Vevo 2100). Histological assessment was performed using Haematoxylin and Eosin and Masson’s Goldner staining.ResultsA significant decrease in both systolic and diastolic LV function was observed after 6 weeks of treatment with doxorubicin. Ejection fraction (EF) dropped by 32% (p = 0.002), while longitudinal strain (LS) was reduced by 42% (p < 0.001) and circumferential strain (CS) by 50% (p < 0.001). However, LV function was significantly better in the DOX + EMPA group in terms of EF (61.30 ± 11% vs. 49.24 ± 8%, p = 0.007), LS (− 17.52 ± 3% vs. − 13.93 ± 5%, p = 0.04), and CS (− 25.75 ± 6% vs. − 15.91 ± 6%, p < 0.001). These results were not replicated in the DOX + FURO group. Hearts from the DOX + EMPA group showed a 50% lower degree of myocardial fibrosis compared to DOX mice (p = 0.03). No significant differences were found between the DOX + FURO and DOX groups (p = 0.103).ConclusionEmpagliflozin attenuates the cardiotoxic effects exerted by doxorubicin on LV function and remodelling in nondiabetic mice, independently of glycaemic control. These findings support the design of clinical studies to assess their relevance in a clinical setting. ]]> <![CDATA[Facedown Positioning Following Surgery for Large Full-Thickness Macular Hole]]> This randomized clinical trial examines the effect of postoperative facedown positioning on outcomes for large macular holes.
]]>
<![CDATA[Impact of Telephone-Based Care Coordination on Use of Cessation Medications Posthospital Discharge: A Randomized Controlled Trial]]> Smokers benefit from ongoing cessation support upon leaving the hospital and returning to their home environment. This study examined the impact of telephone-delivered care coordination on utilization of and adherence to cessation pharmacotherapy after hospital discharge.Methods:Inpatient smokers (n = 606) were randomized to receive counseling with care coordination (CCC) or counseling alone (C) for smoking cessation. Both groups received written materials and telephone-based cessation counseling during hospitalization and postdischarge. CCC recipients received help in selecting, obtaining, and refilling affordable pharmacotherapy prescriptions during and after hospitalization. Study outcomes included self-reported utilization, duration of use, and type of medication during the 3 months postdischarge.Results:Of the 487 participants (80%) completing 3-month follow-up, 211 (43.3%) reported using cessation pharmacotherapy postdischarge; this did not differ by study arm (CCC: 44.7%, C: 42.0%, p = .55). Use of pharmacotherapy postdischarge was associated with smoking at least 20 cigarettes/day at baseline (odds ratio [OR]: 1.48; 95% confidence interval [CI]: 1.00–2.19) and receipt of pharmacotherapy during hospitalization (OR: 4.00; 95% CI: 2.39–6.89). Smokers with Medicaid (OR: 2.29; 95% CI: 1.32–4.02) or other insurance (OR: 1.69; 95% CI: 1.01–2.86) were more likely to use pharmacotherapy postdischarge than those with no health care coverage. Fewer than one in four (23.8% of CCC; 22.2% of C) continued pharmacotherapy beyond 4 weeks.Conclusions:Supplemental care coordination did not improve use of postdischarge pharmacotherapy beyond that of inpatient treatment and behavioral counseling. 
Insurance coverage and use of medications during the hospitalization are associated with higher use of evidence-based treatment postdischarge.Implications:Many hospitalized smokers do not receive the benefits of cessation pharmacotherapy postdischarge, and telephone quitline programs often fail to help smokers procure pharmacotherapy. Thus, effective strategies are needed to improve utilization of and adherence to evidence-based cessation therapies when smokers leave the hospital. We found that use of postdischarge pharmacotherapy was strongly associated with receipt of pharmacotherapy during the hospitalization and with the availability of insurance to cover the costs of treatment. Additional efforts to coordinate pharmacotherapy services did not improve either utilization or adherence to therapy. ]]> <![CDATA[Impact of type 2 diabetes mellitus on mid-term mortality for hypertrophic cardiomyopathy patients who underwent septal myectomy]]> Type 2 diabetes mellitus is common in cardiovascular disease and is associated with adverse clinical outcomes for patients who have undergone coronary artery bypass and valve operations. The aim of this study was to evaluate the impact of type 2 diabetes mellitus on the mid-term outcomes of patients with hypertrophic cardiomyopathy who underwent septal myectomy.MethodsWe retrospectively analyzed the data of 67 hypertrophic cardiomyopathy patients with type 2 diabetes mellitus who underwent septal myectomy at two medical centers in China from 2011 to 2018. A propensity score–matched cohort of 134 patients without type 2 diabetes mellitus was also analyzed.ResultsDuring a median follow-up of 28.0 (interquartile range: 13.0–3.0) months, 9 patients died. All of these deaths were cardiovascular, including sudden cardiac death in 3 patients. Patients with type 2 diabetes mellitus had a higher rate of sudden cardiac death (4.5% vs. 0.0%, p = 0.04). 
The Kaplan–Meier survival analysis revealed that the rates of predicted 3-year survival free from cardiovascular death (98.1% vs. 95.1%, p = 0.14) were similar between the two groups. However, the rate of predicted 3-year survival free from sudden cardiac death (100% vs. 96.7%, p = 0.01) was significantly higher in hypertrophic cardiomyopathy patients without type 2 diabetes mellitus than in those with type 2 diabetes mellitus. Furthermore, after adjustment for age and sex, only N-terminal pro-brain natriuretic peptide (hazard ratio: 1.002, 95% confidence interval: 1.000–1.005, p = 0.02) and glomerular filtration rate ≤ 80 ml/min (hazard ratio: 3.23, 95% confidence interval: 1.34–7.24, p = 0.047) were independent risk factors for mortality in hypertrophic cardiomyopathy patients with type 2 diabetes mellitus.ConclusionsHypertrophic cardiomyopathy patients with and without type 2 diabetes mellitus have similar 3-year cardiovascular mortality after septal myectomy. However, type 2 diabetes mellitus is associated with a higher sudden cardiac death rate in these patients. In addition, N-terminal pro-brain natriuretic peptide and glomerular filtration rate ≤ 80 ml/min were independent risk factors for mortality among hypertrophic cardiomyopathy patients with type 2 diabetes mellitus. ]]> <![CDATA[Association of Individual-Level Factors With Visual Outcomes in Optic Neuritis]]> Using corticosteroids to treat acute demyelinating optic neuritis has been identified as an area for shared decision-making. However, no analysis exists to support personalized shared decision-making that considers long- and short-term treatment benefits.ObjectiveTo develop models of individual-level visual outcomes for patients with optic neuritis.Design, Setting, and ParticipantsThis secondary analysis of the Optic Neuritis Treatment Trial (ONTT), a randomized clinical trial, was performed at 14 academic eye centers and 1 large community eye center. 
Adults aged 18 to 46 years with incident acute unilateral optic neuritis within 8 days of vision loss onset were included. Data were collected from July 1988 to June 1991, downloaded on October 15, 2018, and analyzed from January 24, 2019, to February 20, 2020, using multivariable linear regression modeling.ExposuresIntravenous corticosteroids vs placebo.Main Outcomes and MeasuresVisual acuity (VA) at 1 year. Secondary outcomes were 1-year contrast sensitivity (CS) and VA and CS at 15 and 30 days. Independent variables included age, sex, race, multiple sclerosis status, optic neuritis episodes in the fellow eye, vision symptoms (days), pain, optic disc swelling, viral illness, treatment group, and baseline VA or CS.ResultsOf the 455 participants, median age was 31.8 (interquartile range [IQR], 26.3-37.0) years; 350 (76.9%) were women; and 388 (85.3%) were white. For 410 participants (90.1%) with 1-year outcomes, median VA improved from 20/66 (IQR, 20/28-20/630) at enrollment to 20/17 (IQR, 20/14-20/21) at 1 year. Baseline VA was the primary variable associated with 1-year VA (regression coefficient, 0.056 [95% CI, 0.008-0.103]; P = .02) if baseline VA was better than count fingers (CF). At 15 days, baseline VA and treatment status were associated with VA in those participants with baseline VA better than CF (regression coefficient, 0.305 [95% CI, 0.231-0.380]; F = 9.42; P < .001). However, the difference of medians (20/18 [95% CI, 20/17-20/19] with intravenous corticosteroids vs 20/23 [95% CI, 20/21-20/26] with placebo) was small for the median VA (20/66) in the trial. Treatment was not associated with 15-day or 1-year VA in participants with baseline VA of CF or worse.Conclusions and RelevanceIn this study, long-term VA was associated with severity of baseline vision loss. Early benefits with intravenous corticosteroid treatment were limited to participants with baseline VA better than CF. 
However, the early, temporary benefit of intravenous corticosteroids is of questionable clinical significance and should be weighed against potential harms. ]]> <![CDATA[Mode of Death Among Japanese Adults With Heart Failure With Preserved, Midrange, and Reduced Ejection Fraction]]> Despite intensive treatment, hospitalized patients with acute decompensated heart failure (ADHF) have a substantial risk of postdischarge mortality. Limited data are available on the possible differences in the incidence and mechanisms of death among patients with heart failure with reduced ejection fraction (HFrEF), heart failure with midrange ejection fraction (HFmrEF), and heart failure with preserved ejection fraction (HFpEF).ObjectivesTo examine the incidences and mode of postdischarge mortality among patients with ADHF and to compare the risk profile among patients with HFrEF, HFmrEF, and HFpEF.Design, Setting, and ParticipantsThis prospective cohort study of 4056 patients hospitalized for ADHF analyzed data from 3717 patients who were discharged from October 1, 2014, to March 31, 2016. Data analysis was performed from April 1 to August 31, 2019.ExposuresDeath among patients with ADHF after hospital discharge.Main Outcomes and MeasuresAll-cause death and cause of postdischarge mortality after the index hospitalization by left ventricular ejection fraction (LVEF) subgroup.ResultsA total of 3717 patients (mean [SD] age, 77.7 [12.0] years; 2049 [55.1%] male) were included in the study. The mean (SD) LVEF at baseline was 46.4% (16.2%). Among 3717 enrolled patients, 1383 (37.2%) were categorized as having HFrEF (LVEF, <40%), 703 (18.9%) as having HFmrEF (LVEF, 40%-49%), and 1631 (43.9%) as having HFpEF (LVEF, ≥50%). The incidence and causes of death were evaluated after discharge from the index hospitalization. The median follow-up period was 470 days (interquartile range, 357-649 days), and the 1-year follow-up rate was 96%. 
During follow-up, all-cause death occurred in 848 patients (22.8%; HFrEF group: 298 [21.5%; 95% CI, 19.5%-23.8%]; HFmrEF group: 158 [22.5%; 95% CI, 19.5%-25.7%]; and HFpEF group: 392 [24.0%; 95% CI, 22.0%-26.2%]; P = .26), cardiovascular deaths occurred in 523 patients (14.1%; HFrEF group: 203 [14.7%; 95% CI, 12.9%-16.6%]; HFmrEF group: 97 [13.8%; 95% CI, 11.4%-16.5%]; and HFpEF group: 223 [13.7%; 95% CI, 12.1%-15.4%]; P = .71), and sudden cardiac death occurred in 98 patients (2.6%; HFrEF group: 44 [3.2%; 95% CI, 2.4%-4.2%]; HFmrEF group: 14 [2.0%; 95% CI, 1.2%-3.3%]; and HFpEF group: 40 [2.5%; 95% CI, 1.8%-3.3%]; P = .23). The risk of each cause of death was similar among the subtypes.Conclusions and RelevanceThe mode of death was similar among the heart failure subtypes. Given the nonnegligible incidence of sudden cardiac death in patients with HFpEF found in this study, further studies appear warranted to identify a high-risk subset of this population. ]]>