ResearchPad - cohort-studies https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Psychological symptoms and quality of life after repeated exposure to earthquake: A cohort study in Italy]]> https://www.researchpad.co/article/elastic_article_13809 In 2005, a random sample of 200 people was assessed in Camerino, Italy, eight years after an earthquake. Psychological symptom levels were low, and only one person had current Post-Traumatic Stress Disorder (PTSD). In 2016, a new earthquake struck Camerino. The study aims to assess the impact of this second exposure in the same cohort. In a longitudinal design, 130 participants were re-interviewed between July and December 2017. Psychological symptoms were self-rated on the Brief Symptom Inventory (BSI), and the Global Severity Index (GSI) was analysed. Post-traumatic stress symptoms were self-rated on the Impact of Event Scale-Revised (IES-R). Subjective quality of life (SQOL) was assessed on the Manchester Short Assessment of Quality of Life (MANSA). Mean GSI and IES-R scores were significantly higher than in 2005 (p<0.01 and p<0.001, respectively), whilst SQOL remained almost unchanged (p = 0.163). In 2017, 16.9% of the sample had reached the PTSD threshold, whilst in 2005 only 0.5% had. Despite low symptom levels several years after an earthquake, people can show psychological distress after a new exposure, whilst average quality of life levels are not affected.

]]>
<![CDATA[Incidence and determinants of Implanon discontinuation: Findings from a prospective cohort study in three health zones in Kinshasa, DRC]]> https://www.researchpad.co/article/elastic_article_7634 Kinshasa is Africa's third largest city and one of the continent’s most rapidly growing urban areas. PMA2020 data showed that Kinshasa had a modern contraceptive prevalence of 26.5% among married women in 2018. In Kinshasa’s method mix, the contraceptive implant recently became the dominant method among contraceptive users married and in union. This study provides insight into patterns of implant use in a high-fertility setting by evaluating the 24-month continuation rate for Implanon NXT and identifying the characteristics associated with discontinuation.

Methodology

This community-based, prospective cohort study followed 531 Implanon users aged 18–49 years at 6, 12 and 24 months. The following information was collected: socio-demographic characteristics, Method Information Index (MII) and contraceptive history. The main outcome variable for this study was implant discontinuation. The incidence rate of discontinuation is presented as events per 1000 person-months (p-m), from the date of enrolment. Cox proportional hazards modelling was used to identify predictors of discontinuation.

Results

A total of 9158.13 p-m were available for analysis, with an overall incidence rate of 9.06 (95% CI: 9.04–9.08) removals per 1000 p-m. Of nine possible co-variates tested, the likelihood of discontinuation was higher among women who lived in military camps, had fewer than three children, had never used injectables or implants in the past, had experienced heavy/prolonged bleeding, and whose MII score was less than 3.

Conclusion

In addition to four client characteristics that predicted discontinuation, we identified one programmatic factor: quality of counseling as measured by the Method Information Index.
Community providers in similar contexts should pay more attention to clients with fewer than three children, to new adopters, and to clients living in military camps, an underserved population with less access to health facilities. More targeted counselling and follow-up are needed, especially on bleeding patterns. ]]> <![CDATA[Is it time to stop sweeping data cleaning under the carpet? A novel algorithm for outlier management in growth data]]> https://www.researchpad.co/article/N6ac4201b-e1d9-4dac-b706-1c6b88e127a6

All data are prone to error and require cleaning prior to analysis. An important example is longitudinal growth data, for which there are no universally agreed standard methods for identifying and removing implausible values, and many existing methods have limitations that restrict their usage across different domains. A decision-making algorithm that modified or deleted growth measurements based on a combination of pre-defined cut-offs and logic rules was designed. Five data cleaning methods for growth were tested with and without the addition of the algorithm and applied to five different longitudinal growth datasets: four uncleaned canine weight or height datasets and one pre-cleaned human weight dataset with randomly simulated errors. Prior to the addition of the algorithm, data cleaning based on non-linear mixed effects models was the most effective in all datasets, with on average at least 26.00% higher sensitivity and 0.12% higher specificity than the other methods. Data cleaning methods using the algorithm had improved data preservation and were capable of correcting simulated errors according to the gold standard, i.e. returning a value to its original state prior to error simulation. The algorithm improved the performance of all data cleaning methods and increased the average sensitivity and specificity of the non-linear mixed effects model method by 7.68% and 0.42%, respectively. Using non-linear mixed effects models combined with the algorithm to clean data allows individual growth trajectories to vary from the population by using repeated longitudinal measurements; identifies consecutive errors and errors in the first data entry; avoids requiring a minimum number of data entries; preserves data where possible by correcting errors rather than deleting them; and removes duplications intelligently.
This algorithm is broadly applicable to cleaning anthropometric data in different mammalian species and could be adapted for use in a range of other domains.
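As a rough illustration of the cut-off-plus-logic-rule idea, the sketch below flags values outside pre-defined plausibility bounds and attempts a unit/decimal correction before deleting. This is a toy with made-up bounds and rules, not the published algorithm, which also exploits longitudinal context:

```python
def clean_weights(weights, low=0.5, high=120.0):
    """Toy cut-off-plus-logic-rule cleaner for weight records (kg).

    Values outside [low, high] are tested against simple unit/decimal
    errors (grams entered instead of kg, misplaced decimal point); a
    value is corrected if a transformed value falls back inside the
    plausible range, and deleted (None) otherwise.
    """
    cleaned = []
    for w in weights:
        if low <= w <= high:
            cleaned.append(w)  # plausible as recorded
            continue
        for fix in (w / 1000, w / 10, w * 10):  # candidate corrections
            if low <= fix <= high:
                cleaned.append(fix)  # correct rather than delete
                break
        else:
            cleaned.append(None)  # implausible and uncorrectable
    return cleaned
```

Correcting rather than deleting is what gives the algorithm its data-preservation advantage over plain cut-off filters.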

]]>
<![CDATA[Long term outcomes and prognostics of visceral leishmaniasis in HIV infected patients with use of pentamidine as secondary prophylaxis based on CD4 level: a prospective cohort study in Ethiopia]]> https://www.researchpad.co/article/5c784fedd5eed0c48400792b

Background

The long-term treatment outcome of visceral leishmaniasis (VL) patients with HIV co-infection is complicated by a high rate of relapse, especially when the CD4 count is low. Although use of secondary prophylaxis is recommended, it is not routinely practiced and data on its effectiveness and safety are limited.

Methods

A prospective cohort study was conducted in Northwest Ethiopia from August 2014 to August 2017 (NCT02011958). HIV-VL patients were followed for up to 12 months. Patients with CD4 cell counts below 200/μL at the end of VL treatment received pentamidine prophylaxis starting one month after parasitological cure, while those with CD4 counts ≥200 cells/μL were followed without secondary prophylaxis. Compliance, safety and relapse-free survival were summarised, using Kaplan-Meier methods to account for variable time at risk. Risk factors for relapse or death were analysed.
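Kaplan-Meier estimation accounts for variable time at risk by discounting survival only at observed event times, treating earlier drop-outs as censored. A minimal product-limit sketch on synthetic follow-up data (not the study's):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times:  follow-up time for each subject
    events: 1 if the event (e.g. relapse/death) occurred, 0 if censored
    Returns (time, survival probability) pairs at each event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    exits = Counter(times)          # everyone leaves the risk set at their time
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(exits):
        if deaths[t]:
            surv *= 1 - deaths[t] / at_risk
            curve.append((t, surv))
        at_risk -= exits[t]         # deaths and censorings both exit here
    return curve
```

With 5 subjects followed [2, 3, 3, 5, 8] months and events [1, 1, 0, 1, 0], the estimate steps down to 0.8, 0.6 and 0.3 at months 2, 3 and 5.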

Results

Fifty-four HIV patients were followed. The probability of relapse-free survival at one year was 50% (95% confidence interval [CI]: 35–63%): 53% (30–71%) in 22 patients with CD4 ≥200 cells/μL without pentamidine prophylaxis and 46% (26–63%) in 29 with CD4 <200 cells/μL who started pentamidine. Three patients with CD4 <200 cells/μL did not start pentamidine. Amongst those with CD4 ≥200 cells/μL, VL relapse was an independent risk factor for subsequent relapse or death (adjusted rate ratio: 5.42, 95% CI: 1.1–25.8). Except for one case of renal failure which was considered possibly related to pentamidine, there were no drug-related safety concerns.

Conclusion

The relapse-free survival rate for VL patients with HIV was low. Relapse-free survival of patients with CD4 counts <200 cells/μL given pentamidine secondary prophylaxis appeared comparable to that of patients with CD4 counts ≥200 cells/μL not given prophylaxis. Patients with relapsed VL are at higher risk for subsequent relapse and should be considered a priority for secondary prophylaxis, irrespective of their CD4 count.

]]>
<![CDATA[How is women’s demand for caesarean section measured? A systematic literature review]]> https://www.researchpad.co/article/5c89779cd5eed0c4847d317e

Background

Caesarean section rates are increasing worldwide, and since the 2000s, several researchers have investigated women’s demand for caesarean sections.

Question

The aim of this article was to review and summarise published studies investigating caesarean section demand and to describe the methodologies, outcomes, country characteristics and country income levels in these studies.

Methods

This is a systematic review of studies published between 2000 and 2017 in French and English that quantitatively measured women’s demand for caesarean sections. We carried out a systematic search using the Medline database in PubMed.

Findings

The search strategy identified 390 studies, 41 of which met the final inclusion criteria, representing a total sample of 3 774 458 women. We identified two different study designs, i.e., cross-sectional studies and prospective cohort studies, that are commonly used to measure social demand for caesarean sections. Two different types of outcomes were reported, i.e., the preferences of pregnant or non-pregnant women regarding the method of childbirth in the future and caesarean delivery following maternal request. No study measured demand for caesarean section during the childbirth process. All included studies were conducted in middle- (n = 24) and high-income countries (n = 17), and no study performed in a low-income country was found.

Discussion

Measuring caesarean section demand is challenging, and the structural violence that leads to demand for caesarean section during childbirth, while in the labour ward, remains invisible. In addition, caesarean section demand in low-income countries remains unclear due to the lack of studies conducted in these countries.

Conclusion

We recommend conducting prospective cohort studies to describe the social construction of caesarean section demand. We also recommend conducting studies in low-income countries because demand for caesarean sections in these countries is rarely investigated.

]]>
<![CDATA[Late-life mortality is underestimated because of data errors]]> https://www.researchpad.co/article/5c65dcdbd5eed0c484dec3bf

Knowledge of the true mortality trajectory at extreme old ages is important for biologists who test their theories of aging with demographic data. Studies using both simulation and direct age validation found that longevity records for ages 105 years and older are often incorrect and may lead to spurious mortality deceleration and mortality plateaus. After age 105 years, longevity claims should be considered extraordinary claims that require extraordinary evidence. Traditional methods of data cleaning and data quality control are not sufficient. New, stricter methodologies of data quality control need to be developed and tested. Until then, all mortality estimates for ages above 105 years should be treated with caution.

]]>
<![CDATA[Use of non-insulin diabetes medicines after insulin initiation: A retrospective cohort study]]> https://www.researchpad.co/article/5c6dc9a1d5eed0c484529f41

Background

Clinical guidelines recommend that metformin be continued after insulin is initiated among patients with type 2 diabetes, yet little is known regarding how often metformin or other non-insulin diabetes medications are continued in this setting.

Methods

We conducted a retrospective cohort study to characterize rates and use patterns of six classes of non-insulin diabetes medications: biguanides (metformin), sulfonylureas, thiazolidinediones (TZDs), glucagon-like peptide 1 receptor agonists (GLP1 receptor agonists), dipeptidyl peptidase 4 inhibitors (DPP4 inhibitors), and sodium-glucose co-transporter 2 inhibitors (SGLT2 inhibitors), among patients with type 2 diabetes initiating insulin. We used the 2010–2015 MarketScan Commercial Claims and Encounters data, examining 72,971 patients with type 2 diabetes aged 18–65 years who initiated insulin and had filled a prescription for a non-insulin diabetes medication in the 90 days prior to insulin initiation. Our primary outcome was the proportion of patients refilling the various non-insulin diabetes medications during the first 90 days after insulin initiation. We also used time-to-event analysis to characterize the time to discontinuation of specific medication classes.

Results

Metformin was the most common non-insulin medication used prior to insulin initiation (N = 53,017, 72.7%), followed by sulfonylureas (N = 25,439, 34.9%) and DPP4 inhibitors (N = 8,540, 11.7%). More than four out of five patients (N = 65,902, 84.7%) refilled prescriptions for any non-insulin diabetes medications within 90 days after insulin initiation. Within that period, metformin remained the most common medication with the highest continuation rate of 84.6%, followed by SGLT2 inhibitors (81.9%) and TZDs (79.3%). Sulfonylureas were the least likely medications to be continued (73.6% continuation) though they remained the second most common medication class used after insulin initiation. The median time to discontinuation varied by therapeutic class from the longest time to discontinuation of 26.4 months among metformin users to the shortest (3.0 months) among SGLT2 inhibitor users.

Conclusion

While metformin was commonly continued among commercially insured adults starting insulin, rates of continuation of other non-insulin diabetes medications were also high. Further studies are needed to determine the comparative effectiveness and safety of continuing insulin secretagogues and newer diabetes medications after insulin initiation.

]]>
<![CDATA[Acute rhinosinusitis among pediatric patients with allergic rhinitis: A nationwide, population-based cohort study]]> https://www.researchpad.co/article/5c6c75e2d5eed0c4843d03bf

Background

While chronic rhinosinusitis is a common complication of allergic rhinitis, the link between acute rhinosinusitis and allergic rhinitis is unclear. The aim of this study was to evaluate the risk of incident acute rhinosinusitis among pediatric patients with allergic rhinitis, using a nationwide, population-based health claims research database.

Methods

Newly diagnosed allergic rhinitis patients aged 5–18 years were identified from the health claim records of the Longitudinal Health Insurance Database 2000 of Taiwan’s National Health Insurance Research Database. A comparison cohort was assembled by randomly selecting patients from the same database with frequency matching by sex, age group, and index year. All patients were followed until a diagnosis of acute rhinosinusitis or the end of the follow-up period. Cox proportional hazards model was used to assess the association between allergic rhinitis and acute rhinosinusitis.

Results

Of the 43,588 pediatric patients included in this study, 55.4% were male and 43.9% were aged 5.0–7.9 years. The risk of acute rhinosinusitis was significantly higher in pediatric patients with allergic rhinitis compared to those without the condition (adjusted hazard ratio = 3.03, 95% confidence interval = 2.89–3.18). Similar hazard ratios were observed between male and female pediatric patients.

Conclusions

This secondary cohort study using nationwide, population-based health claims data from Taiwan’s NHIRD showed that allergic rhinitis was significantly associated with a higher risk of acute rhinosinusitis among pediatric patients.

]]>
<![CDATA[Health outcomes for Australian Aboriginal and Torres Strait Islander children born preterm, low birthweight or small for gestational age: A nationwide cohort study]]> https://www.researchpad.co/article/5c76fe08d5eed0c484e5b2ea

Objective

To examine health outcomes in Australian Aboriginal and Torres Strait Islander children experiencing perinatal risk and identify protective factors in the antenatal period.

Methods

Data were drawn from the Baby/Child cohorts of the Longitudinal Study of Indigenous Children (born 2001–2008) across four annual surveys (ages 0–8 years, N = 1483). Children with ‘mild’ and ‘moderate-to-high’ perinatal risk were compared to children born at normal weight at term on maternal-rated global health and disability, and on interviewer-measured body mass index.

Results

Almost one third of children had experienced mild (22%) or moderate-to-high perinatal risk (8%). Perinatal risk was associated with lower body-mass-index z-scores (regression coefficients adjusted for pregnancy and environment factors: mild = -0.21, 95% CI = -0.34, -0.07; moderate-to-high = -0.42, 95% CI = -0.63, -0.21). Moderate-to-high perinatal risk was associated with poorer global health, with associations becoming less evident in models adjusted for pregnancy and environment factors; but not evident for disability. A range of protective factors, including cultural-based resilience and smoking cessation, were associated with lower risk of adverse outcomes.

Conclusions

Perinatal risks are associated with adverse health in Australian Aboriginal and Torres Strait Islander children, particularly lower body weight. Cultural-based resilience and smoking cessation may be two modifiable pathways to ameliorating health problems associated with perinatal risk.

]]>
<![CDATA[Community-, facility-, and individual-level outcomes of a district mental healthcare plan in a low-resource setting in Nepal: A population-based evaluation]]> https://www.researchpad.co/article/5c6f148bd5eed0c48467a299

Background

In low-income countries, care for people with mental, neurological, and substance use (MNS) disorders is largely absent, especially in rural settings. To increase treatment coverage, integration of mental health services into community and primary healthcare settings is recommended. While this strategy is being rolled out globally, rigorous evaluation of outcomes at each stage of the service delivery pathway from detection to treatment initiation to individual outcomes of care has been missing.

Methods and findings

A combination of methods was employed to evaluate the impact of a district mental healthcare plan for depression, psychosis, alcohol use disorder (AUD), and epilepsy as part of the Programme for Improving Mental Health Care (PRIME) in Chitwan District, Nepal. We evaluated 4 components of the service delivery pathway: (1) contact coverage of primary care mental health services, evaluated through a community study (N = 3,482 combined for all waves of community surveys) and through service utilisation data (N = 727); (2) detection of mental illness among participants presenting in primary care facilities, evaluated through a facility study (N = 3,627 combined for all waves of facility surveys); (3) initiation of minimally adequate treatment after diagnosis, evaluated through the same facility study; and (4) treatment outcomes of patients receiving primary-care-based mental health services, evaluated through cohort studies (total N = 449: depression, N = 137; AUD, N = 175; psychosis, N = 95; epilepsy, N = 42). The lack of structured diagnostic assessments (screening tools were used instead), the relatively small sample size for some study components, and the uncontrolled nature of the study are among the limitations to be noted. All data collection took place between 15 January 2013 and 15 February 2017. Contact coverage increased by 7.5 percentage points for AUD (from 0% at baseline), 12.2 for depression (from 0%), 11.7 for epilepsy (from 1.3%), and 50.2 for psychosis (from 3.2%) when using service utilisation data over 12 months; community survey results did not reveal significant changes over time. Health worker detection of depression increased by 15.7 percentage points (from 8.9% to 24.6%) 6 months after training, and by 10.3 (from 8.9% to 19.2%) 24 months after training; for AUD the increase was 58.9 percentage points (from 1.1% to 60.0%) at 6 months and 11.0 (from 1.1% to 12.1%) at 24 months.
Provision of minimally adequate treatment subsequent to diagnosis for depression was 93.9% at 6 months and 66.7% at 24 months; for AUD these values were 95.1% and 75.0%, respectively. Changes in treatment outcomes demonstrated small to moderate effect sizes (9.7-point reduction [d = 0.34] in AUD symptoms, 6.4-point reduction [d = 0.43] in psychosis symptoms, 7.2-point reduction [d = 0.58] in depression symptoms) at 12 months post-treatment.
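The effect sizes quoted above are standardised mean changes. One common convention, Cohen's d computed as mean change over the baseline SD, can be sketched as follows; the exact formula used by the paper is not given in the abstract, so both the convention and the numbers here are illustrative:

```python
from statistics import mean, stdev

def cohens_d(baseline, followup):
    """Standardised mean change: mean pre-post difference divided by
    the baseline SD (one common convention; others use the SD of the
    change scores or a pooled SD)."""
    diffs = [b - f for b, f in zip(baseline, followup)]
    return mean(diffs) / stdev(baseline)

# Hypothetical symptom scores for five patients, pre and post treatment.
d = cohens_d([20, 24, 18, 22, 26], [14, 18, 13, 16, 19])
```

Here the mean reduction is 6 points against a baseline SD of √10 ≈ 3.16, giving d ≈ 1.9; the 0.34–0.58 values above correspond to much smaller reductions relative to scale spread.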

Conclusions

These combined results make a promising case for the feasibility and impact of community- and primary-care-based services delivered through an integrated district mental healthcare plan in reducing the treatment gap and increasing effective coverage for MNS disorders. While the integrated mental healthcare approach does lead to apparent benefits in most of the outcome metrics, there are still significant areas that require further attention (e.g., no change in community-level contact coverage, attrition in AUD detection rates over time, and relatively low detection rates for depression).

]]>
<![CDATA[HIV virologic failure and its predictors among HIV-infected adults on antiretroviral therapy in the African Cohort Study]]> https://www.researchpad.co/article/5c633958d5eed0c484ae6500

Introduction

The 2016 WHO consolidated guidelines on the use of antiretroviral drugs define HIV virologic failure for low- and middle-income countries (LMIC) as plasma HIV-RNA ≥ 1000 copies/mL. We evaluated virologic failure and its predictors in four African countries.

Materials and methods

We included HIV-infected participants on a WHO recommended antiretroviral therapy (ART) regimen and enrolled in the African Cohort Study between January 2013 and October 2017. Studied outcomes were virologic failure (plasma HIV-RNA ≥ 1000 copies/mL at the most recent visit), viraemia (plasma HIV-RNA ≥ 50 copies/mL at the most recent visit), and persistent viraemia (plasma HIV-RNA ≥ 50 copies/mL at two consecutive visits). Generalized linear models were used to estimate relative risks with their 95% confidence intervals.
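The three outcome definitions reduce to threshold rules on a participant's visit history; a sketch of that classification (function name and structure are illustrative, not from the study):

```python
def classify(visits):
    """Apply the WHO-style HIV-RNA outcome definitions.

    visits: chronological plasma HIV-RNA values in copies/mL;
    the last entry is the most recent visit.
    """
    latest = visits[-1]
    return {
        "virologic_failure": latest >= 1000,   # most recent visit
        "viraemia": latest >= 50,              # most recent visit
        "persistent_viraemia": len(visits) >= 2
            and visits[-1] >= 50 and visits[-2] >= 50,  # two consecutive
    }
```

Note that virologic failure implies viraemia but not persistent viraemia, which is why the abstract can report that only 57.5% of persistently viraemic participants had confirmed failure.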

Results

2054 participants were included in this analysis. Viraemia, persistent viraemia and virologic failure were observed in 396 (19.3%), 160 (7.8%) and 184 (9%) participants, respectively. Of the participants with persistent viraemia, only 57.5% (92/160) had confirmed virologic failure. In the multivariate analysis, attending a clinical care site other than the Uganda site, being on 2nd-line ART (aRR 1.8, 95% CI 1.28–2.66), being on other ART combinations (neither first- nor second-line; aRR 3.8, 95% CI 1.18–11.9), a history of fever in the past week (aRR 3.7, 95% CI 1.69–8.05), a low CD4 count (aRR 6.9, 95% CI 4.7–10.2) and missing any day of ART (aRR 1.8, 95% CI 1.27–2.57) increased the risk of virologic failure. Being on 2nd-line therapy, the site where one receives care and a CD4 count < 500 predicted viraemia, persistent viraemia and virologic failure.

Conclusion

In conclusion, these findings demonstrate that HIV-infected patients established on ART for more than six months in the African setting frequently experienced viraemia while continuing on ART. The findings also show that being on second-line therapy, a low CD4 count, missing any day of ART and a history of fever in the past week remain important predictors of virologic failure that should trigger intensified adherence counselling, especially in the absence of reliable or readily available viral load monitoring. Finally, the clinical care sites differ, calling for further analyses to elucidate the unique features of these sites.

]]>
<![CDATA[Health administrative data enrichment using cohort information: Comparative evaluation of methods by simulation and application to real data]]> https://www.researchpad.co/article/5c5ca2efd5eed0c48441edf1

Background

Studies using health administrative databases (HAD) may lead to biased results since information on potential confounders is often missing. Methods that integrate confounder data from cohort studies, such as multivariate imputation by chained equations (MICE) and two-stage calibration (TSC), aim to reduce confounding bias. We provide new insights into their behavior under different deviations from representativeness of the cohort.

Methods

We conducted an extensive simulation study to assess the performance of these two methods under different deviations from representativeness of the cohort. We illustrate these approaches by studying the association between benzodiazepine use and fractures in the elderly using the general sample of French health insurance beneficiaries (EGB) as main database and two French cohorts (Paquid and 3C) as validation samples.

Results

When the cohort was representative of the same population as the HAD, both methods were unbiased. TSC was more efficient and faster, but its variance could be slightly underestimated when confounders were non-Gaussian. When the cohort was a subsample of the HAD (internal validation) and the probability of a subject being included in the cohort depended on both exposure and outcome, MICE was unbiased while TSC was biased. Both methods were biased when the inclusion probability in the cohort depended on unobserved confounders.

Conclusion

When choosing the most appropriate method, epidemiologists should consider the origin of the cohort (internal or external validation) as well as the (anticipated or observed) selection biases of the validation sample.

]]>
<![CDATA[Increased risk of rheumatoid arthritis among patients with Mycoplasma pneumonia: A nationwide population-based cohort study in Taiwan]]> https://www.researchpad.co/article/5c466572d5eed0c4845193ac

Objective

An association between Mycoplasma pneumonia (MP) and rheumatoid arthritis (RA) had been reported in animal studies for decades. However, clinical evidence for this association is lacking. Therefore, this study aimed to provide epidemiologic evidence to clarify the relationship between MP and development of RA.

Methods

This 13-year nationwide, population-based, retrospective cohort study analyzed the risk of RA in a cohort of MP patients. We cross-linked and compared these data with the database of patients with catastrophic illnesses to ensure that RA diagnoses were correctly labeled. We selected 116,053 hospitalized patients diagnosed with MP between 2000 and 2012 from the Taiwan National Health Insurance Research Database, and 464,212 controls matched at a 1:4 ratio by age, gender and index year, to assess the risk of developing RA. The follow-up period ran from the initial diagnosis of MP until the date of RA diagnosis, censoring, or 31st December 2013. The Cox proportional hazard model was used to analyze the association between MP and the incidence of RA among patients with different potential risks.

Results

The adjusted hazard ratio (HR) for incident RA in the MP group was 1.37 (95% confidence interval [CI] = 0.87–2.16) compared to non-MP controls. Stratified analysis revealed that the adjusted HR was 3.05 (95% CI = 1.16–7.99, p = 0.02) in the subgroup of patients over the age of 65. Within the first 2 years of follow-up, the adjusted HR of RA for the MP group was 3.19 (95% CI = 1.04–9.76) among those aged ≤19 years and 4.14 (95% CI = 1.27–13.4) among those aged ≥65 years.

Conclusion

This cohort study demonstrated that patients with MP had a higher risk of developing RA, especially in the first 2 years, in those aged younger than 19 and over 65.

]]>
<![CDATA[The value of vital sign trends in predicting and monitoring clinical deterioration: A systematic review]]> https://www.researchpad.co/article/5c478c7ed5eed0c484bd2aa6

Background

Vital signs, i.e. respiratory rate, oxygen saturation, pulse, blood pressure and temperature, are regarded as an essential part of monitoring hospitalized patients. Changes in vital signs prior to clinical deterioration are well documented and early detection of preventable outcomes is key to timely intervention. Despite their role in clinical practice, how to best monitor and interpret them is still unclear.

Objective

To evaluate the ability of vital sign trends to predict clinical deterioration in patients hospitalized with acute illness.

Data Sources

PubMed, Embase, Cochrane Library and CINAHL were searched in December 2017.

Study Selection

Studies examining intermittently monitored vital sign trends in acutely ill adult patients on hospital wards and in emergency departments. Outcomes representing clinical deterioration were of interest.

Data Extraction

Performed separately by two authors using a preformed extraction sheet.

Results

Of 7,366 references screened, only two were eligible for inclusion. Both were retrospective cohort studies without controls. One examined the accuracy of different vital sign trend models using discrete-time survival analysis in 269,999 admissions; the other examined trends in Vitalpac Early Warning Score-weighted vital signs in 44,531 medical admissions. Both stated that vital sign trends increased detection of clinical deterioration. Critical appraisal was performed using evaluation tools. The studies had a moderate risk of bias and a low certainty of evidence. Additionally, four studies examining trends in early warning scores, otherwise eligible for inclusion, were evaluated.

Conclusions

This review illustrates a lack of research on intermittently monitored vital sign trends. The included studies, although heterogeneous and imprecise, indicate an added value of trend analysis. This highlights the need for well-controlled trials to thoroughly assess the research question.

]]>
<![CDATA[Change in skeletal muscle associated with unplanned hospital admissions in adult patients: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c390bc5d5eed0c48491e3f6

Objectives

The primary objective of the review was to describe change that occurs in skeletal muscle during periods of unplanned hospitalisation in adult patients. The secondary objective was to examine the relationship between both physical activity and inflammation with the change in skeletal muscle. A further objective was to investigate the effect of interventions on change in skeletal muscle during periods of unplanned hospitalisation.

Design

A systematic review and meta-analyses. Embase, MEDLINE, CINAHL, AMED, PEDro and the Cochrane Library were searched for studies that included any measures of skeletal muscle (excluding pulmonary function) at two time points during unplanned hospitalisation. Studies that were set in critical care, or included patients with acute or progressive neurological illness, were excluded.

Results

Our search returned 27,809 unique articles, of which 35 met the inclusion criteria. Meta-analyses of change between baseline and follow-up in random effects models suggested that grip strength had an average increase: standardised mean difference (SMD) = 0.10 (95% CI: 0.03; 0.16); knee extension strength had an average reduction: SMD = -0.24 (95% CI: -0.33; -0.14); and mid-arm muscle circumference had an average reduction: SMD = -0.17 (95% CI: -0.22; -0.11). Inflammation appeared to be associated with greater loss of muscle strength. There was inconclusive evidence that the level of physical activity affects change in skeletal muscle. In regard to the effect of interventions, only exercise interventions were consistently associated with improved skeletal muscle outcomes.
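For orientation, the pooling behind these standardised mean differences is inverse-variance weighting across studies. The sketch below shows the fixed-effect simplification with made-up inputs; the review's random-effects models additionally add a between-study variance term to each study's variance:

```python
def pooled_smd(smds, variances):
    """Inverse-variance (fixed-effect) pooled SMD with a 95% CI.

    smds:      per-study standardised mean differences
    variances: per-study variances of those SMDs
    """
    weights = [1 / v for v in variances]  # precision weighting
    est = sum(w * d for w, d in zip(weights, smds)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5        # SE of the pooled estimate
    return est, (est - 1.96 * se, est + 1.96 * se)
```

Two equally precise studies with SMDs 0.2 and 0.1 pool to 0.15, with a CI narrower than either study's alone.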

Conclusions

Adult patients who undergo an unplanned hospital admission may experience a small reduction in knee extension strength and mid-arm muscle mass. Prospective research is needed to clarify the contribution of confounding factors underlying the observations made in this review, with particular attention to levels of physical activity, and possible contributions from environmental factors and processes of hospital care.

]]>
<![CDATA[Effect of anterior capsule polish on visual function: A meta-analysis]]> https://www.researchpad.co/article/5c3e4f7ad5eed0c484d75cec

Purpose

To investigate the relationship between anterior capsule polish and visual function.

Methods

Data were obtained from PubMed, Embase, Web of Science, WanFang, VIP and CNKI up to the end of May 2018, without date or language restrictions. The modified Jadad scale and the Newcastle-Ottawa scale were used to assess the quality of the included studies. Uncorrected visual acuity (UCVA) and posterior capsule opacification (PCO) were used as outcome variables. Data on anterior capsule polish were pooled using weighted, random-effects meta-analysis.

Results

One randomized controlled trial and 4 observational cohort studies involving 2533 patients were included in the analyses. There was a statistically significant difference in UCVA (OR 1.92, 95% CI 1.41–2.61) between the polish group and the control group, indicating that anterior capsule polish improved UCVA. Further studies with continuous data also suggested that anterior capsule polish was associated with good UCVA (MD 0.11, 95% CI 0.06–0.16). PCO rates for follow-up of 1 year or longer were extracted for 2561 eyes in 3 studies. The PCO rate was lower in the anterior capsule polish group according to the summary odds ratio (OR 0.42, 95% CI 0.24–0.73).

Conclusions

Anterior capsule polish prevents posterior capsule opacification, a complication of modern cataract surgery, and benefits visual function in the short-term follow-up period.

]]>
<![CDATA[Errors as a primary cause of late-life mortality deceleration and plateaus]]> https://www.researchpad.co/article/5c254543d5eed0c48442c3c0

Several organisms, including humans, display a deceleration in mortality rates at advanced ages. This mortality deceleration is sufficiently rapid to allow late-life mortality to plateau in old age in several species, causing the apparent cessation of biological ageing. Here, it is shown that late-life mortality deceleration (LLMD) and late-life plateaus are caused by common demographic errors. Age estimation and cohort blending errors introduced at rates below 1 in 10,000 are sufficient to cause LLMD and plateaus. In humans, observed error rates of birth and death registration predict the magnitude of LLMD. Correction for these sources of demographic error using a mixed linear model eliminates LLMD and late-life mortality plateaus (LLMPs) without recourse to biological or evolutionary models. These results suggest models developed to explain LLMD have been fitted to an error distribution, that ageing does not slow or stop during old age in humans, and that there is a finite limit to human longevity.

]]>
<![CDATA[Age threshold for recommending higher protein intake to prevent age-related muscle weakness: A cross-sectional study in Japan]]> https://www.researchpad.co/article/5c1ab85cd5eed0c484027ba4

Although insufficient dietary protein intake is a known risk factor for age-related muscle weakness, the age at which higher protein intake becomes necessary to prevent muscle weakness is yet to be determined. Using a population-based panel survey of community-dwelling people aged 50–75 years, this cross-sectional study aimed to find the age threshold at which a higher protein intake is associated with higher muscle strength. We utilized a dataset from the Japanese Study of Aging and Retirement conducted between 2007 and 2011. Dietary protein intake was estimated using a validated dietary questionnaire and energy-adjusted via the density method. Grip strength was measured using a Smedley-type handheld dynamometer. We calculated the marginal effect (and 95% confidence intervals) of protein intake on grip strength with stratification by age using multiple linear regression analyses with robust variance, adjusting for potential confounders. There were 9,485 observations from 5,790 participants in the final analysis. Marginal effects of protein intake on grip strength increased with age, reaching significance, with a positive impact, only among men aged ≥75 years and women aged ≥65 years. For each additional 1% of energy from protein intake, grip strength increased by 0.10 kg and 0.19 kg for men and women aged ≥75 years, respectively. Our results suggest that women may need a higher protein intake from a younger age than men. Further studies are needed to clarify the age from which a higher protein intake should be recommended to prevent muscle weakness.
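The density method mentioned above expresses protein intake as a percentage of total energy, using the standard 4 kcal/g Atwater factor for protein. A minimal sketch of that adjustment (the intake figures are hypothetical, not the study's data):

```python
PROTEIN_KCAL_PER_G = 4  # Atwater energy factor for protein

def protein_percent_energy(protein_g, total_kcal):
    """Density-method energy adjustment: protein as a % of total energy intake."""
    return protein_g * PROTEIN_KCAL_PER_G / total_kcal * 100

# hypothetical intake: 70 g protein within a 2000 kcal/day diet
pct = protein_percent_energy(70, 2000)  # 14.0% of energy from protein
```

Expressing intake this way is what makes the reported "additional 1% energy of protein" comparable across participants with different total energy intakes.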

]]>
<![CDATA[Variation in severe postpartum hemorrhage management: A national vignette-based study]]> https://www.researchpad.co/article/5c1c0ac6d5eed0c484426a7b

Objectives

To assess variations in the management of severe postpartum hemorrhage: 1) between obstetricians in the same situation and 2) by the same obstetrician in different situations.

Study design

A link to a vignette-based survey was emailed to obstetricians of 215 maternity units; the questionnaire asked them to report how they would manage the postpartum hemorrhage (PPH) described in 2 previously validated case-vignettes presenting different scenarios of severe PPH. Vignette 1 described a typical immediate, severe PPH, and vignette 2 a less typical case of severe but gradual PPH. The vignettes were constructed in 3 successive steps and included multiple-choice questions proposing several types of clinical practice options at each step. Variations in PPH management were assessed in a descriptive analysis; agreement about management and its timing between vignette 1 and vignette 2 was assessed with the Kappa coefficient.
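Agreement between an obstetrician's answers on the two vignettes was quantified with the Kappa coefficient. A minimal sketch of Cohen's kappa for paired categorical answers (the answer labels below are hypothetical, not the survey's actual response options):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two paired ratings, corrected for chance."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # observed proportion of exact agreement
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # expected agreement if the two sets of answers were independent
    pe = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
             for c in categories)
    return (po - pe) / (1 - pe)

# hypothetical first-line choices on vignette 1 vs. vignette 2
v1 = ["oxytocin", "oxytocin", "alert team", "oxytocin", "alert team", "transfuse"]
v2 = ["oxytocin", "alert team", "alert team", "oxytocin", "oxytocin", "transfuse"]
kappa = cohens_kappa(v1, v2)
```

Kappa of 1 indicates perfect consistency across the two vignettes, 0 indicates agreement no better than chance; values near 0 are what underlie the "inconsistent" management the authors report.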

Results

Analysis of complete responses from 119 (43.4%) obstetricians from 53 (24.6%) maternity units showed delayed or inadequate management in both vignettes. While 82.3% and 83.2% of obstetricians (in vignettes 1 and 2, respectively) would administer oxytocin 15 minutes after PPH diagnosis, only 52.9% and 29.4% would alert other team members. Management by obstetricians of the two vignette situations was inconsistent in terms of choice of treatment and timing of almost all treatments.

Conclusion

Case vignettes demonstrated inadequate management as well as variations in management between obstetricians and across different PPH situations. Protocols or procedures are necessary in all maternity units to reduce the variations in practice that may explain part of the delay in management that leads to PPH-related maternal mortality and morbidity.

]]>
<![CDATA[Hospitalization and post-discharge care in South Africa: A critical event in the continuum of care]]> https://www.researchpad.co/article/5c1c0a84d5eed0c484426284

Objectives

The purpose of this prospective cohort study is to characterize the event of acute hospitalization for people living with and without HIV and describe its impact on the care continuum. This study describes care-seeking behavior prior to an index hospitalization, inpatient HIV testing and diagnosis, discharge instructions, and follow-up care for patients being discharged from a single hospital in South Africa.

Methods

A convenience sample of adult patients was recruited from the medical wards of a tertiary care facility. Baseline information at the time of hospital admission, subsequent diagnoses, and discharge instructions were recorded. Participants were prospectively followed with phone calls for six months after hospital discharge. Descriptive analyses were performed.

Results

A total of 293 participants were enrolled in the study. Just under half (46%) of the participants were known to be living with HIV at the time of hospital admission. Most participants (97%) were given a referral for follow-up care; often that appointment was scheduled within two weeks of discharge (64%). Only 36% of participants returned to care within the first month, 50% returned after at least one month had elapsed, and 14% of participants did not return for any follow-up.

Conclusions

Large discrepancies were found between the type of post-discharge follow-up care recommended by providers and what patients were able to achieve. The period of time following hospital discharge represents a key transition in care. Additional research is needed to characterize patients’ risk following hospitalization and to develop patient-centered interventions.

]]>