ResearchPad - risk-management https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks
<![CDATA[Longitudinal analysis of cost and dental utilization patterns for older adults in outpatient and long-term care settings in Minnesota]]> https://www.researchpad.co/article/elastic_article_14553

Background

Dental utilization patterns and the costs of providing comprehensive oral healthcare for older adults in different settings have not been examined.

Methods

Retrospective longitudinal cohort data from Apple Tree Dental (ATD) were analyzed (N = 1,159 total; 503 outpatients, 656 long-term care residents) to describe oral health status at presentation, service utilization patterns, and care costs. Generalized estimating equation (GEE) repeated measures analysis identified significant contributors to service cost over the three-year study period.

Results

The cohort mean age was 74 years (range = 55–104); the outpatient (OP) group was younger than the long-term care (LTC) group. Just over half (56%) had Medicaid, 22% had other insurance, and 22% self-paid. Most (72%) had functional dentitions (20+ teeth), 15% had impaired dentitions (9–19 teeth), 6% had severe tooth loss (1–8 teeth), and 7% were edentulous (OP = 2%, LTC = 11%). More patients in the OP group had functional dentition (83% vs. 63% in LTC). The number of appointments declined from 5.0 in Year 1 (OP = 5.7, LTC = 4.4) to 3.3 in Year 3 (OP = 3.6, LTC = 3.0). The average cost of providing dental services was $1,375/year over the three years (OP = $1,427, LTC = $1,336), and costs declined each year, from an average of $1,959 (OP = $2,068, LTC = $1,876) in Year 1 to $1,016 (OP = $989, LTC = $1,037) by Year 3. Patients with functional dentition at presentation were significantly less costly than those with 1–19 teeth, while edentulous patients had the lowest cost and utilization. Year in treatment, insurance type, dentition type, and a problem-focused first exam were significantly associated with year-over-year cost change in both OP and LTC patients.

Conclusion

Costs for providing comprehensive dental care in OP and LTC settings were similar, modest, and declined over time. Dentate patients with functional dentition and edentulous patients were less costly to treat. LTC patients had lower utilization than OP patients. Care patterns shifted over time toward more preventive care and fewer restorative care visits.

]]>
<![CDATA[Why pandemic response is unique: powerful experts and hands-off political leaders]]> https://www.researchpad.co/article/N777a5214-aec1-4675-9510-3b88967dc8c4

Purpose

The purpose of this paper is to show that 2009 H1N1 “swine” influenza pandemic vaccination policies deviated from predictions established in the theory of political survival, and to propose that pandemic response deviated because it was ruled by bureaucratized experts rather than by elected politicians.

Design/methodology/approach

Focussing on the 2009 H1N1 pandemic, the paper employs descriptive statistical analysis of vaccination policies in nine western democracies. To probe the plausibility of the novel explanation, it uses quantitative and qualitative content analyses of media attention and coverage in two deviant cases, the USA and Denmark.

Findings

Theories linking political survival to disaster responses find little empirical support in the substantial cross-country variation in vaccination responses during the 2009 H1N1 pandemic. Rather than following a political logic, the case studies of media coverage in the USA and Denmark demonstrate that the response was bureaucratized within the public health agencies (CDC and DMHA, respectively). Hence, while natural disaster responses appear to follow a political logic, the response to pandemics appears to rest more firmly in the hands of bureaucratic experts.

Research limitations/implications

There is added value in incorporating bureaucratic dynamics into political theories of disaster response; bureaucratized expertise proved to be a strong and plausible explanation of the 2009 pandemic vaccination response.

Practical implications

Pandemic preparedness and response depend critically on understanding the lessons of the 2009 H1N1 pandemic; a key lesson supported by this paper is that expert-based agencies, rather than political leaders, are the pivotal actors.

Originality/value

This paper is the first to pinpoint the limitations of political survival theories of disaster responses with respect to the 2009 pandemic. Further, it is among the few to analyze the causes of variations in cross-country pandemic vaccination policies during the 2009 H1N1 pandemic.

]]>
<![CDATA[The “wicked problems” of governing UK health security disaster prevention]]> https://www.researchpad.co/article/Nb3e15b1a-82be-4f74-92da-aab246f1f583

Purpose

The purpose of this paper is to examine the governance and policy-making challenges in the context of “wicked problems” based on the case of pandemic influenza.

Design/methodology/approach

The case study research is based on an analysis of official documentation and interviews with policy elites at multiple levels of UK governance.

Findings

Results of this study show that policy actors regard risk communication, the dynamics of international public policy and UK territorial governance as the main governance challenges in the management of influenza at a macro-level. The paper also identifies that although contingency management for epidemiological issues requires technical and scientific considerations to feature in governance arrangements, there are equally key “wicked problems” in the context of public policy that pervade the health security sector.

Practical implications

The study indicates the need to build in resources at a national level to plan for policy coordination challenges in areas that might at first be seen as devoid of political machinations (such as technical areas of public policy that might be underpinned by epidemiological processes). The identification of the major governance challenges that emerge from the pandemic influenza case study is a springboard for a research agenda analyzing the parallels and paradoxes of governance challenges for health security across EU member states.

Originality/value

This paper offers a novel interrogation of the pandemic influenza case study in the context of UK governance and public policy by providing a strategic policy lens from the perspective of elites.

]]>
<![CDATA[Distinguishing moral hazard from access for high-cost healthcare under insurance]]> https://www.researchpad.co/article/N9aa1c21e-eb0c-47d9-9336-743c9eef5b98

Context

Health policy has long been preoccupied with the problem that health insurance stimulates spending (“moral hazard”). However, much health spending is costly healthcare that uninsured individuals could not otherwise access. Field studies comparing those with more or less insurance cannot disaggregate moral hazard from access. Moreover, studies of patients consuming routine low-dollar healthcare are not informative about the high-dollar healthcare that drives most aggregate healthcare spending in the United States.

Methods

We test indemnities as an alternative theory-driven counterfactual. Such conditional cash transfers would maintain an opportunity cost for patients, unlike standard insurance, but also guarantee access to care. Since indemnities do not exist in U.S. healthcare, we fielded two blinded vignette-based survey experiments with 3,000 respondents, randomized to eight clinical vignettes and three insurance types. Our replication uses a population that is weighted to national demographics on three dimensions.

Findings

Most or all of the spending due to insurance would occur even under an indemnity. The waste attributable to moral hazard is undetectable.

Conclusions

For high-cost care, policymakers should be more concerned about the foregone efficient spending for those lacking full insurance, rather than the wasteful spending that occurs with full insurance.

]]>
<![CDATA[Use of non-insulin diabetes medicines after insulin initiation: A retrospective cohort study]]> https://www.researchpad.co/article/5c6dc9a1d5eed0c484529f41

Background

Clinical guidelines recommend that metformin be continued after insulin is initiated among patients with type 2 diabetes, yet little is known regarding how often metformin or other non-insulin diabetes medications are continued in this setting.

Methods

We conducted a retrospective cohort study to characterize rates and use patterns of six classes of non-insulin diabetes medications: biguanides (metformin), sulfonylureas, thiazolidinediones (TZDs), glucagon-like peptide 1 receptor agonists (GLP1 receptor agonists), dipeptidyl peptidase 4 inhibitors (DPP4 inhibitors), and sodium-glucose co-transporter 2 inhibitors (SGLT2 inhibitors), among patients with type 2 diabetes initiating insulin. We used the 2010–2015 MarketScan Commercial Claims and Encounters data to examine 72,971 patients with type 2 diabetes aged 18–65 years who initiated insulin and had filled a prescription for a non-insulin diabetes medication in the 90 days prior to insulin initiation. Our primary outcome was the proportion of patients refilling the various non-insulin diabetes medications during the first 90 days after insulin initiation. We also used time-to-event analysis to characterize the time to discontinuation of specific medication classes.
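The time-to-discontinuation analysis can be illustrated with a Kaplan-Meier estimator. The sketch below is a minimal illustration rather than the authors' code; the file name and column names are assumptions, and it uses the lifelines library.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical per-patient follow-up for one drug class (e.g. metformin).
# Columns are assumptions: time from insulin initiation to discontinuation
# (months) and an event flag (1 = discontinued, 0 = censored at end of follow-up).
df = pd.read_csv("metformin_followup.csv")

kmf = KaplanMeierFitter()
kmf.fit(df["months_to_discontinuation"], event_observed=df["discontinued"])

# Median time to discontinuation, comparable to the class-specific
# medians reported in the Results.
print(kmf.median_survival_time_)
```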

Results

Metformin was the most common non-insulin medication used prior to insulin initiation (N = 53,017, 72.7%), followed by sulfonylureas (N = 25,439, 34.9%) and DPP4 inhibitors (N = 8,540, 11.7%). More than four out of five patients (N = 65,902, 84.7%) refilled prescriptions for any non-insulin diabetes medications within 90 days after insulin initiation. Within that period, metformin remained the most common medication with the highest continuation rate of 84.6%, followed by SGLT2 inhibitors (81.9%) and TZDs (79.3%). Sulfonylureas were the least likely medications to be continued (73.6% continuation), though they remained the second most common medication class used after insulin initiation. The median time to discontinuation varied by therapeutic class, from 26.4 months among metformin users (the longest) to 3.0 months among SGLT2 inhibitor users (the shortest).

Conclusion

While metformin was commonly continued among commercially insured adults starting insulin, rates of continuation of other non-insulin diabetes medications were also high. Further studies are needed to determine the comparative effectiveness and safety of continuing insulin secretagogues and newer diabetes medications after insulin initiation.

]]>
<![CDATA[Role of insurance in determining utilization of healthcare and financial risk protection in India]]> https://www.researchpad.co/article/5c633943d5eed0c484ae6374

Background

Universal health coverage has become a policy goal in most developing economies. We assess the association of health insurance (HI) schemes in general, and RSBY (National Health Insurance Scheme) in particular, with the extent and pattern of healthcare utilization. Secondly, we assess the relationship of HI and RSBY with out-of-pocket (OOP) expenditures and financial risk protection (FRP).

Methods

A cross-sectional study was undertaken to interview 62,335 individuals from 12,134 households in 8 districts of three states in India, i.e. Gujarat, Haryana and Uttar Pradesh (UP). Data were collected on socio-demographic characteristics, assets, education, occupation, consumption expenditure, illness in the last 15 days or hospitalization during the last 365 days, treatment sought, and its OOP expenditure. We computed catastrophic health expenditure (CHE) as an indicator of FRP. Hospitalization rate, choice of care provider and CHE were regressed to assess their association with insurance status and type of insurance scheme, after adjusting for other covariates.
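As a rough illustration of how a CHE indicator can be computed from survey data, the sketch below flags households whose OOP health spending exceeds a fixed share of consumption expenditure. The threshold, file name, and column names are assumptions; the study's exact definition may differ.

```python
import pandas as pd

# Hypothetical household-level extract; column names are assumptions.
hh = pd.read_csv("household_survey.csv")  # hh_id, annual_consumption, oop_health_spend

# One common operational definition: OOP health spending above a fixed
# share of total household consumption expenditure (10% assumed here).
CHE_THRESHOLD = 0.10

hh["oop_share"] = hh["oop_health_spend"] / hh["annual_consumption"]
hh["che"] = hh["oop_share"] > CHE_THRESHOLD

print(f"CHE prevalence: {hh['che'].mean():.1%} of households")
```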

Results

Mean OOP expenditures for outpatient care among the insured and uninsured were INR 961 (USD 16) and INR 840 (USD 14), respectively, and INR 32,573 (USD 543) and INR 24,788 (USD 413) for an episode of hospitalization. The prevalence of CHE for hospitalization was 28% and 26% among the insured and uninsured populations, respectively. In multivariate analysis, no significant association was observed between hospitalization rate, choice of care provider or CHE and insurance status, or RSBY in particular.

Conclusion

Health insurance in its present form does not seem to provide the requisite improvement in access to care or financial risk protection.

]]>
<![CDATA[A conditional model predicting the 10-year annual extra mortality risk compared to the general population: a large population-based study in Dutch breast cancer patients]]> https://www.researchpad.co/article/5c536c65d5eed0c484a49ea5

Objective

Many cancer survivors face difficulties obtaining life insurance; raised premiums and declinatures are common. We generated a prediction model estimating the conditional extra mortality risk of breast cancer patients in the Netherlands. This model can be used by life insurers to accurately estimate the additional risk of an individual patient, conditional on the years survived.

Methodology

All women diagnosed with stage I-III breast cancer in 2005–2006, treated with surgery, were selected from the Netherlands Cancer Registry. For all stages separately, multivariable logistic regression was used to estimate annual mortality risks, conditional on the years survived, until 10 years after diagnosis, resulting in 30 models. The conditional extra mortality risk was calculated by subtracting mortality rates of the general Dutch population from the patient mortality rates, matched by age, gender and year. The final model was internally and externally validated, and tested by life insurers.
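To illustrate how the conditional extra mortality risk is constructed, the sketch below fits one of the stage- and year-specific logistic models and subtracts matched general-population mortality. It is a schematic under assumed file and column names, not the published model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data for one stage and one follow-up year k: patients alive
# at the start of year k, with death during that year as the outcome.
pat = pd.read_csv("stage1_year_k.csv")  # died_in_year, age, sex, year, grade, her2, ...

# Annual mortality risk conditional on having survived k years,
# from a multivariable logistic regression (one model per stage and year).
fit = smf.logit("died_in_year ~ age + C(grade) + C(her2)", data=pat).fit()
pat["patient_risk"] = fit.predict(pat)

# General-population mortality matched on age, sex and calendar year
# (file and column names are assumptions).
pop = pd.read_csv("population_mortality.csv")  # age, sex, year, pop_rate
pat = pat.merge(pop, on=["age", "sex", "year"], how="left")

# Conditional extra mortality risk = patient risk minus matched population risk.
pat["extra_mortality"] = pat["patient_risk"] - pat["pop_rate"]
```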

Results

We included 23,234 patients: 10,101 stage I, 9,868 stage II and 3,265 stage III. The final models included age, tumor stage, nodal stage, lateralization, location within the breast, grade, multifocality, hormonal receptor status, HER2 status, type of surgery, axillary lymph node dissection, radiotherapy, (neo)adjuvant systemic therapy and targeted therapy. All models showed good calibration and discrimination. Testing of the model by life insurers showed that insurability using the newly developed model increased by 13%, ranging from 0% to 24% among subgroups.

Conclusion

The final model provides accurate conditional extra mortality risks of breast cancer patients, which can be used by life insurers to make more reliable calculations. The model is expected to increase breast cancer patients’ insurability and transparency among life insurers.

]]>
<![CDATA[Readmission risk and costs of firearm injuries in the United States, 2010-2015]]> https://www.researchpad.co/article/5c536c37d5eed0c484a49bcc

Background

In 2015 there were 36,252 firearm-related deaths and 84,997 nonfatal injuries in the United States. The longitudinal burden of these injuries through readmissions is currently underestimated. We aimed to determine the 6-month readmission risk and hospital costs for patients injured by firearms.

Methods

We used the Nationwide Readmission Database 2010–2015 to assess the frequency of readmissions at 6 months, and hospital costs associated with readmissions for patients with firearm-related injuries. We produced nationally representative estimates of readmission risks and costs.

Results

Of patients discharged following a firearm injury, 15.6% were readmitted within 6 months. The average annual cost of inpatient hospitalizations for firearm injury was over $911 million, 9.5% of which was due to readmissions. Medicare and Medicaid covered 45.2% of total costs for the 5 years, and uninsured patients were responsible for 20.1%.

Conclusions

From 2010–2015, the average total cost of hospitalization for firearm injuries per patient was $32,700, almost 10% of which was due to readmissions within 6 months. Government insurance programs and the uninsured shouldered most of this.

]]>
<![CDATA[Survey of suspected dysphagia prevalence in home-dwelling older people using the 10-Item Eating Assessment Tool (EAT-10)]]> https://www.researchpad.co/article/5c5217d3d5eed0c48479462b

Objective

This study was carried out to determine the prevalence of suspected dysphagia and its features in both independent and dependent older people living at home.

Materials and methods

The 10-Item Eating Assessment Tool (EAT-10) questionnaire was sent to 1,000 independent older people and 2,000 dependent older people living at home in a municipal district of Tokyo, Japan. The participants were selected by stratified randomization according to age and care level. We set the cut-off value of EAT-10 at a score of ≥3. The percentage of participants with an EAT-10 score ≥3 was defined as the prevalence of suspected dysphagia. The chi-square test was used for analyzing prevalence in each group. Analysis of the distribution of EAT-10 scores, and comparisons among items, age groups, and care levels to identify symptom features were performed using the Kruskal-Wallis test and Mann-Whitney U test.

Results

Valid responses were received from 510 independent older people aged 65 years or older (mean age 75.0 ± 7.2) and 886 dependent older people (mean age 82.3 ± 6.7). The prevalences of suspected dysphagia were 25.1% and 53.8%, respectively, and showed significant increases with advancing age and care level. In both groups, many older people assigned high scores to the item about coughing, whereas individuals requiring high-level care assigned higher scores to the items about not only coughing but also swallowing of solids and quality of life.

Conclusion

In independent people, approximately one in four individuals showed suspected dysphagia and coughing was the most perceivable symptom. In dependent people, approximately one in two individuals showed suspected dysphagia and their specifically perceivable symptoms were coughing, difficulties in swallowing solids and psychological burden.

]]>
<![CDATA[Can diabetes patients seeking a second hospital get better care? Results from nested case–control study]]> https://www.researchpad.co/article/5c50c495d5eed0c4845e8986

This study investigates the effect of the number of medical institutions visited on the risk of death. It used a nested case-control design based on the National Health Insurance Service–Senior database from 2002 to 2013. Cases were defined as deaths among outpatients with a first diagnosis of diabetes mellitus (E10–E14) after entry into the base cohort, and controls were selected by incidence density sampling and matched to cases on age and sex. The main results were obtained by conditional logistic regression for the nested case-control design. Of the 55,558 subjects in the final study sample, 9,313 (16.8%) were cases and 46,245 (83.2%) were controls. With each one-unit increase in the number of hospitals per medical utilization, the risk of death increased significantly by 4.1% (odds ratio (OR): 1.041, 95% confidence interval [CI]: 1.039–1.043). The odds of death were significantly higher among those with high medical utilization (OR: 1.065, 95% CI: 1.059–1.070) and a high number of hospitals (OR: 1.049, 95% CI: 1.041–1.058) than among those with low medical utilization (OR: 1.040, 95% CI: 1.037–1.043) and a low number of hospitals (OR: 1.029, 95% CI: 1.027–1.032), respectively. The number of medical institutions visited was significantly associated with the risk of death. Therefore, patients with diabetes should be warned about the potential risk of death associated with excessive use of multiple medical institutions.
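A minimal sketch of the conditional logistic regression used for such a matched nested case-control analysis is shown below. The file and column names are assumptions, and it relies on statsmodels' ConditionalLogit (available in recent versions); exponentiated coefficients correspond to the odds ratios quoted above.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical analysis file: one row per subject; each case shares a
# matched-set identifier with its age/sex-matched controls.
df = pd.read_csv("nested_case_control.csv")  # set_id, case (0/1), n_hospitals, utilization

# Conditional logistic regression stratified on the matched set, as used
# for incidence-density-sampled nested case-control designs.
model = ConditionalLogit(
    df["case"],
    df[["n_hospitals", "utilization"]],
    groups=df["set_id"],
)
result = model.fit()

# exp(coef) is the odds ratio per one-unit increase in each covariate.
print(np.exp(result.params))
```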

]]>
<![CDATA[Attachment strength and on-farm die-off rate of Escherichia coli on watermelon surfaces]]> https://www.researchpad.co/article/5c3e4f8fd5eed0c484d76beb

Pre-harvest contamination of produce has been a major food safety focus. Insight into the behavior of enteric pathogens on produce in pre-harvest conditions will aid in developing pre-harvest and post-harvest risk management strategies. In this study, the attachment strength (SR) and die-off rate of E. coli on the surface of watermelon fruits and the efficacy of aqueous chlorine treatment against strongly attached E. coli population were investigated. Watermelon seedlings were transplanted into eighteen plots. Prior to harvesting, a cocktail of generic E. coli (ATCC 23716, 25922 and 11775) was inoculated on the surface of the watermelon fruits (n = 162) and the attachment strength (SR) values and the daily die-off rates were examined up to 6 days by attachment assay. After 120 h, watermelon samples were treated with aqueous chlorine (150 ppm free chlorine for 3 min). The SR value of the E. coli cells on watermelon surfaces significantly increased (P<0.05) from 0.04 to 0.99 in the first 24 h, which was primarily due to the decrease in loosely attached population, given that the population of strongly attached cells was constant. Thereafter, there was no significant change in SR values, up to 120 h. The daily die-off rate of E. coli ranged from -0.12 to 1.3 log CFU/cm2. The chlorine treatment reduced the E. coli level by 4.2 log CFU/cm2 (initial level 5.6 log CFU/cm2) and 0.62 log CFU/cm2 (initial level 1.8 log CFU/cm2), on the watermelons that had an attachment time of 30 min and 120 h respectively. Overall, our findings revealed that the population of E. coli on watermelon surfaces declined over time in an agricultural environment. Microbial contamination during pre-harvest stages may promote the formation of strongly attached cells on the produce surfaces, which could influence the efficacy of post-harvest washing and sanitation techniques.

]]>
<![CDATA[Real-world management of patients with epidermal growth factor receptor (EGFR) mutation-positive non–small-cell lung cancer in the USA]]> https://www.researchpad.co/article/5c390bfcd5eed0c48491f430

Background

Randomized phase III trials have established the efficacy of epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors as first-line treatment for EGFR mutation-positive advanced non–small-cell lung cancer (EGFR Mut+ NSCLC). This retrospective cohort study examined the management patterns and outcomes of patients with EGFR Mut+ NSCLC in a real-world setting.

Materials and methods

Data were extracted from the US Flatiron Electronic Health Record-derived database. Adult patients with stage IIIB/IV EGFR Mut+ NSCLC (exon 19 deletion or exon 21 L858R mutation) who had received first-line systemic therapy between 2011 and 2016 were included. Demographic and clinical characteristics were analyzed. Outcomes evaluated were time to next treatment (a surrogate for progression-free survival) and overall survival.

Results

Of the 22,258 patients with advanced NSCLC in the database, 961 met the inclusion criteria. Median age was 69.0 years (range: 61–78) and the majority were female (68.0%), with stage IV (93.9%), non-squamous cell carcinoma (97.4%). EGFR tyrosine kinase inhibitors were the most widely prescribed first-line therapy (72.8%). The likelihood of receiving an EGFR tyrosine kinase inhibitor or chemotherapy was unaffected by the type of medical insurance patients had. Patients treated with an EGFR tyrosine kinase inhibitor had significantly longer time to next treatment than those given other first-line systemic therapies (p < 0.0001). There were no significant differences in overall survival according to treatment type.

Conclusion

Results from this large US cohort study reflect those obtained in randomized trials of patients with advanced EGFR Mut+ NSCLC and demonstrate their transferability into a real-world setting.

]]>
<![CDATA[Predictive value of a nomogram for hepatocellular carcinoma with brain metastasis at initial diagnosis: A population-based study]]> https://www.researchpad.co/article/5c3667bbd5eed0c4841a6228

Background

Population-based estimates of the incidence and prognosis of brain metastases at diagnosis of hepatocellular carcinoma (HCC) are lacking. The aim of this study was to characterize the incidence proportion and survival of newly diagnosed hepatocellular carcinoma with brain metastases (HCCBM).

Materials and methods

Data from the Surveillance, Epidemiology, and End Results (SEER) program between 2010 and 2014 were evaluated. Patients with HCCBM were included. Multivariable logistic and Cox regression were performed to identify predictors of the presence of brain metastases at diagnosis and prognostic factors for overall survival (OS). We also built a nomogram based on the Cox model to predict prognosis for HCCBM patients.

Results

We identified 97 patients with brain metastases at the time of diagnosis of HCC, representing 0.33% of the entire cohort. Logistic regression showed that patients with bone or lung metastases had greater odds of having brain metastases at diagnosis. Median OS for HCCBM was 2.40 months. Cox regression revealed that unmarried patients and those with bone metastases had significantly shorter survival times. A nomogram was developed with an internal validation concordance index of 0.639.

Conclusions

This study provided population-based estimates of the incidence and prognosis for HCCBM patients. The nomogram could be a convenient individualized predictive tool for prognosis.

]]>
<![CDATA[The Impact of Adding a Physician Assistant to a Critical Care Outreach Team]]> https://www.researchpad.co/article/5989da5fab0ee8fa60b90ae1

Rationale

Hospitals are increasingly using critical care outreach teams (CCOTs) to respond to patients deteriorating outside intensive care units (ICUs). CCOT staffing is variable across hospitals and optimal team composition is unknown.

Objectives

To assess whether adding a critical care medicine trained physician assistant (CCM-PA) to a critical care outreach team (CCOT) impacts clinical and process outcomes.

Methods

We performed a retrospective study of two cohorts—one with a CCM-PA added to the CCOT (intervention hospital) and one with no staffing change (control hospital)—at two facilities in the same system. All adults in the emergency department and hospital for whom CCOT consultation was requested from October 1, 2012, to March 16, 2013 (pre-intervention) and from January 5 to March 31, 2014 (post-intervention) were included. We performed difference-in-differences analyses comparing the pre- and post-intervention periods in the intervention versus control hospitals to assess the impact of adding the CCM-PA to the CCOT.
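A difference-in-differences estimate of this kind can be obtained from a regression with a hospital-by-period interaction term. The sketch below is illustrative only; the outcome, covariates, file name, and column names are assumptions rather than the authors' specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level consults from both hospitals and both periods.
df = pd.read_csv("ccot_consults.csv")  # hours_to_icu_transfer, intervention_hosp (0/1), post_period (0/1), age, comorbidity_count

# The interaction coefficient is the difference-in-differences estimate:
# the pre-to-post change at the intervention hospital beyond the change
# observed at the control hospital.
did = smf.ols(
    "hours_to_icu_transfer ~ intervention_hosp * post_period + age + comorbidity_count",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(did.params["intervention_hosp:post_period"])
```

Modelling the log of time-to-transfer instead would express the interaction as an approximate percentage change, closer to how the reduction is reported in the results below.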

Measurements and Main Results

Our cohort consisted of 3,099 patients (control hospital: 792 pre- and 595 post-intervention; intervention hospital: 1,114 pre- and 839 post-intervention). Intervention hospital patients tended to be younger, with fewer comorbidities, but with similar severity of acute illness. Across both periods, hospital mortality (p = 0.26) and hospital length of stay (p = 0.64) for the intervention vs control hospitals were similar, but time-to-transfer to the ICU was longer for the intervention hospital (13.3–17.0 vs 11.5–11.6 hours, p = 0.006). Using the difference-in-differences approach, we found a 19.2% reduction (95% confidence interval: 6.7%–31.6%, p = 0.002) in the time-to-transfer to the ICU associated with adding the CCM-PA to the CCOT; we found no difference in hospital mortality (p = 0.20) or length of stay (p = 0.52).

Conclusions

Adding a CCM-PA to the CCOT was associated with a notable reduction in time-to-transfer to the ICU; hospital mortality and length of stay were not impacted.

]]>
<![CDATA[Qualities of Life of Patients with Psychotic Disorders and Their Family Caregivers: Comparison between Hospitalised and Community-Based Treatment in Beijing, China]]> https://www.researchpad.co/article/5989da12ab0ee8fa60b79d53

Background

Community healthcare in mainland China is still at an early stage. The qualities of life (QOLs) of patients with psychotic disorders undergoing rehabilitation in hospitals or in the community, as well as those of their caregivers, may differ from each other.

Objectives

The study was performed to evaluate the QOL of patients with psychotic disorders and assess the differences in the QOLs between patients receiving care in diverse settings (hospital vs. the community).

Methods

This was a descriptive study in which all cases were collected from two psychiatric hospitals and five communities. Patients (n = 43) and caregivers (n = 40) in the psychiatric hospitals were grouped according to length of illness and area of residence, and these criteria were also used to group patients (n = 55) and caregivers (n = 59) in the community. All participants were assessed using the WHOQOL-BREF (Chinese version). ANOVA was used to compare QOL scores among the four groups (cases and caregivers in the two settings), adjusting for confounding factors such as age and marital status.

Results

Among the four groups of participants, namely, hospitalised and community patients and their corresponding caregivers, community samples had a significantly lower QOL score. The QOL score for the social relationships domain of the hospitalised patients’ caregivers was significantly higher than that of the caregivers of community patients (P = 0.019).

Conclusion

Community patients and their caregivers tend to have lower QOL scores than their hospitalised counterparts. The support of family members is urgently needed to provide better care for patients.

]]>
<![CDATA[Social, structural, behavioral and clinical factors influencing retention in Pre-Exposure Prophylaxis (PrEP) care in Mississippi]]> https://www.researchpad.co/article/5989db4fab0ee8fa60bdba25

Pre-exposure prophylaxis (PrEP) is a biomedical intervention that can reduce rates of HIV transmission when taken once daily by HIV-negative individuals. Little is understood about PrEP uptake and retention in care among the populations most heavily impacted by the HIV epidemic, particularly among young men who have sex with men (YMSM) in the Deep South. Therefore, this study explored the structural, social, behavioral, and clinical factors that affect PrEP use and retention in care among YMSM in Jackson, Mississippi. Thirty MSM who were prescribed PrEP at an outpatient primary care clinic were interviewed and included 23 men who had been retained in PrEP care and seven who had not been retained. The mean age of participants was 26.6 years. Most (23) participants were African American. Major factors affecting PrEP use and retention in PrEP care included 1) structural factors such as cost and access to financial assistance for medications and clinical services; 2) social factors such as stigma and relationship status; 3) behavioral factors including sexual risk behaviors; and 4) clinical factors such as perceived and actual side effects. Many participants also discussed the positive spillover effects of PrEP use and reported that PrEP had a positive impact on their health. Four of the seven individuals who had not been retained re-enrolled in PrEP care after completing their interviews, suggesting that case management and ongoing outreach can enhance retention in PrEP care. Interventions to enhance retention in PrEP care among MSM in the Deep South will be most effective if they address the complex structural, social, clinical, and behavioral factors that influence PrEP uptake and retention in PrEP care.

]]>
<![CDATA[Patient Satisfaction with Hospital Inpatient Care: Effects of Trust, Medical Insurance and Perceived Quality of Care]]> https://www.researchpad.co/article/5989da16ab0ee8fa60b7b73a

Objective

Deteriorations in the patient-provider relationship in China have attracted increasing attention in the international community. This study aims to explore the role of trust in patient satisfaction with hospital inpatient care, and how patient-provider trust is shaped from the perspectives of both patients and providers.

Methods

We adopted a mixed methods approach. A multivariate logistic regression model, using secondary data (1,200 people with inpatient experiences over the past year) from the fifth National Health Service Survey (NHSS, 2013) in Heilongjiang Province, was used to determine the associations between patient satisfaction and trust, financial burden and perceived quality of care. This was followed by in-depth interviews with 62 conveniently selected key informants (27 from the health sector and 35 from non-health sectors). A thematic analysis established a conceptual framework to explain deteriorating patient-provider relationships.

Findings

About 24% of respondents reported being dissatisfied with hospital inpatient care. The logistic regression model indicated that patient satisfaction was positively associated with a higher level of trust (OR = 14.995), lower levels of hospital medical expenditure (OR = 5.736–1.829 as compared with the highest quintile of hospital expenditure), good staff attitude (OR = 3.155) and a good ward environment (OR = 2.361). However, patient satisfaction was negatively associated with medical insurance for urban residents and other insurance status (OR = 0.215–0.357 as compared with medical insurance for urban employees). The qualitative analysis showed that patient trust—the most significant predictor of patient satisfaction—is shaped by perceived high quality of service delivery, empathic and caring interpersonal interactions, and better designed medical insurance that provides stronger financial protection and enables more equitable access to health care.

Conclusion

At the core of high levels of patient dissatisfaction with hospital care is the lack of trust. The current health care system reform in China has yet to address the fundamental problems embedded in the system that caused distrust. A singular focus on doctor-patient inter-personal interactions will not offer a successful solution to the deteriorated patient-provider relationships unless a systems approach to accountability is put into place involving all stakeholders.

]]>
<![CDATA[An Analysis of the Number of Medical Malpractice Claims and Their Amounts]]> https://www.researchpad.co/article/5989d9fbab0ee8fa60b72130

Starting from an extensive database, pooling 9 years of data from the top three insurance brokers in Italy and containing 38,125 reported claims due to alleged cases of medical malpractice, we use an inhomogeneous Poisson process to model the number of medical malpractice claims in Italy. The intensity of the process is allowed to vary over time, and it depends on a set of covariates, such as the size of the hospital, the medical department and the complexity of the medical operations performed. We choose the combination of medical department and hospital as the unit of analysis. Together with the number of claims, we also model the associated amounts paid by insurance companies, using a two-stage regression model. In particular, we use logistic regression for the probability that a claim is closed with a zero payment, whereas, conditional on the amount being strictly positive, we use lognormal regression to model it as a function of several covariates. The model produces estimates and forecasts that are relevant to both insurance companies and hospitals, for quality assurance, service improvement and cost reduction.
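The two-stage amount model described above (logistic regression for zero payment, lognormal regression for positive amounts) can be sketched as follows. This is an illustrative reconstruction under assumed file and column names, not the authors' code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical claim-level data; column names are assumptions.
claims = pd.read_csv("malpractice_claims.csv")  # paid_amount, hospital_size, department, complexity

claims["zero_paid"] = (claims["paid_amount"] == 0).astype(int)

# Stage 1: probability that a claim closes with a zero payment.
stage1 = smf.logit("zero_paid ~ hospital_size + C(department) + complexity", data=claims).fit()

# Stage 2: lognormal regression on strictly positive amounts,
# i.e. OLS on the log of the paid amount.
positive = claims[claims["paid_amount"] > 0].copy()
positive["log_paid"] = np.log(positive["paid_amount"])
stage2 = smf.ols("log_paid ~ hospital_size + C(department) + complexity", data=positive).fit()

# Expected payment combines the stages:
# E[amount] = P(amount > 0) * E[amount | amount > 0],
# using the lognormal mean correction exp(mu + sigma^2 / 2).
p_positive = 1 - stage1.predict(claims)
claims["expected_amount"] = p_positive * np.exp(stage2.predict(claims) + stage2.mse_resid / 2)
```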

]]>
<![CDATA[The Association between Regional Environmental Factors and Road Trauma Rates: A Geospatial Analysis of 10 Years of Road Traffic Crashes in British Columbia, Canada]]> https://www.researchpad.co/article/5989dadfab0ee8fa60bbb497

Background

British Columbia, Canada is a geographically large jurisdiction with varied environmental and socio-cultural contexts. This cross-sectional study examined variation in motor vehicle crash rates across 100 police patrols to investigate the association of crashes with key explanatory factors.

Methods

Eleven crash outcomes (total crashes, injury crashes, fatal crashes, speed related fatal crashes, total fatalities, single-vehicle night-time crashes, rear-end collisions, and collisions involving heavy vehicles, pedestrians, cyclists, or motorcyclists) were identified from police collision reports and insurance claims and mapped to police patrols. Six potential explanatory factors (intensity of traffic law enforcement, speed limits, climate, remoteness, socio-economic factors, and alcohol consumption) were also mapped to police patrols. We then studied the association between crashes and explanatory factors using negative binomial models with crash count per patrol as the response variable and explanatory factors as covariates.
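A negative binomial model of this form, with patrol-level crash counts and a population offset so that covariates act on per capita rates, might look like the sketch below. The file and column names are assumptions, and the dispersion parameter is fixed rather than estimated for simplicity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical patrol-level data; column names are assumptions.
patrols = pd.read_csv("patrol_crashes.csv")  # total_crashes, population, pct_low_speed_roads, winter_temp, ses_index, citations_per_1000

# Negative binomial GLM for overdispersed crash counts; log(population)
# enters as an exposure offset so coefficients describe per capita rates.
model = smf.glm(
    "total_crashes ~ pct_low_speed_roads + winter_temp + ses_index + citations_per_1000",
    data=patrols,
    family=sm.families.NegativeBinomial(alpha=1.0),  # dispersion fixed for simplicity
    offset=np.log(patrols["population"]),
)
result = model.fit()
print(result.summary())
```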

Results

Between 2003 and 2012 there were 1,434,239 insurance claim collisions, 386,326 police-reported crashes, and 3,404 fatal crashes. Across police patrols, there was marked variation in per capita crash rate and in potential explanatory factors. Several factors were associated with crash rates. The percentage of roads with speed limits ≤ 60 km/h was positively associated with total crashes, injury crashes, rear-end collisions, and collisions involving pedestrians, cyclists, and heavy vehicles; and negatively associated with single vehicle night-time crashes, fatal crashes, fatal speeding crashes, and total fatalities. Higher winter temperature was associated with lower rates of overall collisions, single vehicle night-time collisions, collisions involving heavy vehicles, and total fatalities. Lower socio-economic status was associated with higher rates of injury collisions, pedestrian collisions, fatal speeding collisions, and fatal collisions. Regions with dedicated traffic officers had fewer fatal crashes and fewer fatal speed-related crashes but more rear-end crashes and more crashes involving cyclists or pedestrians. The number of traffic citations per 1,000 drivers was positively associated with total crashes, fatal crashes, total fatalities, fatal speeding crashes, injury crashes, single vehicle night-time crashes, and heavy vehicle crashes. Possible explanations for these associations are discussed.

Conclusions

There is wide variation in per capita rates of motor vehicle crashes across BC police patrols. Some variation is explained by factors such as climate, road type, remoteness, socioeconomic variables, and enforcement intensity. The ability of explanatory factors to predict crash rates would be improved if they were considered together with local traffic volume across all travel modes.

]]>
<![CDATA[Worry experienced during the 2015 Middle East Respiratory Syndrome (MERS) pandemic in Korea]]> https://www.researchpad.co/article/5989db50ab0ee8fa60bdbe61

Background

Korea failed in its risk communication during the early stage of the Middle East Respiratory Syndrome (MERS) outbreak; consequently, it faced difficulties in managing MERS, while disease-related worry increased. Disease-related worry can help disease prevention and management, but can also have a detrimental effect. This study measured the overall level of disease-related worry during the MERS outbreak period in Korea and the influencing factors and levels of disease-related worry during key outbreak periods.

Methods

The cross-sectional survey included 1,000 adults who resided in Korea. An ordinal logistic regression was performed for the overall level of MERS-related worry, and influencing factors of worry were analyzed. A reliability test was performed on the levels of MERS-related worry during key outbreak periods.
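An ordinal logistic (proportional odds) model of worry level can be sketched with statsmodels' OrderedModel (available in recent versions). The file and column names below are assumptions, and the covariates are only examples drawn from the abstract.

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical respondent-level data; column names are assumptions.
df = pd.read_csv("mers_worry_survey.csv")  # worry_level (ordered), female (0/1), poor_health (0/1), high_stress (0/1)

# The outcome must be an ordered categorical variable.
df["worry_level"] = pd.Categorical(df["worry_level"], ordered=True)

# Proportional-odds (logit) model for the ordered worry outcome.
model = OrderedModel(
    df["worry_level"],
    df[["female", "poor_health", "high_stress"]],
    distr="logit",
)
result = model.fit(method="bfgs")
print(result.summary())
```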

Results

The overall level of MERS-related worry was 2.44. Multivariate analysis revealed that women and respondents with very poor subjective health status had higher levels of worry. Respondents with very high stress in daily life had higher levels of worry than those who reported having little stress. The reliability test of MERS-related worry scores during key outbreak periods showed consistent scores in each period.

Conclusion

The level of worry was higher among those with higher perceived susceptibility and greater trust in informal information, and worry during the initial stage of the outbreak was closely associated with worry at later stages. These findings suggest the importance of managing the level of worry by providing timely and accurate disease-related information during the initial stage of a disease outbreak.

]]>