ResearchPad - hemorrhage https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Is transjugular insertion of a temporary pacemaker a safe and effective approach?]]> https://www.researchpad.co/article/elastic_article_13814 Temporary pacemakers (TPMs) are usually inserted in an emergency situation. However, few reports are available regarding which access route is best or which approach is currently preferred in tertiary hospitals. This study aimed to compare procedure times, complication rates, and indications for temporary pacing between the transjugular and transfemoral approaches to TPM placement. We analyzed consecutive patients who underwent TPM placement. Indications, procedure times, and rates of complications, including localized infection, bleeding, and pacing wire repositioning, were analyzed. A total of 732 patients (361 treated via the transjugular approach and 371 treated via the transfemoral approach) were included. Complete atrioventricular block was the most common cause of TPM placement in both groups, but sick sinus syndrome was especially common in the transjugular approach group. Procedure time was significantly shorter in the transjugular approach group (9.0 ± 8.0 minutes vs. 11.9 ± 9.7 minutes; P < 0.001). Overall complication rates were not significantly different between the two groups, and longer duration of temporary pacing was a risk factor for repositioning. The risk of repositioning increased significantly when temporary pacing was continued for more than 5 days in the transjugular approach group and for more than 3 days in the transfemoral approach group. The transjugular approach should be considered if the TPM is required for more than 3 days.

]]>
<![CDATA[Incidence and determinants of Implanon discontinuation: Findings from a prospective cohort study in three health zones in Kinshasa, DRC]]> https://www.researchpad.co/article/elastic_article_7634 Kinshasa is Africa's third largest city and one of the continent’s most rapidly growing urban areas. PMA2020 data showed that Kinshasa had a modern contraceptive prevalence of 26.5% among married women in 2018. In Kinshasa’s method mix, the contraceptive implant recently became the dominant method among contraceptive users who are married or in union. This study provides insight into patterns of implant use in a high-fertility setting by evaluating the 24-month continuation rate for Implanon NXT and identifying the characteristics associated with discontinuation.

Methodology

This community-based, prospective cohort study followed 531 Implanon users aged 18–49 years at 6, 12 and 24 months. The following information was collected: socio-demographic characteristics, Method Information Index (MII) and contraceptive history. The main outcome variable for this study was implant discontinuation. The incidence rate of discontinuation is presented as events per 1000 person-months (p-m), from the date of enrolment. Cox proportional hazards modelling was used to measure predictors of discontinuation.

Results

A total of 9158.13 p-m were available for analysis, with an overall incidence rate of 9.06 (95% CI: 9.04–9.08) removals per 1000 p-m. Of nine possible co-variates tested, the likelihood of discontinuation was higher among women who lived in military camps, had fewer than three children, had never used injectables or implants in the past, had experienced heavy/prolonged bleeding, or whose MII score was less than 3.

Conclusion

In addition to four client characteristics that predicted discontinuation, we identified one programmatic factor: quality of counseling as measured by the Method Information Index.
Community providers in similar contexts should pay more attention to clients who have fewer than three children, to new adopters, and to clients living in military camps as an underserved population with less access to health facilities. More targeted counselling and follow-up are needed, especially on bleeding patterns. ]]>
<![CDATA[Would you like to participate in this trial? The practice of informed consent in intrapartum research in the last 30 years]]> https://www.researchpad.co/article/Na45ec8a9-d35b-4ecd-a654-0f10371697fd

Background

Informed consent is the cornerstone of the ethical conduct and protection of the rights and wellbeing of participants in clinical research. Therefore, it is important to identify the most appropriate moments for the participants to be informed and to give consent, so that they are able to make a responsible and autonomous decision. However, the optimal timing of consent in clinical research during the intrapartum period remains controversial, and currently, there is no clear guidance.

Objective

We aimed to describe practices of informed consent in intrapartum care clinical research in the last three decades, as reported in uterotonics for postpartum haemorrhage prevention trials.

Methods

This is a secondary analysis of the studies included in the Cochrane review entitled “Uterotonic agents for preventing postpartum haemorrhage: a network meta-analysis” published in 2018. All the reports included in the Cochrane network meta-analysis were eligible for inclusion in this analysis, except for those reported in languages other than English, French or Spanish. We extracted and synthesized data on the time each of the components of the informed consent process occurred.

Results

We assessed data from 192 of the 196 studies included in the Cochrane review. The majority of studies (59.9%, 115 studies) reported that women were informed about the study, without specifying the timing. When reported, most studies informed women at admission to the facility for childbirth. Most of the studies reported that consent was sought, but only 59.9% reported the timing, which in most cases was at admission for childbirth. Among these, 32 studies obtained consent in the active phase of labour, 17 in the latent phase, and in 10 studies the labour status was unknown. Women were consented antenatally in 6 studies, and in 8 studies consent was obtained either during antenatal care or at admission. Most of the studies did not specify who sought the informed consent.

Conclusion

Practices of informed consent in trials on use of uterotonics for prevention of postpartum haemorrhage showed variability and substandard reporting. Informed consent sought at admission for childbirth was the most frequent approach implemented in these trials.

]]>
<![CDATA[Prevalence of drug–drug interaction in atrial fibrillation patients based on a large claims data]]> https://www.researchpad.co/article/N8861b373-c994-4a77-acd1-a0d16f2f19bb

This study aimed to determine and compare the prevalence of drug–drug interactions (DDIs) and the bleeding rate in atrial fibrillation (AF) patients receiving anticoagulants in a clinical setting. We used large claims data of AF patients obtained from the Japan Medical Data Center. We surveyed the prevalence of DDIs, and the cases leading to bleeding events, among clinically relevant DDIs extracted from: 1) DDIs reported in ≥4 patients in a spontaneous adverse event reporting system (Japanese Adverse Drug Events Report system; JADER); 2) DDIs cited in the package inserts of each anticoagulant (each combination assessed according to the “Drug interaction 2015” list); and 3) DDIs between warfarin and quinolone antibiotics. In the analysis of each patient's prescriptions obtained from the claims data, DDIs were categorized by mechanism as pharmacokinetic (modulation of the blood concentration of anticoagulants by cytochrome P450 (CYP) enzymes, transporters, etc.), pharmacodynamic (combinations with similar pharmacological actions), or both. AF patients with and without bleeding after administration of anticoagulants were compared. Bleeding was observed in 220/3290 (6.7%) AF patients. The bleeding rate in patients with both pharmacokinetic and pharmacodynamic DDI mechanisms (26.3%) was higher than that in patients with either mechanism alone (8.6% and 9.2%, respectively) or without DDIs (4.9%). The odds ratio for bleeding in AF patients with both pharmacokinetic and pharmacodynamic mechanisms was 7.18 (4.69–11.00, p<0.001). Our study concluded that DDIs based on multiple mechanisms lead to more serious outcomes than DDIs based on a single mechanism in AF patients. We determined the prevalence and frequency of bleeding for anticoagulant-related DDIs. To manage DDIs, patients with both pharmacokinetic and pharmacodynamic DDI mechanisms should be closely monitored for initial symptoms of bleeding within the first 3 months.
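As a rough check, an unadjusted odds ratio can be recomputed from the group-level bleeding proportions reported above. This is an illustrative sketch only: the 7.18 reported in the abstract is presumably a modelled estimate, so the crude value differs slightly.

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

# Bleeding proportions reported in the abstract
p_both_mechanisms = 0.263  # pharmacokinetic + pharmacodynamic DDI
p_no_ddi = 0.049           # no DDI

crude_or = odds(p_both_mechanisms) / odds(p_no_ddi)
print(round(crude_or, 1))  # ~6.9, close to the reported 7.18
```

The small gap between the crude ~6.9 and the reported 7.18 is consistent with the reported figure coming from a model rather than the raw 2×2 proportions.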

]]>
<![CDATA[An anatomical study on lumbar arteries related to the extrapedicular approach applied during lumbar PVP (PKP)]]> https://www.researchpad.co/article/5c8823e7d5eed0c48463929f

To observe the regional anatomy of the lumbar artery (LA) associated with the extrapedicular approach applied during percutaneous vertebroplasty (PVP) and percutaneous kyphoplasty (PKP), we collected 78 samples of abdominal computed tomography angiography imaging data. We measured the nearest distance from the center of the vertebral body puncture point to the LA (distance VBPP-LA, DVBPP-LA). According to the DVBPP-LA, four zones, Zone I, Zone II, Zone III and Zone IV, were identified. LAs that passed through these zones were called Type I, Type II, Type III and Type IV LAs, respectively. A portion of the lumbar vertebrae had an intersegmental branch that originated from the upper segmental LA and extended longitudinally across the lateral wall of the pedicle; this was called a Type V LA. Comparing the DVBPP-LA among L1, L2, L3 and L4, the overall difference and between-group differences were significant (P < 0.05). In L1, L2, L3, L4 and L5, there were 8, 4, 4, 0 and 1 Type I LAs, respectively. There were no Type V LAs in L1 and L2, but there were 2, 16 and 26 Type V LAs in L3, L4 and L5, respectively. In L1-L5, the numbers of Type I plus Type V LAs were 8, 4, 6, 16 and 27, and the presence ratios were 5.1%, 2.6%, 5.6%, 10.3% and 17.3%, respectively. In L4 and L5, the male presence ratios of Type I plus Type V LAs were 7.1% and 10.7%, respectively, and the female presence ratios were 13.9% and 25.0%, respectively. Thus, extrapedicular PVP (PKP) in the lumbar vertebrae carries a risk of LA injury and is not suggested for use in L4 and L5, especially in female patients.

]]>
<![CDATA[Socioeconomic gap between neighborhoods of Budapest: Striking impact on stroke and possible explanations]]> https://www.researchpad.co/article/5c76fe19d5eed0c484e5b4a4

Introduction

Hungary has a single payer health insurance system offering free healthcare for acute cerebrovascular disorders. Within the capital, Budapest, however there are considerable microregional socioeconomic differences. We hypothesized that socioeconomic deprivation reflects in less favorable stroke characteristics despite universal access to care.

Methods

From the database of the National Health Insurance Fund, we identified 4779 patients hospitalized between 2002 and 2007 for acute cerebrovascular disease (hereafter ACV, i.e. ischemic stroke, intracerebral hemorrhage, or transient ischemia), among residents of the poorest (District 8, n = 2618) and the wealthiest (District 12, n = 2161) neighborhoods of Budapest. Follow-up was until March 2013.

Results

Mean age at onset of ACV was 70±12 and 74±12 years for Districts 8 and 12, respectively (p<0.01). Age-standardized incidence was higher in District 8 than in District 12 (680/100,000/year versus 518/100,000/year for ACV and 486/100,000/year versus 259/100,000/year for ischemic stroke). Age-standardized mortality of ACV overall and of ischemic stroke specifically was 157/100,000/year versus 100/100,000/year and 122/100,000/year versus 75/100,000/year for Districts 8 and 12, respectively. Long-term case fatality (at 1, 5, and 10 years) for ACV and for ischemic stroke was higher in younger District 8 residents (41–70 years of age at the index event) compared to District 12 residents of the same age. This gap between the districts increased with the length of follow-up. Among risk diseases, the prevalence of hypertension and diabetes was higher in District 8 than in District 12 (75% versus 66%, p<0.001, and 26% versus 16%, p<0.001, respectively).
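Age-standardized rates like those above are conventionally obtained by direct standardization: age-specific rates are averaged using fixed standard-population weights, so that districts with different age structures become comparable. A minimal sketch with hypothetical numbers (the abstract reports neither the age-specific rates nor the standard population used):

```python
def direct_standardized_rate(age_specific_rates, standard_weights):
    """Directly age-standardized rate: weighted average of age-specific
    rates using standard-population weights (weights sum to 1)."""
    return sum(age_specific_rates[band] * standard_weights[band]
               for band in age_specific_rates)

# Hypothetical age-specific incidence per 100,000 person-years
rates = {"41-50": 120, "51-60": 400, "61-70": 900, "71+": 2200}
# Hypothetical standard-population weights
weights = {"41-50": 0.4, "51-60": 0.3, "61-70": 0.2, "71+": 0.1}

print(direct_standardized_rate(rates, weights))  # 568.0 per 100,000/year
```

With the same weights applied to both districts, any remaining difference in the standardized rates reflects the age-specific rates themselves rather than the districts' age composition.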

Discussion

Despite universal healthcare coverage, the disadvantaged district has higher ACV incidence and mortality than the wealthier neighborhood. This difference affects primarily the younger age groups. Long-term follow-up data suggest that inequity in institutional rehabilitation and home-care should be investigated and improved in disadvantaged neighborhoods.

]]>
<![CDATA[Rivaroxaban administration after acute ischemic stroke: The RELAXED study]]> https://www.researchpad.co/article/5c6dca1cd5eed0c48452a7cf

The efficacy of early anticoagulation in acute stroke with nonvalvular atrial fibrillation (NVAF) remains unclear. We performed a study to evaluate the risk of recurrent ischemic stroke (IS) and major bleeding in acute IS patients with NVAF who started rivaroxaban. This observational study evaluated patients with NVAF and acute IS/transient ischemic attack (TIA) in the middle cerebral arterial territory who started rivaroxaban within 30 days after the index IS/TIA. The primary endpoints were recurrent IS and major bleeding within 90 days after the index IS/TIA. The relationship between the endpoints and the time to start rivaroxaban was evaluated by correlation analysis using cerebral infarct volume, determined by diffusion-weighted magnetic resonance images within 48 hours of onset of the index IS/TIA. Of 1309 patients analyzed, recurrent IS occurred in 30 (2.3%) and major bleeding in 11 (0.8%) patients. Among patients with known infarct size (N = 1207), those with small (<4.0 cm3), medium (≥4.0 and <22.5 cm3), and large (≥22.5 cm3) infarcts started rivaroxaban a median of 2.9, 2.9, and 5.8 days, respectively, after the index IS/TIA. Recurrent IS was significantly less frequent when rivaroxaban was started ≤14 days versus ≥15 days after IS (2.0% versus 6.8%, P = 0.0034). Incidences of recurrent IS and major bleeding in patients in whom rivaroxaban was started <3 days after IS (N = 584) were also low: 1.5% and 0.7%, respectively. Initiation of rivaroxaban administration in acute IS or TIA was associated with a low recurrence of IS (2.3%) and a low incidence of major bleeding events (0.8%) for 90 days after the index stroke. For the prevention of recurrent attacks in acute IS patients with NVAF, it is feasible to start the administration of rivaroxaban within 14 days of onset. Rivaroxaban started within 3 days of onset may be a feasible treatment option for patients with a small or medium-sized infarction.

]]>
<![CDATA["Not taken seriously"—A qualitative interview study of postpartum Rwandan women who have experienced pregnancy-related complications]]> https://www.researchpad.co/article/5c6dc9bdd5eed0c48452a10e

Background

There is limited knowledge on the women’s experiences of pregnancy-related complications in Rwanda. This study aimed to investigate women’s experiences and perceptions of specific complications during pregnancy and delivery and the consequences of these complications on postpartum health and family situation.

Methods

Data were collected through individual in-depth interviews (N = 15). Participants who experienced complications such as postpartum haemorrhage, caesarean section due to prolonged labour/dystocia, pre-eclampsia, or fistula and who were 13–24 months postpartum were invited to participate in the study in July 2015. Interviews were held in Kinyarwanda, digitally recorded, transcribed verbatim, translated into English, and analysed using qualitative content analysis.

Results

Most participants reported that they were previously unaware of the complications they had developed, and they claimed that at discharge they should have been better informed about the potential consequences of these complications. Most participants blamed the health care system as the cause of their problems due to the provision of inadequate care. Participants elaborated different strategies for coping with persistent health problems. Pregnancy-related complications negatively affected participants’ economic situation due to increased health care expenses and lowered income because of impaired working capacity, and participants expressed fear of encountering the same pregnancy-related health problems during future pregnancies.

Conclusions

The findings of this study demonstrate how participants felt that inadequate health care provision during pregnancy, delivery, and the postpartum period was the source of their problems. Participants reported different coping strategies to improve their respective life situation despite persistent health problems. Women’s individual postpartum experiences need to be considered and actions taken at the policy level and also by the local community, in terms of the quality of antenatal and postpartum care services, and in sensitizing the local community about the existence of these complications and preparing the community to support the affected women.

]]>
<![CDATA[Quality of INR control and switching to non-Vitamin K oral anticoagulants between women and men with atrial fibrillation treated with Vitamin K Antagonists in Spain. A population-based, real-world study]]> https://www.researchpad.co/article/5c6c75bad5eed0c4843d0092

Background

Worldwide, there is growing evidence that the quality of international normalized ratio (INR) control in atrial fibrillation patients treated with Vitamin K Antagonists (VKA) is suboptimal. However, sex disparities in population-based real-world settings have been scarcely studied, as have patterns of switching to second-line non-VKA oral anticoagulants (NOAC). We aimed to assess the quality of INR control in atrial fibrillation patients treated with VKA in the region of Valencia, Spain, for the whole population and by sex, and to identify factors associated with poor control. We also quantified switching to NOAC and identified factors associated with switching.

Methods

This is a cross-sectional, population-based study. Information was obtained through linking different regional electronic databases. Outcome measures were Time in Therapeutic Range (TTR) and percentage of INR determinations in range (PINRR) in 2015, and percentage of switching to NOAC in 2016, for the whole population and stratified by sex.
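TTR is conventionally computed with the Rosendaal linear-interpolation method, which assumes the INR changes linearly between consecutive measurements and sums the fraction of time spent inside the therapeutic range. The abstract does not state which algorithm was used, so this is an assumption; a minimal sketch for a 2.0–3.0 target range:

```python
def ttr_rosendaal(days, inrs, low=2.0, high=3.0):
    """Time in Therapeutic Range by Rosendaal linear interpolation:
    INR is assumed to change linearly between consecutive measurements,
    and the fraction of time inside [low, high] is accumulated."""
    in_range = total = 0.0
    for (d1, v1), (d2, v2) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        dt = d2 - d1
        total += dt
        if v1 == v2:
            in_range += dt if low <= v1 <= high else 0.0
            continue
        # Fractions of the interval at which INR crosses the range limits
        t_lo = (low - v1) / (v2 - v1)
        t_hi = (high - v1) / (v2 - v1)
        a, b = sorted((t_lo, t_hi))
        overlap = min(1.0, b) - max(0.0, a)
        in_range += dt * max(0.0, overlap)
    return in_range / total

# Example: three INR measurements on days 0, 10 and 20
print(round(ttr_rosendaal([0, 10, 20], [1.8, 2.5, 3.5]) * 100, 1))  # 60.7
```

PINRR, by contrast, is simply the share of individual INR determinations falling inside the range, which is why the abstract reports the two measures separately.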

Results

We included 22,629 patients, of whom 50.4% were women. Mean TTR was 62.3% for women and 63.7% for men, and PINRR was 58.3% for women and 60.1% for men (p<0.001). Considering the TTR<65% threshold, 53% of women and 49.3% of men had poor anticoagulation control (p<0.001). Women, long-term antiplatelet users, and patients with comorbidities, Emergency Department visits, or alcohol use were more likely to present poor INR control. Of the patients poorly controlled during 2015, 5.4% switched to a NOAC throughout 2016, with no sex differences.

Conclusion

The quality of INR control of all AF patients treated with VKA in 2015 in our Southern European region was suboptimal, and women were at higher risk of poor INR control. This reflects sex disparities in care, and programs for improving the quality of oral anticoagulation should incorporate the gender perspective. Clinical inertia may lie behind the observed low rates of switching in patients with poor INR control.

]]>
<![CDATA[Monitoring quality of obstetric care from hospital discharge databases: A Delphi survey to propose a new set of indicators based on maternal health outcomes]]> https://www.researchpad.co/article/5c6c7589d5eed0c4843cfe6b

Objectives

Most indicators proposed for assessing quality of care in obstetrics are process indicators and do not directly measure health effects, and cannot always be identified from routinely available databases. Our objective was to propose a set of indicators to assess the quality of hospital obstetric care from maternal morbidity outcomes identifiable in permanent hospital discharge databases.

Methods

Various maternal morbidity outcomes potentially reflecting quality of obstetric care were first selected from a systematic literature review. Then a three-round Delphi consensus survey was conducted online from 11/2016 through 02/2017 among a French panel of 37 expert obstetricians, anesthetists-critical-care specialists, midwives, quality-of-care researchers, and user representatives. For a given maternal outcome, several definitions could be proposed and the indicator (i.e. corresponding rate) could be applied to all women or restricted to specific subgroup(s).

Results

Of the 49 experts invited to participate, 37 agreed. The response rate was 92% in the second round and 97% in the third. Finally, a set of 13 indicators was selected to assess the quality of hospital obstetric care: rates of uterine rupture, postpartum hemorrhage, transfusion incident, severe perineal lacerations, episiotomy, cesarean, cesarean under general anesthesia, post-cesarean site infection, anesthesia-related complications, postpartum pulmonary embolism, maternal readmission and maternal mortality. Six were considered in specific subgroups, with, for example, the postpartum hemorrhage rate assessed among all women and also among women at low risk of PPH.

Implications

This Delphi process enabled us to define, by consensus, a set of indicators to assess the quality of hospital obstetric care from routine hospital data, based on maternal morbidity outcomes. Assessing 6 of them in specific subgroups of women is especially informative. These indicators, identifiable through codes used in international classifications, will be useful to monitor quality of care over time and across settings.

]]>
<![CDATA[Risk factors for small bowel bleeding in an overt gastrointestinal bleeding presentation after negative upper and lower endoscopy]]> https://www.researchpad.co/article/5c76fe6dd5eed0c484e5ba31

Introduction

A small bowel source is suspected when evaluation of overt gastrointestinal (GI) bleeding with upper and lower endoscopy is negative. Video capsule endoscopy (VCE) is the recommended next diagnostic test for small bowel bleeding sources. However, clinical or endoscopic predictive factors for small bowel bleeding in the setting of an overt bleeding presentation are unknown. We aimed to define predictive factors for positive VCE among individuals presenting with overt bleeding and a suspected small bowel source.

Methods

We included consecutive inpatient VCE performed between September 1, 2012 and September 1, 2015 for melena or hematochezia at two tertiary centers. All patients had EGD and colonoscopy performed prior to VCE. Patient demographics, medication use, and endoscopic findings were retrospectively recorded. VCE findings were graded based on the P0-P2 grading system. The primary outcome of interest was a positive (P2) VCE. The secondary outcome of interest was the performance of a therapeutic intervention. Data were analyzed with the Fisher exact test for dichotomous variables and logistic regression.

Results

Two hundred forty-three VCE were reviewed, and 117 were included in the final analysis. A positive VCE (P2) was identified in 35 (29.9%) cases. In univariate analysis, a positive VCE was inversely associated with presence of diverticula on preceding colonoscopy (OR: 0.44, 95% CI: 0.2–0.99), while identification of blood on terminal ileal examination was associated with a positive VCE (OR: 5.18, 95% CI: 1.51–17.76). In multivariate analysis, only blood identified on terminal ileal examination remained a significant risk factor for positive VCE (OR: 6.13, 95% CI: 1.57–23.81). Blood on terminal ileal examination was also predictive of therapeutic intervention in both univariate (OR: 4.46, 95% CI: 1.3–15.2) and multivariate analysis (OR: 5.04, 95% CI: 1.25–20.32).

Conclusion

Among patients presenting with overt bleeding but negative upper and lower endoscopy, the presence of blood on examination of the terminal ileum is strongly associated with a small bowel bleeding source as well as with small bowel therapeutic intervention. Presence of diverticula on colonoscopy is inversely associated with a positive VCE and therapeutic intervention in univariate analysis.

]]>
<![CDATA[Evaluation of hemostasis in patients with end-stage renal disease]]> https://www.researchpad.co/article/5c76fe68d5eed0c484e5b9fa

An increased bleeding risk is reported for patients with end-stage renal disease. This study aims to analyze whether bleeding risk can be assessed by global tests of hemostasis. Standard laboratory tests and an extended evaluation of hemostasis by rotational thromboelastometry, platelet function analyzer (PFA) and multiple electrode aggregometry, as well as thrombin generation assays and measurement of fibrinolytic potential, were performed in 20 patients on hemodialysis, 10 patients on peritoneal dialysis, 10 patients with chronic kidney disease stage G5 (CKD5) and in 10 healthy controls (HC). Hemoglobin was significantly lower in patients with end-stage renal disease versus HC (each p<0.01). Patients on peritoneal dialysis showed increased fibrinogen levels compared to HC (p<0.01), which were also reflected by FIBTEM results (each p<0.05). 41% of hemodialysis patients and 44% of CKD5 patients presented with a prolonged PFA-ADP test (p<0.05), while no patient on peritoneal dialysis and no HC showed this abnormality. Thrombin generating potential was significantly lower in patients on hemodialysis, while clot lysis time revealed a hypofibrinolytic state in patients on hemo- and peritoneal dialysis compared to HC (p<0.001). In conclusion, patients with end-stage renal disease have complex hemostatic changes with both hyper- and hypocoagulable features, which depend on the use and type of dialysis. Hypercoagulable features include elevated fibrinogen levels and a hypofibrinolytic state, whereas hypocoagulable features include decreased thrombin generating capacity and platelet dysfunction. Our results may contribute to a more rational approach to hemostatic management in these patients.

]]>
<![CDATA[Development of a risk score for predicting the benefit versus harm of extending dual antiplatelet therapy beyond 6 months following percutaneous coronary intervention for stable coronary artery disease]]> https://www.researchpad.co/article/5c6f1498d5eed0c48467a3af

Background

Decisions on dual antiplatelet therapy (DAPT) duration should balance the opposing risks of ischaemia and bleeding. Our aim was to develop a risk score to identify stable coronary artery disease (SCAD) patients undergoing PCI who would benefit or suffer from extending DAPT beyond 6 months.

Methods

Retrospective analysis of a cohort of patients who completed 6 months of DAPT following PCI. Predictors of ischaemic and bleeding events for the 6–12 month period post-PCI were identified and a risk score was developed to estimate the likelihood of benefiting from extending DAPT beyond 6 months. Incidence of mortality, ischaemic and bleeding events for patients treated with DAPT for 6 vs. 6–12 months, was compared, stratified by strata of the risk score.

Results

The study included 2,699 patients. Over 6 months’ follow up, there were 78 (2.9%) ischaemic and 43 (1.6%) bleeding events. Four variables (heart failure, left ventricular ejection fraction ≤30%, left main or three vessel CAD, status post (s/p) PCI and s/p stroke) predicted ischemic events, two variables (age>75, haemoglobin <10 g/dL) predicted bleeding. In the lower stratum of the risk score, 6–12 months of treatment with DAPT resulted in increased bleeding (p = 0.045) with no decrease in ischaemic events. In the upper stratum, 6–12 months DAPT was associated with reduced ischaemic events (p = 0.029), with no increase in bleeding.

Conclusion

In a population of SCAD patients who completed 6 months of DAPT, a risk score for subsequent ischaemic and bleeding events identified patients likely to benefit from continuing or stopping DAPT.

]]>
<![CDATA[Combining liver stiffness with hyaluronic acid provides superior prognostic performance in chronic hepatitis C]]> https://www.researchpad.co/article/5c6b26b4d5eed0c484289ee1

Background

Non-invasive methods are the first choice for liver fibrosis evaluation in chronic liver diseases, but few studies investigate the ability of combined methods to predict outcomes.

Methods

591 chronic hepatitis C patients with baseline liver stiffness (LSM) by FibroScan and hyaluronic acid measurements were identified retrospectively. The patients were grouped by baseline LSM: < 10kPa, 10–16.9kPa, and 17-75kPa. Primary outcomes were all-cause mortality and liver-related mortality, analyzed using cox regression and competing risk regression models, respectively.

Results

Median follow-up was 46.1 months. Prevalence of cirrhosis at baseline was 107/591 (18.1%). Median LSM was 6.8kPa (IQR 5.3–11.6) and divided into groups, 404/591 (68.4%) had a LSM < 10kPa, 100/591 (16.9%) had a LSM between 10–16.9kPa and 87/591 (14.7%) had a LSM between 17-75kPa. There were 69 deaths, 27 from liver-related disease. 26 patients developed cirrhosis and 30 developed complications of cirrhosis. The mortality rate in the 17-75kPa group was 9.7/100 person-years, compared to 2.2/100 person-years and 1.1/100 person-years in the 10–16.9kPa and <10kPa groups (p<0.005). Liver-related mortality increased 10-fold for each group (p<0.005). Cirrhotic complications occurred almost exclusively in the 17-75kPa group, with an incidence of 10.3/100 person-years, compared to 1.8/100 person-years and 0.2/100 person-years in the 10–16.9kPa and <10kPa groups (p<0.005). Median hyaluronic acid in the 17-75kPa group was approximately 200ng/mL. Patients with a LSM 17-75kPa had significantly higher risks of death, liver-related death, and complications to cirrhosis if their hyaluronic acid measurement was more than or equal to 200ng/mL at baseline, with hazard ratios of 3.25 (95% CI 1.48–7.25), 7.7 (95% CI 2.32–28), and 3.2 (95% CI 1.35–7.39), respectively.

Conclusions

The combination of LSM and circulating hyaluronic acid measurements significantly improved prognostic ability, relative to LSM alone. Combined static and dynamic markers of liver fibrosis could provide superior risk prediction.

]]>
<![CDATA[Lapachol and synthetic derivatives: in vitro and in vivo activities against Bothrops snake venoms]]> https://www.researchpad.co/article/5c58d640d5eed0c4840319be

Background

It is known that local tissue injuries caused by snakebites develop quickly, producing extensive, irreversible tissue destruction that may include loss of limb function or even amputation. Such injuries are not completely neutralized by the available antivenins, which in general are focused on halting systemic effects. Therefore, it is prudent to investigate the potential antiophidic effects of natural and synthetic compounds, perhaps combining them with serum therapy, to potentially attenuate or eliminate the adverse local and systemic effects of snake venom. This study assessed a group of quinones that are widely distributed in nature and constitute an important class of natural products exhibiting a range of biological activities. Of these quinones, lapachol is one of the most important compounds, having been first isolated in 1882 from the bark of Tabebuia avellanedae.

Methodology/Principal findings

We investigated the ability of lapachol and some new potentially active analogues based on the 2-hydroxy-naphthoquinone scaffold to antagonize important activities of Bothrops venoms (Bothrops atrox and Bothrops jararaca) under different experimental protocols in vitro and in vivo. The bioassays used to test the compounds were: procoagulant, phospholipase A2, collagenase and proteolytic activities in vitro, and venom-induced hemorrhagic, edematogenic, and myotoxic effects in mice. Proteolytic and collagenase activities of Bothrops atrox venom were inhibited by lapachol and its analogues 3a, 3b, 3c and 3e. The inhibition of these enzymatic activities might help to explain the effects of analogue 3a in vivo, which decreased skin hemorrhage induced by Bothrops venom. Lapachol and the synthetic analogues 3a and 3b did not inhibit the myotoxic activity induced by Bothrops atrox venom. The lack of protection by these compounds against myotoxicity can be partially explained by their inability to effectively inhibit the phospholipase A2 activity of the venom. Bothrops atrox venom also induced edema, which was significantly reduced by analogue 3a.

Conclusions

This research on a natural quinone and some related synthetic quinone compounds has shown that they exhibit antivenom activity, especially compound 3a. The data for 3a showed a decrease in inflammatory venom effects, presumably those that are metalloproteinase-derived. Its ability to counteract such snake venom activities contributes to the search for improved management of venomous snakebites.

]]>
<![CDATA[A simplified flow cytometric method for detection of inherited platelet disorders—A comparison to the gold standard light transmission aggregometry]]> https://www.researchpad.co/article/5c52184bd5eed0c484797a80

Background

Flow cytometric platelet activation has emerged as an alternative diagnostic test for inherited platelet disorders. It is, however, labor intensive, and few studies have directly compared the performance of flow cytometric platelet activation (PACT) to light transmission aggregometry (LTA). The aims of this study were (1) to develop a simplified flow cytometric platelet activation assay using microtiter plates and (2) to correlate the outcome to the gold standard method, LTA, and to clinical bleeding assessment tool scores (BAT score).

Methods

The PACT method was developed in microtiter plates using adenosine diphosphate (ADP), cross-linked collagen-related peptide (CRP-XL) and thrombin receptor activator peptide 6 (TRAP-6) as agonists. Antibodies against the GPIIb-IIIa activation epitope (PAC1), P-selectin (CD62P) and lysosome-associated membrane glycoprotein 3 (LAMP3; CD63) were used as platelet activation markers. Sixty-six patients referred to the coagulation unit for bleeding symptoms were included in this single-center observational study. Platelet activation was determined by PACT and LTA. The results of both methods were correlated to BAT score.

Results

A two-by-two analysis using Cohen's kappa showed moderate agreement between LTA and PACT (82%, kappa = 0.57) when PACT analysis with ADP and CRP-XL was compared to LTA. Using LTA as the reference method, the positive predictive value was 70% and the negative predictive value was 87%. A substantial number of patients had a high BAT score but normal LTA and PACT results. Patients with abnormal LTA or PACT results had higher BAT scores than patients with normal results, but the difference was not significant.
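The agreement statistics above all derive from a single two-by-two table. A minimal sketch, using hypothetical cell counts chosen only to be consistent with the reported summary figures (the abstract does not publish the actual counts):

```python
# Hypothetical 2x2 table (PACT vs. LTA, with LTA as reference), chosen to
# match the reported figures: n = 66, 82% agreement, kappa = 0.57,
# PPV = 70%, NPV = 87%. The real cell counts are not given in the abstract.
tp, fp, fn, tn = 14, 6, 6, 40
n = tp + fp + fn + tn

p_o = (tp + tn) / n                  # observed agreement
# chance agreement expected from the marginal totals of each method
p_e = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (p_o - p_e) / (1 - p_e)      # Cohen's kappa

ppv = tp / (tp + fp)                 # positive predictive value
npv = tn / (tn + fn)                 # negative predictive value

print(f"agreement={p_o:.0%}, kappa={kappa:.2f}, PPV={ppv:.0%}, NPV={npv:.0%}")
```

With these illustrative counts the computation reproduces the published agreement, kappa, PPV and NPV.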

Conclusions

Performing the assay in microtiter plates simplified the PACT method and enabled analysis of more patients at the same time. Our results indicate that with modification of the current PACT assay, a higher negative predictive value can be obtained. Furthermore, with results comparable to LTA, PACT could be used as a screening assay for inherited platelet disorders.

]]>
<![CDATA[The usefulness of wire-guided endoscopic snare papillectomy for tumors of the major duodenal papilla]]> https://www.researchpad.co/article/5c5217dfd5eed0c484794a8a

Objectives

Although endoscopic papillectomy is useful for treating papillary tumors, it is associated with a high rate of complications including pancreatitis; therefore, safer treatment options are needed. We examined the utility of wire-guided endoscopic papillectomy by comparing the pancreatic duct stenting and pancreatitis rates before and after wire-guided endoscopic papillectomy was introduced at our institution.

Methods

We retrospectively examined the data from 16 consecutive patients who underwent conventional endoscopic papillectomy between November 1995 and July 2005 and the data from 33 patients in whom wire-guided endoscopic papillectomy was first attempted at our institution between August 2005 and April 2017. We compared the pancreatic duct stenting and pancreatitis rates between the two groups.

Results

Of the 33 patients in whom wire-guided endoscopic papillectomy was attempted, the procedure was completed in 21. Pancreatic duct stenting was possible in 30 of the 33 patients (91%), a rate significantly higher than that before the introduction of wire-guided endoscopic papillectomy (68.8%). The incidence of pancreatitis before the introduction of wire-guided endoscopic papillectomy was 12.5%; after August 2005 it was halved to 6.1%, including those patients in whom wire-guided endoscopic papillectomy could not be completed.

Conclusions

Although wire-guided endoscopic papillectomy cannot be completed in some patients, we believe that this method shows some potential for reducing the total incidence of post-endoscopic papillectomy pancreatitis owing to more successful pancreatic duct stenting.

]]>
<![CDATA[Validation of the extended thrombolysis in cerebral infarction score in a real world cohort]]> https://www.researchpad.co/article/5c40f781d5eed0c4843862a0

Background

A thrombolysis in cerebral infarction (TICI) score of 2b is defined as a good recanalization result, although reperfusion may cover only 50% of the affected territory. An additional mTICI2c category was introduced to differentiate further between mTICI scores. Despite the new mTICI2c category, mTICI2b still covers a range of 50–90% reperfusion, which might be too imprecise to predict neurological improvement after therapy.

Aim

To compare the 7-point "expanded TICI" (eTICI) scale with the traditional mTICI scale with regard to predicting functional independence at 90 days.

Methods

Retrospective review of 225 patients with large artery occlusion. Angiograms were graded by 2 readers according to the 7-point eTICI score (0% = eTICI0; reduced clot = eTICI1; 1–49% = eTICI2a; 50–66% = eTICI2b50; 67–89% = eTICI2b67; 90–99% = eTICI2c; and complete reperfusion = eTICI3) and the conventional mTICI score. The ability of eTICI and mTICI to predict favorable outcome at 90 days was compared.
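The grading scheme above maps reperfusion percentages onto discrete categories; a minimal sketch of that mapping (the function name and the reduced_clot flag are illustrative conveniences, not part of the study protocol):

```python
def etici_grade(reperfusion_pct, reduced_clot=False):
    """Map percentage reperfusion to the 7-point eTICI scale described above."""
    if reperfusion_pct >= 100:
        return "eTICI3"    # complete reperfusion
    if reperfusion_pct >= 90:
        return "eTICI2c"   # 90-99%
    if reperfusion_pct >= 67:
        return "eTICI2b67" # 67-89%
    if reperfusion_pct >= 50:
        return "eTICI2b50" # 50-66%
    if reperfusion_pct >= 1:
        return "eTICI2a"   # 1-49%
    # no reperfusion: eTICI1 if the clot was reduced, otherwise eTICI0
    return "eTICI1" if reduced_clot else "eTICI0"
```

The key point the scale captures is that the former mTICI2b band (50–89%) is split into two categories (2b50 and 2b67), with 90–99% separated out as 2c.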

Results

In ROC analysis, eTICI was the better predictor of favorable outcome (p = 0.047). Additionally, the eTICI scores 2b50, 2b67 and 2c (former mTICI2b) were significantly superior at predicting the probability of a favorable outcome at 90 days after endovascular therapy (p = 0.033; probabilities of 17% for eTICI2b50, 24% for eTICI2b67 and 54% for eTICI2c vs. 36% for mTICI2b).

Conclusions

The 7-point eTICI scale allows more accurate outcome prediction than the mTICI score because it refines the broad range of former mTICI2b results.

]]>
<![CDATA[Competing risks of major bleeding and thrombotic events with prasugrel-based dual antiplatelet therapy after stent implantation - An observational analysis from BASKET-PROVE II]]> https://www.researchpad.co/article/5c478c6dd5eed0c484bd2457

Background

Dual antiplatelet therapy (DAPT) prevents thrombotic events after coronary stent implantation but may induce bleeding, specifically in elderly patients. However, a competing risk analysis is lacking.

Objectives

To assess the determinants of major bleeding and the balance between the competing risks of major bleeding and thrombotic events during prasugrel-based DAPT after stent implantation.

Methods

Overall, 2,291 patients randomized to drug-eluting or bare-metal stents and treated with prasugrel 10 mg/day for 1 year were followed over 2 years for major bleeding (BARC 3/5) and thrombotic events (cardiac death, myocardial infarction, definite/probable stent thrombosis). The prasugrel dose was reduced to 5 mg in patients >75 years and/or <60 kg. Predictors of major bleeding and the competing risks of major bleeding and thrombotic events were assessed.

Results

Two-year rates of major bleeding and thrombotic events were 2.9% and 9.0%, respectively. The only independent predictor of major bleeding was age (hazard ratio per year increase 1.05 [1.02, 1.07], p<0.001). The relationship between major bleeding and age was non-linear, with the lowest hazard ratios at 57 years and an exponential increase only above 65 years. In contrast, the relationship between thrombotic events and age was linear, increasing continuously with older age. While the competing risk of thrombotic events was higher than that of major bleeding in younger patients, the two risks were similar in older patients. After discontinuation of prasugrel, bleeding events leveled off in all patients, while thrombotic events continued to increase.
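A per-year hazard ratio of 1.05 compounds multiplicatively with age, which a quick calculation makes concrete (illustrative arithmetic only, not part of the published analysis):

```python
import math

hr_per_year = 1.05                 # reported hazard ratio per year of age
hr_per_decade = hr_per_year ** 10  # bleeding hazard per decade of age difference
# number of years of age difference that doubles the bleeding hazard
years_to_double = math.log(2) / math.log(hr_per_year)

print(f"per decade: {hr_per_decade:.2f}, doubling: {years_to_double:.1f} years")
```

Under the (simplifying) assumption of a constant per-year effect, the hazard rises by roughly 63% per decade and doubles over about 14 years of age, though the abstract notes the true relationship is non-linear.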

Conclusions

In prasugrel-based DAPT, age is the strongest risk factor for major bleeding, increasing exponentially above 65 years. In younger patients, thrombotic events represent a higher risk than bleeding, while thrombotic and bleeding risks were similar in older patients. Important clinical implications relate to the prasugrel dose in the elderly, the duration of DAPT, and the competing risk balance, all of which necessitate individualized treatment decisions.

]]>
<![CDATA[Augmentation of curved tip of left-sided double-lumen tubes to reduce right bronchial misplacement: A randomized controlled trial]]> https://www.researchpad.co/article/5c478c5bd5eed0c484bd1d52

Background

During intubation with a blind technique, a left-sided double-lumen tube (DLT) can be misdirected into the right bronchus even though the curved tip of its bronchial lumen turns to the left. This right bronchial misplacement may be associated with the tip angle of the DLT. We therefore performed a randomized trial to test the hypothesis that a DLT with an acute tip angle enters the right bronchus less frequently than a tube with an obtuse tip angle.

Methods

We randomized surgical patients (n = 1427) receiving a polyvinyl chloride left-sided DLT. Before intubation, the curved tip was further bent to an angle of 135° and held with a stylet inside in the curved-tip group, but not in the control group. After the tip was inserted into the glottis under direct or video laryngoscopy, the stylet was removed and the DLT was advanced into the bronchus with its tip turning to the left. We recorded which bronchus was intubated, as well as the time and number of attempts required for intubation. After surgery, we assessed airway injury, sore throat, and hoarseness. The primary outcome was the incidence of right bronchial misplacement of the DLT.

Results

DLTs were misdirected into the right bronchus more frequently in the control group than in the curved-tip group: 57/715 (8.0%) vs. 17/712 (2.4%), risk ratio (95% CI) 3.3 (2.0–5.7), P < 0.001. The difference was significant for 32 Fr (P = 0.003), 35 Fr (P = 0.007), and 37 Fr (P = 0.012) DLTs. Intubation took longer (P < 0.001) and required more attempts (P = 0.002) in the control group. No differences were found in postoperative airway injury, sore throat, or hoarseness.
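The reported risk ratio and its confidence interval follow directly from the raw counts. A minimal sketch using the event counts given above; the authors' exact interval method is not stated, but the standard log-transform (Katz) interval reproduces the reported values:

```python
import math

a, n1 = 57, 715   # right bronchial misplacements / patients, control group
c, n2 = 17, 712   # right bronchial misplacements / patients, curved-tip group

rr = (a / n1) / (c / n2)                        # risk ratio
# standard error of log(RR) for the Katz log-transform interval
se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)  # lower 95% limit
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)  # upper 95% limit

print(f"RR = {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

This yields RR = 3.3 with a 95% CI of 2.0–5.7, matching the figures in the abstract.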

Conclusions

Before intubation with a left-sided DLT, augmentation of the curved tip reduced right bronchial misplacement and facilitated intubation without aggravating airway injury.

]]>