ResearchPad - clinical-research-design https://www.researchpad.co <![CDATA[Safety of tunneled central venous catheters in pediatric hematopoietic stem cell recipients with severe primary immunodeficiency diseases]]> https://www.researchpad.co/article/elastic_article_14693 Tunneled central venous catheters (TCVCs) provide prolonged intravenous access for pediatric patients with severe primary immunodeficiency disease (PID) undergoing hematopoietic stem cell transplantation (HSCT). However, little is known about the epidemiology and clinical significance of TCVC-related morbidity in this particular patient group. We conducted a retrospective analysis of patients with severe PID who received percutaneous, landmark-guided TCVC implantation prior to HSCT. We analyzed 92 consecutive TCVC implantations in 69 patients (median [interquartile range] age 3.0 [0–11] years) with severe combined immunodeficiency (n = 39, 42.4%), chronic granulomatous disease (n = 17, 18.4%), and other rare PID syndromes (n = 36, 39.2%). The median length of TCVC observation was 144.1 (85.5–194.6) days, with a total of 14,040 catheter days at risk (cdr). The overall rate of adverse events was 17.4% (n = 16) during catheter insertion and 25.0% during the catheter dwell period (n = 23; catheter risk [CR] per 1,000 cdr = 1.64). The most common complication was TCVC-related infection, with an overall prevalence of 9.8% (n = 9, CR = 0.64), followed by late dislocation (n = 6, 6.5%, CR = 0.43), early dislocation (n = 4, 4.3%), and catheter dysfunction (n = 4, 4.3%, CR = 0.28). TCVCs are safe in children with severe PID undergoing HSCT, with relatively low rates of TCVC-related infection.
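The per-1,000-catheter-day rates above follow directly from the event counts and the 14,040 days at risk. As an illustration only, a minimal Python sketch of the calculation (the exact Poisson confidence interval is our addition, not part of the original analysis):

    from scipy.stats import chi2

    def rate_per_1000(events, days_at_risk):
        """Event rate per 1,000 catheter days with an exact (Garwood) 95% CI."""
        rate = events / days_at_risk * 1000
        lower = chi2.ppf(0.025, 2 * events) / 2 / days_at_risk * 1000 if events else 0.0
        upper = chi2.ppf(0.975, 2 * (events + 1)) / 2 / days_at_risk * 1000
        return rate, (lower, upper)

    # Dwell-period complications reported above: 23 events over 14,040 days
    print(rate_per_1000(23, 14040))  # rate of about 1.64 per 1,000 catheter days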

]]>
<![CDATA[<i>In silico</i> analyses identify lncRNAs: WDFY3-AS2, BDNF-AS and AFAP1-AS1 as potential prognostic factors for patients with triple-negative breast tumors]]> https://www.researchpad.co/article/elastic_article_13870 Long non-coding RNAs (lncRNAs) are characterized as transcripts of 200 nucleotides or more that do not code for any protein, and several have been identified as differentially expressed in human malignancies, including breast cancer.

Methods

Here, we evaluated lncRNAs differentially expressed in triple-negative breast cancer (TNBC) from a cDNA microarray data set obtained in a previous study from our group. Using in silico analyses in combination with a review of the current literature, we identified three lncRNAs as potential prognostic factors for TNBC patients.

Results

We found that the expression of WDFY3-AS2, BDNF-AS, and AFAP1-AS1 was associated with poor survival in patients with TNBC. WDFY3-AS2 and BDNF-AS are lncRNAs known to play an important role in tumor suppression in different types of cancer, while AFAP1-AS1 exerts oncogenic activity.

Conclusion

Our findings provide evidence that WDFY3-AS2, BDNF-AS, and AFAP1-AS1 may be potential prognostic factors in TNBC. ]]> <![CDATA[Efficacy of adjuvant chemotherapy with S-1 in stage II oral squamous cell carcinoma patients: A comparative study using the propensity score matching method]]> https://www.researchpad.co/article/N83ad1f15-cdbb-4f4c-8d9c-388a45a97cce

It has been reported that 20% of early-stage oral squamous cell carcinoma (OSCC) patients treated with surgery alone (SA) may exhibit postoperative relapse within 2–3 years and have poor prognoses. We aimed to determine the safety of S-1 adjuvant chemotherapy and the potential differences in disease-free survival (DFS) between patients with T2N0 (stage II) OSCC treated with S-1 adjuvant therapy (S-1) and those treated with SA. This single-center retrospective cohort study was conducted at Kumamoto University between April 2004 and March 2012 and included 95 patients with stage II OSCC. The overall cohort (OC) and a propensity score-matched cohort (PSMC) were analyzed. In the OC, 71 and 24 patients received SA and S-1, respectively. The time to relapse (TTR), DFS, and overall survival were better in the S-1 group, but the differences were not significant. In the PSMC, 20 patients each received SA and S-1. The TTR was significantly improved in the S-1 group compared with the SA group, as was the DFS. S-1 adjuvant chemotherapy may be more effective than SA in early-stage OSCC.
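The abstract does not state how the propensity scores were estimated or matched; purely as an illustration, here is a minimal 1:1 nearest-neighbor match on a logistic-regression propensity score (covariate and column names are hypothetical):

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def match_1to1(df, treat_col, covariates):
        """Match each treated patient to the nearest control on the
        estimated propensity score (with replacement, for simplicity)."""
        model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
        df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])
        treated = df[df[treat_col] == 1]
        control = df[df[treat_col] == 0]
        nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
        _, idx = nn.kneighbors(treated[["pscore"]])
        return pd.concat([treated, control.iloc[idx.ravel()]])

    # Hypothetical usage: S-1 adjuvant therapy (1) versus surgery alone (0)
    # matched = match_1to1(cohort, "s1", ["age", "sex", "tumor_site"])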

]]>
<![CDATA[Ivermectin as an adjuvant to anti-epileptic treatment in persons with onchocerciasis-associated epilepsy: A randomized proof-of-concept clinical trial]]> https://www.researchpad.co/article/N2a703e18-6320-408f-bd4d-1f677396d877

Introduction

Recent findings from onchocerciasis-endemic foci suggest that increasing ivermectin coverage reduces epilepsy incidence, and anecdotal evidence suggests that ivermectin treatment reduces seizure frequency in persons with onchocerciasis-associated epilepsy (OAE). We conducted a randomized clinical trial to assess whether ivermectin treatment decreases seizure frequency.

Methods

A proof-of-concept randomized clinical trial was conducted in the Logo health zone of Ituri province, Democratic Republic of the Congo, to compare seizure frequencies in onchocerciasis-infected persons with epilepsy (PWE) randomized to one of two treatment arms: the anti-epileptic drug phenobarbital supplemented with ivermectin, versus phenobarbital alone. The primary endpoint was the probability of being seizure-free at month 4. A secondary endpoint was a >50% reduction in seizure frequency at month 4 compared with baseline. Both endpoints were analyzed using multiple logistic regression. In a longitudinal analysis, the probability of seizure freedom during the follow-up period was assessed for both treatment arms by fitting a logistic regression model using generalized estimating equations (GEE).
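A hedged sketch of the GEE analysis described above, using statsmodels (column names and the exchangeable working correlation are assumptions; the abstract specifies neither):

    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Long-format data: one row per participant-visit, with seizure_free (0/1),
    # arm (1 = phenobarbital + ivermectin, 0 = phenobarbital alone), and month.
    model = smf.gee(
        "seizure_free ~ arm + month",
        groups="participant_id",
        data=visits,  # hypothetical long-format DataFrame
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()
    print(result.summary())  # exponentiate result.params for odds ratios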

Results

Ninety PWE enrolled between October and November 2017 were eligible for analysis. A multiple logistic regression analysis showed a borderline association between ivermectin treatment and being seizure-free at month 4 (OR: 1.652, 95% CI 0.975–2.799; p = 0.062). There was no significant difference in the probability of experiencing >50% reduction of the seizure frequency at month 4 between the two treatment arms. Also, treatment with ivermectin did not significantly increase the odds of being seizure-free during the individual follow-up visits.

Conclusion

Whether ivermectin has added value in reducing seizure frequency in PWE treated with anti-epileptic drugs (AEDs) remains to be determined. A larger study in persons with OAE on a stable AED regimen, and in persons with recent-onset epilepsy, should be considered to further investigate the potential beneficial effect of ivermectin treatment.

Trial registration

Registration: www.clinicaltrials.gov; NCT03052998.

]]>
<![CDATA[Acute kidney injury and adverse renal events in patients receiving SGLT2-inhibitors: A systematic review and meta-analysis]]> https://www.researchpad.co/article/N25fa93a0-706a-4469-9476-c6f8ced4ff6a

Background

Sodium-glucose cotransporter-2 inhibitors (SGLT2is) represent a new class of oral hypoglycemic agents used in the treatment of type 2 diabetes mellitus. They have a beneficial effect on the progression of chronic kidney disease, but there is concern that they might cause acute kidney injury (AKI).

Methods and findings

We conducted a systematic review and meta-analysis of the effect of SGLT2is on renal adverse events (AEs) in randomized controlled trials and controlled observational studies. PubMed, EMBASE, Cochrane library, and ClinicalTrials.gov were searched without date restriction until 27 September 2019. Data extraction was performed using a standardized data form, and any discrepancies were resolved by consensus. One hundred and twelve randomized trials (n = 96,722) and 4 observational studies with 5 cohorts (n = 83,934) with a minimum follow-up of 12 weeks that provided information on at least 1 adverse renal outcome (AKI, combined renal AE, or hypovolemia-related events) were included. In 30 trials, 410 serious AEs due to AKI were reported. SGLT2is reduced the odds of suffering AKI by 36% (odds ratio [OR] 0.64 [95% confidence interval (CI) 0.53–0.78], p < 0.001). A total of 1,089 AKI events of any severity (AEs and serious AEs [SAEs]) were published in 41 trials (OR 0.75 [95% CI 0.66–0.84], p < 0.001). Empagliflozin, dapagliflozin, and canagliflozin had a comparable benefit on the SAE and AE rate. AEs related to hypovolemia were more commonly reported in SGLT2i-treated patients (OR 1.20 [95% CI 1.10–1.31], p < 0.001). In the observational studies, 777 AKI events were reported. The odds of suffering AKI were reduced in patients receiving SGLT2is (OR 0.40 [95% CI 0.33–0.48], p < 0.001). Limitations of this study are the reliance on nonadjudicated safety endpoints, discrepant inclusion criteria and baseline hypoglycemic therapy between studies, inconsistent definitions of renal AEs and hypovolemia, varying follow-up times in different studies, and a lack of information on the severity of AKI (stages I–III).
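As a simplified, illustrative sketch of how such pooled odds ratios are built (inverse-variance weighting of log odds ratios; the published analysis would also include a random-effects component, and the trial counts below are made up):

    import numpy as np

    def pooled_or(tables):
        """Fixed-effect inverse-variance pooling of 2x2 tables (a, b, c, d):
        a/b = AKI events/non-events on SGLT2i, c/d = the same on control."""
        log_ors, weights = [], []
        for a, b, c, d in tables:
            a, b, c, d = (x + 0.5 for x in (a, b, c, d))  # continuity correction
            log_ors.append(np.log(a * d / (b * c)))
            weights.append(1.0 / (1/a + 1/b + 1/c + 1/d))
        pooled = np.average(log_ors, weights=weights)
        se = np.sqrt(1.0 / np.sum(weights))
        return np.exp(pooled), np.exp([pooled - 1.96 * se, pooled + 1.96 * se])

    # Illustrative trial counts only:
    print(pooled_or([(5, 995, 9, 991), (12, 2988, 20, 2980)]))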

Conclusions

SGLT2is reduced the odds of suffering AKI, with and without hospitalization, in both randomized trials and the real-world setting, even though more hypovolemia-related AEs were reported.

]]>
<![CDATA[Prevalence of drug–drug interaction in atrial fibrillation patients based on a large claims data]]> https://www.researchpad.co/article/N8861b373-c994-4a77-acd1-a0d16f2f19bb

This study aimed to determine the prevalence of drug–drug interactions (DDIs) and the bleeding rate in atrial fibrillation (AF) patients receiving anticoagulants in a clinical setting. We used large claims data on AF patients obtained from the Japan Medical Data Center. We surveyed the prevalence of clinically relevant DDIs and the cases leading to bleeding events, where DDIs were extracted from 1) combinations reported in ≥4 patients in a spontaneous adverse event reporting system (the Japanese Adverse Drug Events Report system, JADER); 2) DDIs cited in the package inserts of each anticoagulant (each combination assessed according to the “Drug interaction 2015” list); and 3) DDIs between warfarin and quinolone antibiotics. In the analysis of each patient's prescriptions in the claims data, DDIs were categorized by mechanism as pharmacokinetic (cytochrome P450 (CYP) enzymes, transporters, and other factors that modulate the blood concentration of anticoagulants), pharmacodynamic (combinations with similar pharmacological actions), or both. AF patients with and without bleeding after anticoagulant administration were compared. Bleeding was observed in 220/3,290 (6.7%) AF patients. The bleeding rate in patients with DDIs involving both pharmacokinetic and pharmacodynamic mechanisms (26.3%) was higher than that in patients with either mechanism alone (8.6% and 9.2%, respectively) or without DDIs (4.9%). The odds ratio for bleeding in AF patients with both pharmacokinetic and pharmacodynamic DDIs was 7.18 (95% CI 4.69–11.00, p < 0.001). We conclude that DDIs based on multiple mechanisms lead to more serious outcomes than single-mechanism DDIs in AF patients. We determined the prevalence of anticoagulant-related DDIs and the associated frequency of bleeding. To manage DDIs, patients with both pharmacokinetic and pharmacodynamic DDI mechanisms should be closely monitored for initial symptoms of bleeding, particularly within the first 3 months.
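A toy sketch of the mechanism-based categorization described above; the interaction sets are hypothetical stand-ins for the JADER signals, package-insert DDIs, and warfarin–quinolone combinations actually used:

    # Hypothetical lookup sets; the real lists come from JADER, the package
    # inserts, and the "Drug interaction 2015" list cited above.
    PK_INTERACTIONS = {("warfarin", "miconazole"), ("warfarin", "levofloxacin")}
    PD_INTERACTIONS = {("warfarin", "aspirin"), ("rivaroxaban", "clopidogrel")}

    def classify_patient(anticoagulant, comedications):
        """Assign a patient's prescription to a DDI mechanism category; a
        patient with at least one pharmacokinetic and one pharmacodynamic
        interaction falls into the highest-risk 'both' category."""
        has_pk = any((anticoagulant, c) in PK_INTERACTIONS for c in comedications)
        has_pd = any((anticoagulant, c) in PD_INTERACTIONS for c in comedications)
        if has_pk and has_pd:
            return "both"
        return "pharmacokinetic" if has_pk else "pharmacodynamic" if has_pd else "none"

    print(classify_patient("warfarin", ["aspirin", "miconazole"]))  # both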

]]>
<![CDATA[Optimizing predictive performance of criminal recidivism models using registration data with binary and survival outcomes]]> https://www.researchpad.co/article/5c8c193ed5eed0c484b4d25f

In a recidivism prediction context, there is no consensus on which modeling strategy should be followed to obtain an optimal prediction model. In previous papers, a range of statistical and machine learning techniques were benchmarked on recidivism data with a binary outcome. However, two important tree ensemble methods, namely gradient boosting and random forests, were not extensively evaluated. In this paper, we further explore the modeling potential of these techniques in the binary-outcome criminal prediction context. Additionally, we explore the predictive potential of classical statistical and machine learning methods for censored time-to-event data. A range of manually specified statistical and (semi-)automatic machine learning models is fitted on Dutch recidivism data, both for the binary outcome case and the censored outcome case. To enhance the generalizability of the results, the same models are applied to two historical American data sets, the North Carolina prison data. For all data sets, (semi-)automatic modeling in the binary case seems to provide no improvement over an appropriately manually specified traditional statistical model. There is, however, evidence of slightly improved performance of gradient boosting on survival data. Results on the reconviction data from the two sources suggest that both statistical and machine learning methods should be tried when seeking an optimal model. Even if a flexible black-box model does not improve upon the predictions of a manually specified model, it can serve as a test of whether important interactions are missing or other misspecifications of the model are present, and can thus provide more confidence in the modeling process.
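For the binary-outcome benchmark, the comparison reduces to cross-validated discrimination of a manually specified model against a tree ensemble; a minimal sketch (data, feature names, and hyperparameters are placeholders, not the paper's setup):

    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # X: offender features (age, offense history, ...); y: reconviction (0/1)
    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "gradient boosting": GradientBoostingClassifier(n_estimators=300),
    }
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUC = {auc.mean():.3f}")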

]]>
<![CDATA[Single-center retrospective study of the effectiveness and toxicity of the oral iron chelating drugs deferiprone and deferasirox]]> https://www.researchpad.co/article/5c9900fdd5eed0c484b95e7f

Background

Iron overload, resulting from blood transfusions in patients with chronic anemias, has historically been controlled with regular deferoxamine, but its parenteral requirement encouraged studies of orally active agents, including deferasirox and deferiprone. Deferasirox, licensed by the US Food and Drug Administration in 2005 based upon the results of randomized controlled trials, is now first-line therapy worldwide. In contrast, early investigator-initiated trials of deferiprone were prematurely terminated after investigators raised safety concerns. The FDA declined market approval of deferiprone; years later, it licensed the drug as “last resort” therapy, to be prescribed only if first-line drugs had failed. We undertook to evaluate the long-term effectiveness and toxicities of deferiprone and deferasirox in one transfusion clinic.

Methods and findings

In an IRB-approved study, we retrospectively reviewed the electronic medical records of consented iron-loaded patients managed between 2009 and 2015 at the University Health Network (UHN), Toronto. We compared changes in liver and heart iron, adverse effects, and other outcomes in patients treated with deferiprone or deferasirox.

Results

Although deferiprone was unlicensed in Canada, one-third (n = 41) of locally transfused patients had been switched from first-line, licensed therapies (deferoxamine or deferasirox) to regimens of unlicensed deferiprone. The primary endpoint of monitoring in iron overload, hepatic iron concentration (HIC), increased (worsened) during deferiprone monotherapy (from a mean of 10 ± 2 to 18 ± 2 mg/g; p < 0.0003), exceeding the threshold for life-threatening complications (15 mg iron/g liver) in 50% of patients. During deferasirox monotherapy, mean HIC decreased (improved) (from 11 ± 1 to 6 ± 1 mg/g; p < 0.0001). Follow-up HICs were significantly different following deferiprone and deferasirox monotherapies (p < 0.0000002). Addition of low-dose deferoxamine (<40 mg/kg/day) to deferiprone did not result in reductions of HIC to <15 mg/g (baseline 20 ± 4 mg/g; follow-up 18 ± 4 mg/g; p < 0.2) or in a reduction in the proportion of patients with HIC exceeding 15 mg/g (p < 0.2). During deferiprone exposure, new diabetes mellitus, a recognized consequence of inadequate iron control, was diagnosed in 17% of patients, most of whom had sustained HICs exceeding 15 mg/g for years; one woman died after 13 months of a regimen of deferiprone and low-dose deferasirox. During deferiprone exposure, serum ALT increased over baseline in 65% of patients. Mean serum ALT increased 6.6-fold (p < 0.001), often persisting for years. During deferasirox exposure, mean ALT was unchanged (p = 0.84). No significant differences between treatment groups were observed in the proportions of patients estimated to have elevated cardiac iron.

Conclusions

Deferiprone showed ineffectiveness and significant toxicity in most patients. Combination with low doses of first-line therapies did not improve its effectiveness. Exposure to deferiprone over six years, while the drug was unlicensed, in the face of ineffectiveness and serious toxicities, demands review of the standards of local medical practice. The limited scope of regulatory approval of deferiprone worldwide should restrict its exposure to the few patients genuinely unable to tolerate the two effective, first-line therapies.

]]>
<![CDATA[Epidemiological situation of yaws in the Americas: A systematic review in the context of a regional elimination goal]]> https://www.researchpad.co/article/5c7d95e4d5eed0c484734ee2

Background

Yaws is targeted for eradication by 2020 in the WHA66.12 resolution of the World Health Assembly. The objective of this study was to describe the occurrence of yaws in the Americas and to contribute to the compilation of evidence based on published data to undertake the certification of yaws eradication.

Methodology

A systematic review of the epidemiological situation of yaws in the Americas was performed by searching in MEDLINE, Embase, LILACS, SCOPUS, Web of Science, DARE and Cochrane Database of Systematic Reviews. Experts on the topic were consulted, and institutional WHO/PAHO library databases were reviewed.

Principal findings

Seventy-five full-text articles published between 1839 and 2012 met the inclusion criteria. Haiti and Jamaica were the two countries with the highest number of papers (14.7% and 12.0%, respectively). Three-quarters of the studies were conducted before 1970. Thirty-three countries reported yaws case counts or prevalence data. The largest historical foci were described in Brazil and Haiti. The most recent reported cases were recorded in eight countries: Suriname, Guyana, Colombia, Haiti, Martinique, Dominica, Trinidad and Tobago, and Brazil. Gaps in information, and heterogeneity in the methodologies used and in outcome reporting, were detected, making cross-national and chronological comparisons difficult.

Conclusions

The lack of recent yaws publications may reflect, in the best-case scenario, the interruption of yaws transmission. It should be possible to reach the eradication goal in the region of the Americas, but it is necessary to collect more information. We suggest updating the epidemiological status of yaws, especially in the two countries that need to assess ongoing transmission. Twenty-four countries need to demonstrate the interruption of transmission and declare their yaws endemicity status, and sixteen countries should declare whether they are yaws-free. It is necessary to formally verify the achievement of this goal in Ecuador.

]]>
<![CDATA[Use of non-insulin diabetes medicines after insulin initiation: A retrospective cohort study]]> https://www.researchpad.co/article/5c6dc9a1d5eed0c484529f41

Background

Clinical guidelines recommend that metformin be continued after insulin is initiated among patients with type 2 diabetes, yet little is known regarding how often metformin or other non-insulin diabetes medications are continued in this setting.

Methods

We conducted a retrospective cohort study to characterize rates and patterns of use of six classes of non-insulin diabetes medications: biguanides (metformin), sulfonylureas, thiazolidinediones (TZDs), glucagon-like peptide 1 (GLP1) receptor agonists, dipeptidyl peptidase 4 (DPP4) inhibitors, and sodium-glucose co-transporter 2 (SGLT2) inhibitors, among patients with type 2 diabetes initiating insulin. We used the 2010–2015 MarketScan Commercial Claims and Encounters data, examining 72,971 patients with type 2 diabetes aged 18–65 years who initiated insulin and had filled a prescription for a non-insulin diabetes medication in the 90 days prior to insulin initiation. Our primary outcome was the proportion of patients refilling the various non-insulin diabetes medications during the first 90 days after insulin initiation. We also used time-to-event analysis to characterize the time to discontinuation of specific medication classes.
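A hedged sketch of the time-to-discontinuation analysis, using a Kaplan-Meier estimator from the lifelines library (the paper's exact estimator is not stated here, and the DataFrame layout is an assumption):

    from lifelines import KaplanMeierFitter

    # One row per patient-drug: duration = months from insulin initiation to
    # discontinuation of the non-insulin drug; observed = 1 if discontinued,
    # 0 if censored (e.g., end of enrollment).
    kmf = KaplanMeierFitter()
    for drug_class, grp in claims.groupby("drug_class"):  # hypothetical DataFrame
        kmf.fit(grp["duration"], event_observed=grp["observed"], label=drug_class)
        print(drug_class, kmf.median_survival_time_)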

Results

Metformin was the most common non-insulin medication used prior to insulin initiation (N = 53,017, 72.7%), followed by sulfonylureas (N = 25,439, 34.9%) and DPP4 inhibitors (N = 8,540, 11.7%). More than four out of five patients (N = 65,902, 84.7%) refilled prescriptions for non-insulin diabetes medications within 90 days after insulin initiation. Within that period, metformin remained the most common medication, with the highest continuation rate of 84.6%, followed by SGLT2 inhibitors (81.9%) and TZDs (79.3%). Sulfonylureas were the least likely medications to be continued (73.6% continuation), though they remained the second most common medication class used after insulin initiation. The median time to discontinuation varied by therapeutic class, from the longest (26.4 months) among metformin users to the shortest (3.0 months) among SGLT2 inhibitor users.

Conclusion

While metformin was commonly continued among commercially insured adults starting insulin, rates of continuation of other non-insulin diabetes medications were also high. Further studies are needed to determine the comparative effectiveness and safety of continuing insulin secretagogues and newer diabetes medications after insulin initiation.

]]>
<![CDATA[Toxoplasmic retinochoroiditis: The influence of age, number of retinochoroidal lesions and genetic polymorphism for IFN-γ +874 T/A as risk factors for recurrence in a survival analysis]]> https://www.researchpad.co/article/5c6c756dd5eed0c4843cfd80

Purpose

To analyze risk factors for recurrent toxoplasmic retinochoroiditis.

Design

Single-center prospective case series.

Population and Methods

A total of 230 patients with toxoplasmic retinochoroiditis were prospectively followed to assess recurrences. All patients were treated with a specific drug regimen for toxoplasmosis in each episode of active retinochoroiditis. Individuals with chronic diseases and pregnant women were excluded. Survival analysis with an extended Cox regression model (the Prentice-Williams-Peterson counting process model) was performed to evaluate the time between recurrences according to potential risk factors: age, number of retinochoroidal lesions at initial evaluation, sex, and the interferon gamma (IFN-γ) +874 T/A gene polymorphism. Hazard ratios (HRs) and 95% confidence intervals (CIs) were provided to interpret the risk effects.
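A hedged sketch of the Prentice-Williams-Peterson counting-process model: a Cox regression on (start, stop] at-risk intervals, stratified by episode number so each recurrence gets its own baseline hazard. Column names and coding are assumptions; lifelines' CoxTimeVaryingFitter is one way to fit such a model:

    from lifelines import CoxTimeVaryingFitter

    # One row per at-risk interval: (start, stop] in days since study entry,
    # event = 1 if a recurrence ended the interval, episode = recurrence number.
    ctv = CoxTimeVaryingFitter()
    ctv.fit(
        intervals,  # hypothetical long-format DataFrame
        id_col="patient_id",
        start_col="start",
        stop_col="stop",
        event_col="event",
        strata=["episode"],
        formula="age + n_lesions + ifng_874_heterozygous",
    )
    ctv.print_summary()  # hazard ratios with 95% confidence intervals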

Results

One hundred sixty-two recurrence episodes were observed in 104 (45.2%) patients during follow-up that lasted from 269 to 1,976 days. Mean age at presentation was 32.8 years (standard deviation = 11.38). The risk of recurrence during follow-up was influenced by age (HR = 1.02, 95% CI = 1.01–1.04) and the number of retinochoroidal lesions at the beginning of the study (HR = 1.60, 95% CI = 1.07–2.40). Heterozygosis for the IFN-γ gene polymorphism at position +874 T/A was also associated with recurrence (HR = 1.49, 95% CI = 1.04–2.14).

Conclusion

The risk of ocular toxoplasmosis recurrence after an active episode increased with age and with the number of retinochoroidal lesions at presentation, which suggests that individuals presenting with multiple lesions, as well as the elderly, could benefit from prophylactic antimicrobial strategies against recurrence. The results also suggest an association between the IFN-γ gene polymorphism at position +874 T/A and recurrence.

]]>
<![CDATA[Differential completeness of spontaneous adverse event reports among hospitals/clinics, pharmacies, consumers, and pharmaceutical companies in South Korea]]> https://www.researchpad.co/article/5c6f14f7d5eed0c48467abbe

The differential patterns and characteristics of completeness in adverse event (AE) reports generated by hospitals/clinics, pharmacies, consumers, and pharmaceutical companies remain unknown. We therefore identified the characteristics of complete AE reports, compared with those of incomplete AE reports, using a completeness score. We used the Korea Institute of Drug Safety and Risk Management-Korea Adverse Event Reporting System Database (KIDS-KD) between January 1, 2016 and December 31, 2016. The completeness score was determined out of a total of 100 points, based on the presence of information on temporal relationships, age and sex of patients, AE progress, name of the reported medication, reporting group by profession, causality assessment, and informational text. AE reports were organized into four groups based on affiliation: hospitals/clinics, pharmacies, consumers, and pharmaceutical companies. Affiliations with median completeness scores greater than 80 points were classified as ‘well-documented’, and these reports were further analyzed by logistic regression to estimate adjusted odds ratios and 95% confidence intervals. We examined 228,848 individual reports and 735,745 drug-AE combinations. The median completeness scores were highest for hospitals/clinics (95 points), followed by consumers (85), pharmacies (75), and pharmaceutical companies (72). Reports with a causality assessment of ‘certain’, ‘probable’, or ‘possible’ were more likely to be ‘well-documented’ than reports with a causality assessment of ‘unlikely’. Serious AE reports were positively associated with being ‘well-documented’ and negatively associated with reporting by hospitals/clinics.
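The exact point allocation behind the 100-point completeness score is not given in this summary; a sketch with hypothetical weights shows the shape of such a rubric:

    # Hypothetical weights summing to 100; the allocation actually applied to
    # KIDS-KD reports is not specified in this abstract.
    WEIGHTS = {
        "temporal_relationship": 20, "age": 10, "sex": 10, "ae_progress": 15,
        "medication_name": 15, "reporter_profession": 10,
        "causality_assessment": 10, "informational_text": 10,
    }

    def completeness_score(report):
        """Sum the weights of the fields actually present in an AE report."""
        return sum(w for field, w in WEIGHTS.items() if report.get(field))

    print(completeness_score({"age": 45, "sex": "F", "medication_name": "X"}))  # 35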

]]>
<![CDATA[Real-life experience of lusutrombopag for cirrhotic patients with low platelet counts being prepared for invasive procedures]]> https://www.researchpad.co/article/5c70675dd5eed0c4847c6f1c

Background and aims

The present study aimed to report our real-life experience of the TPO receptor agonist lusutrombopag for cirrhotic patients with low platelet counts.

Methods

We studied platelet counts in 1,760 cirrhotic patients undergoing invasive procedures at our hospital between January 2014 and December 2017. In addition, we studied 25 patients who were administered lusutrombopag before invasive procedures between June 2017 and January 2018. The effectiveness of lusutrombopag in raising platelet counts and avoiding transfusion, as well as treatment-related adverse events, was analyzed.

Results

In the 1,760 cirrhotic patients who did not receive lusutrombopag prior to invasive procedures, the proportion of patients with platelet counts <50,000/μL who needed platelet transfusions was 66% (n = 27/41) for radiofrequency ablation, 43% (n = 21/49) for transarterial chemoembolization, and 55% (n = 21/38) for endoscopic injection sclerotherapy / endoscopic variceal ligation. In the 25 cirrhotic patients treated with lusutrombopag prior to invasive procedures, platelet counts increased significantly from baseline (82,000 ± 26,000 vs. 41,000 ± 11,000/μL, p < 0.01). Of these 25 patients, only 4 (16%) needed platelet transfusion before the invasive procedure. The proportion of patients with low platelet counts who needed platelet transfusions was significantly lower among patients treated with lusutrombopag than among those not treated with lusutrombopag (16% (4/25) vs. 54% (69/128), p = 0.001). Platelet counts after lusutrombopag treatment and prior to invasive procedures were lower in patients with a baseline platelet count ≤30,000/μL (n = 8) than in those with a baseline platelet count >30,000/μL (n = 17) (50,000 ± 20,000 vs. 86,000 ± 26,000/μL, p = 0.002). Patients with a baseline platelet count ≤30,000/μL and a spleen index (the transverse diameter multiplied by the vertical diameter, measured by ultrasonography) ≥40 cm2 (n = 3) had a lower response rate to lusutrombopag than those with a spleen index <40 cm2 (n = 5) (0% vs. 100%, p = 0.02). No hemorrhagic complications were observed. Recurrence of portal thrombosis was observed in one patient with a prior history of thrombosis, and thrombolysis therapy was required.
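The transfusion comparison above (4/25 with lusutrombopag versus 69/128 without) can be checked with a standard 2x2 test; the abstract does not name the test used, so Fisher's exact test here is an assumption:

    from scipy.stats import fisher_exact

    # Rows: lusutrombopag vs. no lusutrombopag; columns: transfused vs. not.
    table = [[4, 25 - 4], [69, 128 - 69]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")  # p near the reported 0.001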

Conclusions

Lusutrombopag is an effective and safe drug for thrombocytopenia in cirrhotic patients, and can reduce the frequency of platelet transfusions.

]]>
<![CDATA[Thoracic spine manipulation for the management of mechanical neck pain: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c6dc9ced5eed0c48452a202

Objective

To investigate the role of thoracic spine manipulation (TSM) on pain and disability in the management of mechanical neck pain (MNP).

Data sources

Electronic databases PubMed, CINAHL, PEDro, Embase, AMED, the Cochrane Library, and clinicaltrials.gov were searched in January 2018.

Study selection

Eligible studies were completed RCTs written in English that had at least two groups, with one group receiving TSM; included at least one measure of pain or disability; and enrolled patients with MNP of any duration. The search identified 1,717 potential articles, with 14 studies meeting the inclusion criteria.

Study appraisal and synthesis methods

Methodological quality was evaluated independently by two authors using the guidelines published by the Cochrane Collaboration. Pooled effects were estimated using a random-effects model with inverse-variance methods to calculate mean differences (MD) and 95% confidence intervals for pain (VAS 0–100 mm, NPRS 0–10 points; 0 = no pain) and disability (NDI and NPQ 0–100%; 0 = no disability).
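A hedged sketch of the pooling step described above: inverse-variance weights with a DerSimonian-Laird between-study variance (the abstract does not name the tau-squared estimator, so DerSimonian-Laird is an assumption, and the study-level inputs are made up):

    import numpy as np

    def random_effects_md(md, se):
        """DerSimonian-Laird random-effects pooled mean difference with 95% CI."""
        md, se = np.asarray(md, float), np.asarray(se, float)
        w = 1 / se**2  # fixed-effect (inverse-variance) weights
        fixed = np.sum(w * md) / np.sum(w)
        q = np.sum(w * (md - fixed) ** 2)  # Cochran's Q
        tau2 = max(0.0, (q - (len(md) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1 / (se**2 + tau2)  # random-effects weights
        pooled = np.sum(w_star * md) / np.sum(w_star)
        se_pooled = np.sqrt(1 / np.sum(w_star))
        return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

    # Illustrative per-study mean differences (VAS, 0-100 mm) and standard errors:
    print(random_effects_md([-15.0, -10.5, -14.2], [4.0, 3.2, 5.1]))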

Results

Across the included studies, there was increased risk of bias for inadequate provider and participant blinding. The GRADE approach demonstrated an overall level of evidence ranging from very low to moderate. Meta-analysis that compared TSM to thoracic or cervical mobilization revealed a significant effect favoring the TSM group for pain (MD -13.63; 95% CI: -21.79, -5.46) and disability (MD -9.93; 95% CI: -14.38, -5.48). Meta-analysis that compared TSM to standard care revealed a significant effect favoring the TSM group for pain (MD -13.21; 95% CI: -21.87, -4.55) and disability (MD -11.36; 95% CI: -18.93, -3.78) at short-term follow-up, and a significant effect for disability (MD -4.75; 95% CI: -6.54, -2.95) at long-term follow-up. Meta-analysis that compared TSM to cervical spine manipulation revealed a non-significant effect (MD 3.43; 95% CI: -7.26, 14.11) for pain without a distinction between immediate and short-term follow-up.

Limitations

The greatest limitation of this systematic review was the heterogeneity among the studies, which made it difficult to assess the true clinical benefit as well as the overall quality of the evidence.

Conclusions

TSM has been shown to be more beneficial than thoracic mobilization, cervical mobilization, and standard care in the short term, but no better than cervical manipulation or placebo thoracic spine manipulation, at improving pain and disability.

Trial registration

PROSPERO CRD42017068287

]]>
<![CDATA[Six-month follow up of a randomized clinical trial-phase I study in Indonesian adults and children: Safety and immunogenicity of Salmonella typhi polysaccharide-diphtheria toxoid (Vi-DT) conjugate vaccine]]> https://www.researchpad.co/article/5c6dc9b4d5eed0c48452a04e

Introduction

There is a high global incidence of typhoid fever, which causes an estimated 200,000 deaths annually. Typhoid fever also affects younger children, particularly in resource-limited settings in endemic countries. Typhoid vaccination is an important prevention tool against typhoid fever. However, the available polysaccharide typhoid vaccines are not recommended for children under 2 years of age. A new typhoid conjugate Vi-diphtheria toxoid (Vi-DT) vaccine has been developed for infant immunization. We aimed to define the safety and immunogenicity of the Vi-DT vaccine among adults and children in Indonesia.

Methods

An observer-blinded, comparative, randomized phase I safety study in two age-de-escalating cohorts was conducted in East Jakarta, Indonesia, from April 2017 to February 2018. We enrolled 100 healthy subjects in two age groups: adults (18–40 years old) and children (2–5 years old). These groups were randomized into study groups (Vi-DT vaccine) and comparator groups (Vi-polysaccharide (Vi-PS) vaccine and another additional vaccine), with doses administered 4 weeks apart. Subjects were followed up for six months.

Result

One hundred healthy adult and pediatric subjects completed the study. The Vi-DT and Vi-PS vaccines showed no difference in the intensity of immediate local and systemic events within 30 minutes post-vaccination. Overall, pain was the most common local reaction, and muscle pain the most common systemic reaction, in the first 72 hours. No serious adverse events were deemed related to vaccine administration. The first and second doses of the Vi-DT vaccine induced seroconversion and higher geometric mean titers (GMTs) in all subjects compared with baseline. However, in terms of GMT, the second dose of Vi-DT did not induce a booster response.

Conclusion

The Vi-DT vaccine is safe and immunogenic in adults and children older than two years. A single dose of the vaccine is able to produce seroconversion and high GMT in all individuals.

]]>
<![CDATA[A comparison of the Muenster, SIOP Boston, Brock, Chang and CTCAEv4.03 ototoxicity grading scales applied to 3,799 audiograms of childhood cancer patients treated with platinum-based chemotherapy]]> https://www.researchpad.co/article/5c6f1527d5eed0c48467ae60

Childhood cancer patients treated with platinum agents often develop hearing loss, and the degree is classified according to different scales globally. Our objective was to compare concordance between five well-known ototoxicity scales used for childhood cancer patients. Audiometric test results (n = 654 patients) were evaluated longitudinally and graded according to the Brock, Chang, International Society of Pediatric Oncology (SIOP) Boston, and Muenster scales and the U.S. National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE) version 4.03. Adverse effects of grade 2, 3, and 4 are considered to reflect a degree of hearing loss sufficient to interfere with day-to-day communication (≥ Chang grade 2a; ≥ Muenster grade 2b). We term this “deleterious hearing loss”. A total of 3,799 audiograms were evaluated. The prevalence of deleterious hearing loss according to the last available audiogram of each patient was 59.3% (388/654) according to Muenster, 48.2% (315/653) according to SIOP, 40.5% (265/652) according to Brock, 40.3% (263/652) according to Chang, and 57.5% (300/522) according to CTCAE v4.03. Overall concordance between the scales ranged from κ = 0.636 (Muenster vs. Chang) to κ = 0.975 (Brock vs. Chang). Muenster detected hearing loss the earliest in time, followed by Chang, SIOP, and Brock. Generally good concordance between the scales was observed, but there is still diversity in the definitions of functional outcomes, such as differences in how levels of hearing-loss severity are distributed, as well as in additional intermediate grades that take losses <40 dB into account. Regardless of the scale used, hearing function decreases over time; therefore, close monitoring of hearing function at baseline and with each cycle of platinum therapy should be conducted.
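The concordance values quoted above are kappa statistics; a minimal sketch of a pairwise comparison between two scales (the grade sequences are illustrative, and the paper may have used a weighted variant):

    from sklearn.metrics import cohen_kappa_score

    # Per-audiogram ototoxicity grades (0-4) assigned under two scales:
    brock = [0, 1, 2, 2, 3, 0, 4, 1]
    chang = [0, 1, 2, 3, 3, 0, 4, 1]
    print(cohen_kappa_score(brock, chang))  # 1.0 would be perfect agreement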

]]>
<![CDATA[Clinical indicators of adrenal insufficiency following discontinuation of oral glucocorticoid therapy: A Danish population-based self-controlled case series analysis]]> https://www.researchpad.co/article/5c75abded5eed0c484d07dd2

Background

Biochemical adrenal insufficiency induced by glucocorticoid treatment is prevalent, but data on the clinical implications are sparse. We investigated clinical consequences of glucocorticoid-induced adrenal insufficiency after oral glucocorticoid cessation.

Methods

We conducted a Danish population-based self-controlled case series utilizing medical registries. In this design, each individual serves as their own control, allowing event rates to be compared as a function of time and treatment. Clinical indicators of adrenal insufficiency were defined as diagnoses of gastrointestinal symptoms, hypotension, cardiovascular collapse, syncope, hyponatremia, and hypoglycemia. We included 286,680 persons who discontinued long-term (≥ 3 months) oral glucocorticoid treatment. We defined five risk periods alongside a reference period (before treatment): period 0 (on treatment), a withdrawal period (1 month before and after cessation), followed by three consecutive 2-month risk periods after withdrawal (periods 2–4).
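Self-controlled case series rates are typically estimated with conditional Poisson regression; a hedged sketch using an ordinary Poisson model with person fixed effects and a log person-time offset, which gives equivalent estimates on small illustrative data (column names are assumptions; real analyses use the conditional likelihood for efficiency):

    import numpy as np
    import statsmodels.formula.api as smf

    # One row per person-period: person, period ('reference', 'on_treatment',
    # 'withdrawal', 'post1'...'post3'), events (count), days (time at risk).
    model = smf.poisson(
        "events ~ C(period, Treatment('reference')) + C(person)",
        data=periods,  # hypothetical long-format DataFrame
        offset=np.log(periods["days"]),
    )
    result = model.fit()
    print(np.exp(result.params))  # incidence rate ratios vs. the reference period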

Results

Median age at cessation was 69 years, and 57% were female. Median treatment duration was 297 days, and the median cumulative dose was 3,000 mg of prednisolone equivalents. The incidence rates of hypotension, gastrointestinal symptoms, hypoglycemia, and hyponatremia were increased in the withdrawal period compared with the reference period before treatment started. Incidence rate ratios comparing the withdrawal period with the reference period were 2.5 [95% confidence interval (CI): 1.4–4.3] for hypotension, 1.7 (95% CI: 1.6–1.9) for gastrointestinal symptoms, 2.2 (95% CI: 0.7–7.3) for hypoglycemia, and 1.5 (95% CI: 1.1–2.0) for hyponatremia. During 7 months of follow-up, the rates of hypotension and gastrointestinal symptoms remained elevated compared with the reference period. Risk factors included use of antibiotics, increasing average daily glucocorticoid dose, cumulative dose, and age.

Conclusion

Oral glucocorticoid withdrawal was associated with adverse outcomes attributable to adrenal insufficiency. Our study underscores the need for future research to establish evidence-based clinical guidance on management of patients who discontinue oral glucocorticoids.

]]>
<![CDATA[The incidence of post-intubation hypertension and association with repeated intubation attempts in the emergency department]]> https://www.researchpad.co/article/5c6b266fd5eed0c484289a8b

Background

Studies in non-emergency department (ED) settings have reported relationships between post-intubation hypertension and poor patient outcomes. While ED-based studies have examined post-intubation hypotension and its sequelae, little is known about post-intubation hypertension and its risk factors in the ED setting. In this context, we aimed to identify the incidence of post-intubation hypertension in the ED and to test the hypothesis that repeated intubation attempts are associated with an increased risk of post-intubation hypertension.

Methods

This study is a secondary analysis of the data from a multicenter prospective observational study of emergency intubations in 15 EDs from 2012 through 2016. The analytic cohort comprised all adult non-cardiac-arrest patients undergoing orotracheal intubation without pre-intubation hypotension. The primary exposure was the repeated intubation attempts, defined as ≥2 laryngoscopic attempts. The outcome was post-intubation hypertension defined as an increase in systolic blood pressure (sBP) of >20% along with a post-intubation sBP of >160 mmHg. To investigate the association of repeated intubation attempts with the risk of post-intubation hypertension, we fit multivariable logistic regression models adjusting for ten potential confounders and patient clustering within the EDs.
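A hedged sketch of a clustered logistic model of the kind described: a random intercept per ED fitted with statsmodels' Bayesian mixed GLM (the covariates shown stand in for the ten confounders, which are not listed in this abstract):

    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    # One row per intubation: post_htn (0/1), repeated (1 if >=2 laryngoscopic
    # attempts), placeholder confounders, and 'ed' identifying the hospital.
    model = BinomialBayesMixedGLM.from_formula(
        "post_htn ~ repeated + age + sex",  # hypothetical column names
        {"ed": "0 + C(ed)"},                # random intercept for each ED
        data=intubations,
    )
    result = model.fit_vb()  # variational Bayes fit
    print(result.summary())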

Results

Of 3,097 patients, the median age was 69 years, 1,977 (64.0%) were men, and 991 (32.0%) underwent repeated intubation attempts. Post-intubation hypertension was observed in 276 (8.9%). In the unadjusted model, the incidence of post-intubation hypertension did not differ between the patients with single intubation attempt and those with repeated attempts (8.5% versus 9.8%, unadjusted P = 0.24). By contrast, after adjusting for potential confounders and patient clustering in the random-effects model, the patients who underwent repeated intubation attempts had a significantly higher risk of post-intubation hypertension (OR, 1.56; 95% CI, 1.11–2.18; adjusted P = 0.01).

Conclusions

We found that 8.9% of patients developed post-intubation hypertension and that repeated intubation attempts were associated with a significantly higher risk of post-intubation hypertension in the ED.

]]>
<![CDATA[How many to sample? Statistical guidelines for monitoring animal welfare outcomes]]> https://www.researchpad.co/article/5c5b524fd5eed0c4842bc630

There is increasing scrutiny of the animal welfare impacts of all animal use activities, including agriculture, the keeping of companion animals, racing and entertainment, research and laboratory use, and wildlife management programs. A common objective of animal welfare monitoring is to quantify the frequency of adverse animal events (e.g., injuries or mortalities). The frequency of such events can be used to provide pass/fail grades for animal use activities relative to a defined threshold and to identify areas for improvement through research. A critical question in these situations is: how many animals should be sampled? There are, however, few guidelines available for data collection or analysis, and consequently sample sizes can be highly variable. To address this question, we first evaluated the effect of sample size on precision and statistical power in reporting the frequency of adverse animal welfare outcomes. We next used these findings to assess the precision of published animal welfare investigations for a range of contentious animal use activities, including livestock transport, horse racing, and wildlife harvesting and capture. Finally, we evaluated the sample sizes required for comparing observed outcomes with specified standards through hypothesis testing. Our simulations revealed that the sample sizes required for reasonable levels of precision (i.e., a proportional distance to the upper confidence interval limit (δ) of ≤ 0.50) are greater than those commonly used for animal welfare assessments (i.e., >300 animals are needed). Larger sample sizes are required for adverse events with low frequency (i.e., <5%). For comparison with a required threshold standard, even larger sample sizes are required. We present guidelines, and an online calculator, for minimum sample sizes for use in future animal welfare assessments of animal management and research programs.
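The precision criterion can be made concrete: for a given adverse event frequency, find the smallest sample whose 95% CI upper limit sits within 50% of the point estimate. A sketch using the Wilson interval (the paper's online calculator may use a different interval):

    from statsmodels.stats.proportion import proportion_confint

    def min_n_for_precision(p, delta_max=0.5, alpha=0.05, n_max=100_000):
        """Smallest n whose proportional distance to the upper Wilson CI
        limit, (upper - p_hat) / p_hat, is no greater than delta_max."""
        for n in range(10, n_max):
            k = round(p * n)  # expected number of adverse events
            if k == 0:
                continue
            _, upper = proportion_confint(k, n, alpha=alpha, method="wilson")
            p_hat = k / n
            if (upper - p_hat) / p_hat <= delta_max:
                return n
        return None

    print(min_n_for_precision(0.05))  # low-frequency events need several hundred animals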

]]>
<![CDATA[A meta-analysis of anti-interleukin-13 monoclonal antibodies for uncontrolled asthma]]> https://www.researchpad.co/article/5c5ca306d5eed0c48441f014

A growing number of clinical trials have assessed the clinical benefit of anti-interleukin (IL)-13 monoclonal antibodies for uncontrolled asthma. The aim of this study was to evaluate the efficacy and safety of anti-IL-13 monoclonal antibodies for uncontrolled asthma. Major databases were searched for randomized controlled trials comparing anti-IL-13 treatment with placebo in uncontrolled asthma. Outcomes, including asthma exacerbation rate, forced expiratory volume in 1 second (FEV1), Asthma Quality of Life Questionnaire (AQLQ) scores, rescue medication use, and adverse events, were extracted from the included studies for systematic review and meta-analysis. Five studies involving 3,476 patients and two anti-IL-13 antibodies (lebrikizumab and tralokinumab) were included in this meta-analysis. Compared with placebo, anti-IL-13 treatment was associated with significant improvements in asthma exacerbation rate, FEV1, and AQLQ scores, and with reduced rescue medication use. Adverse events and serious adverse events were similar between the two groups. Subgroup analysis showed that patients with high periostin levels had a lower risk of asthma exacerbation after receiving anti-IL-13 treatment. Our study suggests that anti-IL-13 monoclonal antibodies could improve the management of uncontrolled asthma. Periostin may be a good biomarker for identifying the specific subgroup likely to respond better to anti-IL-13 treatment. Given that blocking IL-13 alone may not be enough to achieve asthma control, owing to the overlapping pathophysiological roles of IL-13 and IL-4 in inflammatory pathways, combined blockade of IL-13 and IL-4 with monoclonal antibodies may be more promising.

]]>