ResearchPad - adverse-events https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Safety of tunneled central venous catheters in pediatric hematopoietic stem cell recipients with severe primary immunodeficiency diseases]]> https://www.researchpad.co/article/elastic_article_14693 Tunneled central venous catheters (TCVCs) provide prolonged intravenous access for pediatric patients with severe primary immunodeficiency disease (PID) undergoing hematopoietic stem cell transplantation (HSCT). However, little is known about the epidemiology and clinical significance of TCVC-related morbidity in this particular patient group. We conducted a retrospective analysis of patients with severe PID who received percutaneous landmark-guided TCVC implantation prior to HSCT. We analyzed 92 consecutive TCVC implantations in 69 patients (median [interquartile range] age 3.0 [0–11] years) with severe combined immune deficiency (n = 39, 42.4%), chronic granulomatous disease (n = 17, 18.4%), and other rare PID syndromes (n = 36, 39.2%). The median length of TCVC observation was 144.1 (85.5–194.6) days, with a total of 14,040 catheter days at risk (cdr). The overall rate of adverse events was 17.4% (n = 16) during catheter insertion and 25.0% during the catheter dwell period (n = 23, catheter risk [CR] per 1000 cdr = 1.64). The most common complication was TCVC-related infection, with an overall prevalence of 9.8% (n = 9, CR = 0.64), followed by late dislocation (n = 6, 6.5%, CR = 0.43), early dislocation (n = 4, 4.3%), and catheter dysfunction (n = 4, 4.3%, CR = 0.28). TCVCs are safe in children with severe PID undergoing HSCT, with relatively low rates of TCVC-related infection.
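The catheter-risk (CR) values quoted in this abstract are incidence densities (events per 1,000 catheter days at risk); a minimal sketch of that calculation, using only the counts reported above:

```python
def rate_per_1000_days(events, days_at_risk):
    """Incidence density: events per 1,000 catheter days at risk (cdr)."""
    return 1000 * events / days_at_risk

# Counts reported in the abstract: 23 dwell-period adverse events and
# 9 TCVC-related infections over 14,040 catheter days at risk.
print(round(rate_per_1000_days(23, 14040), 2))  # 1.64
print(round(rate_per_1000_days(9, 14040), 2))   # 0.64
```

Both values reproduce the CR figures given in the abstract.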

]]>
<![CDATA[Late‐Onset Immunotherapy Toxicity and Delayed Autoantibody Changes: Checkpoint Inhibitor–Induced Raynaud's‐Like Phenomenon]]> https://www.researchpad.co/article/elastic_article_6602 Autoantibody analysis may provide insight into the mechanism, nature, and timing of immune‐related adverse events. This report describes a case of immune checkpoint inhibitor‐induced late‐onset Raynaud's‐like phenomenon in a patient receiving combination immunotherapy.

]]>
<![CDATA[Efficacy of adjuvant chemotherapy with S-1 in stage II oral squamous cell carcinoma patients: A comparative study using the propensity score matching method]]> https://www.researchpad.co/article/N83ad1f15-cdbb-4f4c-8d9c-388a45a97cce

It has been reported that 20% of early-stage oral squamous cell carcinoma (OSCC) patients treated with surgery alone (SA) may exhibit postoperative relapse within 2–3 years and have poor prognoses. We aimed to determine the safety of S-1 adjuvant chemotherapy and the potential differences in disease-free survival (DFS) between patients with T2N0 (stage II) OSCC treated with S-1 adjuvant therapy (S-1) and those treated with SA. This single-center retrospective cohort study was conducted at Kumamoto University between April 2004 and March 2012 and included 95 patients with stage II OSCC. The overall cohort (OC) and the propensity score-matched cohort (PSMC) were analyzed. In the OC, 71 and 24 patients received SA and S-1, respectively. The time to relapse (TTR), DFS, and overall survival were better in the S-1 group, but the differences were not significant. In the PSMC, 20 patients each received SA and S-1. The TTR was significantly longer in the S-1 group than in the SA group, and the DFS was significantly improved in the former. S-1 adjuvant chemotherapy may be more effective than SA in early-stage OSCC.
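The propensity score-matched cohort above pairs S-1 patients with comparable SA patients. The abstract does not describe the matching algorithm, so as an illustration only, here is a toy greedy 1:1 nearest-neighbor match within a caliper, applied to hypothetical pre-computed propensity scores (all names and values are invented):

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, control: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs within the caliper.
    """
    pairs = []
    available = dict(control)
    # Match highest-score (hardest-to-match) treated patients first.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]   # matched controls are used only once
    return pairs

treated = {"T1": 0.62, "T2": 0.35}
control = {"C1": 0.60, "C2": 0.37, "C3": 0.10}
print(greedy_match(treated, control))  # [('T1', 'C1'), ('T2', 'C2')]
```

Real analyses typically estimate the scores with a logistic model over baseline covariates before matching; only the matching step is sketched here.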

]]>
<![CDATA[Ivermectin as an adjuvant to anti-epileptic treatment in persons with onchocerciasis-associated epilepsy: A randomized proof-of-concept clinical trial]]> https://www.researchpad.co/article/N2a703e18-6320-408f-bd4d-1f677396d877

Introduction

Recent findings from onchocerciasis-endemic foci suggest that increasing ivermectin coverage reduces epilepsy incidence, and anecdotal evidence suggests a reduction in seizure frequency in persons with onchocerciasis-associated epilepsy treated with ivermectin. We conducted a randomized clinical trial to assess whether ivermectin treatment decreases seizure frequency.

Methods

A proof-of-concept randomized clinical trial was conducted in the Logo health zone in the Ituri province, Democratic Republic of Congo, to compare seizure frequencies in onchocerciasis-infected persons with epilepsy (PWE) randomized to one of two treatment arms: the anti-epileptic drug phenobarbital supplemented with ivermectin, versus phenobarbital alone. The primary endpoint was defined as the probability of being seizure-free at month 4. A secondary endpoint was defined as >50% reduction in seizure frequency at month 4, compared to baseline. Both endpoints were analyzed using multiple logistic regression. In longitudinal analysis, the probability of seizure freedom during the follow-up period was assessed for both treatment arms by fitting a logistic regression model using generalized estimating equations (GEE).

Results

Ninety PWE enrolled between October and November 2017 were eligible for analysis. A multiple logistic regression analysis showed a borderline association between ivermectin treatment and being seizure-free at month 4 (OR: 1.652, 95% CI 0.975–2.799; p = 0.062). There was no significant difference in the probability of experiencing >50% reduction of the seizure frequency at month 4 between the two treatment arms. Also, treatment with ivermectin did not significantly increase the odds of being seizure-free during the individual follow-up visits.

Conclusion

Whether ivermectin has added value in reducing the frequency of seizures in PWE treated with anti-epileptic drugs (AEDs) remains to be determined. A larger study in persons with onchocerciasis-associated epilepsy (OAE) on a stable AED regimen, and in persons with recent epilepsy onset, should be considered to further investigate the potential beneficial effect of ivermectin treatment in persons with OAE.

Trial registration

Registration: www.clinicaltrials.gov; NCT03052998.

]]>
<![CDATA[Acute kidney injury and adverse renal events in patients receiving SGLT2-inhibitors: A systematic review and meta-analysis]]> https://www.researchpad.co/article/N25fa93a0-706a-4469-9476-c6f8ced4ff6a

Background

Sodium-glucose cotransporter-2 inhibitors (SGLT2is) represent a new class of oral hypoglycemic agents used in the treatment of type 2 diabetes mellitus. They have a beneficial effect on the progression of chronic kidney disease, but there is concern that they might cause acute kidney injury (AKI).

Methods and findings

We conducted a systematic review and meta-analysis of the effect of SGLT2is on renal adverse events (AEs) in randomized controlled trials and controlled observational studies. PubMed, EMBASE, Cochrane library, and ClinicalTrials.gov were searched without date restriction until 27 September 2019. Data extraction was performed using a standardized data form, and any discrepancies were resolved by consensus. One hundred and twelve randomized trials (n = 96,722) and 4 observational studies with 5 cohorts (n = 83,934) with a minimum follow-up of 12 weeks that provided information on at least 1 adverse renal outcome (AKI, combined renal AE, or hypovolemia-related events) were included. In 30 trials, 410 serious AEs due to AKI were reported. SGLT2is reduced the odds of suffering AKI by 36% (odds ratio [OR] 0.64 [95% confidence interval (CI) 0.53–0.78], p < 0.001). A total of 1,089 AKI events of any severity (AEs and serious AEs [SAEs]) were published in 41 trials (OR 0.75 [95% CI 0.66–0.84], p < 0.001). Empagliflozin, dapagliflozin, and canagliflozin had a comparable benefit on the SAE and AE rate. AEs related to hypovolemia were more commonly reported in SGLT2i-treated patients (OR 1.20 [95% CI 1.10–1.31], p < 0.001). In the observational studies, 777 AKI events were reported. The odds of suffering AKI were reduced in patients receiving SGLT2is (OR 0.40 [95% CI 0.33–0.48], p < 0.001). Limitations of this study are the reliance on nonadjudicated safety endpoints, discrepant inclusion criteria and baseline hypoglycemic therapy between studies, inconsistent definitions of renal AEs and hypovolemia, varying follow-up times in different studies, and a lack of information on the severity of AKI (stages I–III).
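Pooling odds ratios across trials, as in this meta-analysis, is commonly done with the fixed-effect Mantel-Haenszel estimator. The abstract does not state which model the authors used, so the sketch below applies that estimator to hypothetical 2×2 tables (all counts are invented, not taken from the review):

```python
def mantel_haenszel_or(tables):
    """Fixed-effect Mantel-Haenszel pooled odds ratio.

    tables: list of (a, b, c, d) 2x2 counts per trial:
      a = events in treated, b = non-events in treated,
      c = events in control, d = non-events in control.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical trials: AKI events vs. no events, SGLT2i vs. placebo.
trials = [(10, 990, 20, 980), (5, 495, 9, 491)]
print(round(mantel_haenszel_or(trials), 2))  # 0.51
```

An OR below 1, as here, corresponds to the direction of effect reported in the abstract (fewer AKI events under SGLT2is).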

Conclusions

SGLT2is reduced the odds of suffering AKI, with and without hospitalization, in both randomized trials and the real-world setting, even though more hypovolemia-related AEs were reported.

]]>
<![CDATA[Prevalence of drug–drug interaction in atrial fibrillation patients based on a large claims data]]> https://www.researchpad.co/article/N8861b373-c994-4a77-acd1-a0d16f2f19bb

This study aimed to determine the prevalence of drug–drug interactions (DDIs) and the bleeding rate in atrial fibrillation (AF) patients receiving anticoagulants in a clinical setting. We used large claims data of AF patients obtained from the Japan Medical Data Center. The prevalence of DDIs and of cases leading to bleeding events was surveyed for clinically relevant DDIs extracted from three sources: 1) DDIs reported for ≥4 patients in a spontaneous adverse event reporting system (Japanese Adverse Drug Events Report system; JADER); 2) DDIs cited in the package inserts of each anticoagulant (each combination assessed according to the “Drug interaction 2015” list); and 3) DDIs between warfarin and quinolone antibiotics. In the analysis of each patient’s prescriptions obtained from the claims data, DDIs were categorized by mechanism as pharmacokinetic (modulation of anticoagulant blood concentration via cytochrome P450 (CYP) enzymes, transporters, etc.), pharmacodynamic (combinations of drugs with similar pharmacological actions), or both. AF patients with and without bleeding after anticoagulant administration were compared. Bleeding was observed in 220/3290 (6.7%) AF patients. The bleeding rate in patients with both pharmacokinetic and pharmacodynamic DDI mechanisms (26.3%) was higher than that in patients with either mechanism alone (8.6% and 9.2%, respectively) or without DDIs (4.9%). The odds ratio for bleeding in AF patients with both pharmacokinetic and pharmacodynamic DDIs was 7.18 (95% CI 4.69–11.00, p<0.001). Our study concluded that DDIs involving both mechanisms lead to more serious outcomes than single-mechanism DDIs in AF patients. We determined the prevalence and frequency of bleeding for anticoagulant-related DDIs. To manage DDIs, patients with both pharmacokinetic and pharmacodynamic DDI mechanisms should be closely monitored for initial symptoms of bleeding within the first 3 months.
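An odds ratio like the 7.18 (95% CI 4.69–11.00) above is computed from a 2×2 exposure-by-outcome table with a Wald interval on the log odds ratio. The abstract reports only percentages, so the counts below are hypothetical (chosen merely to mirror the 26.3% vs. 4.9% bleeding rates):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table.

    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 15/57 bled with both DDI mechanisms (26.3%),
# 100/2040 bled without any DDI (4.9%).
or_, lo, hi = odds_ratio_ci(15, 42, 100, 1940)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 6.93 3.72 12.92
```

With real patient-level counts, the same formula would reproduce the study's reported interval.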

]]>
<![CDATA[Single-center retrospective study of the effectiveness and toxicity of the oral iron chelating drugs deferiprone and deferasirox]]> https://www.researchpad.co/article/5c9900fdd5eed0c484b95e7f

Background

Iron overload, resulting from blood transfusions in patients with chronic anemias, has historically been controlled with regular deferoxamine, but its parenteral requirement encouraged studies of orally-active agents, including deferasirox and deferiprone. Deferasirox, licensed by the US Food and Drug Administration in 2005 based upon the results of randomized controlled trials, is now first-line therapy worldwide. In contrast, early investigator-initiated trials of deferiprone were prematurely terminated after investigators raised safety concerns. The FDA declined market approval of deferiprone; years later, it licensed the drug as “last resort” therapy, to be prescribed only if first-line drugs had failed. We undertook to evaluate the long-term effectiveness and toxicities of deferiprone and deferasirox in one transfusion clinic.

Methods and findings

Under an IRB-approved study, we retrospectively inspected the electronic medical records of consented iron-loaded patients managed between 2009 and 2015 at The University Health Network (UHN), Toronto. We compared changes in liver and heart iron, adverse effects and other outcomes, in patients treated with deferiprone or deferasirox.

Results

Although deferiprone was unlicensed in Canada, one-third (n = 41) of locally-transfused patients had been switched from first-line, licensed therapies (deferoxamine or deferasirox) to regimens of unlicensed deferiprone. The primary endpoint of monitoring in iron overload, hepatic iron concentration (HIC), increased (worsened) during deferiprone monotherapy (mean, from 10±2 to 18±2 mg/g; p < 0.0003), exceeding the threshold for life-threatening complications (15 mg iron/g liver) in 50% of patients. During deferasirox monotherapy, mean HIC decreased (improved) (from 11±1 to 6±1 mg/g; p < 0.0001). Follow-up HICs were significantly different following deferiprone and deferasirox monotherapies (p < 0.0000002). Addition of low-dose deferoxamine (<40 mg/kg/day) to deferiprone did not result in reductions of HIC to <15 mg/g (baseline 20±4 mg/g; follow-up 18±4 mg/g; p = 0.2) or in a reduction in the proportion of patients with HIC exceeding 15 mg/g (p = 0.2). During deferiprone exposure, new diabetes mellitus, a recognized consequence of inadequate iron control, was diagnosed in 17% of patients, most of whom had sustained HICs exceeding 15 mg/g for years; one woman died after 13 months of a regimen of deferiprone and low-dose deferasirox. During deferiprone exposure, serum ALT increased over baseline in 65% of patients. Mean serum ALT increased 6.6-fold (p < 0.001), often persisting for years. During deferasirox exposure, mean ALT was unchanged (p = 0.84). No significant differences between treatment groups were observed in the proportions of patients estimated to have elevated cardiac iron.

Conclusions

Deferiprone showed ineffectiveness and significant toxicity in most patients. Combination with low doses of first-line therapies did not improve the effectiveness of deferiprone. Exposure to deferiprone, over six years while the drug was unlicensed, in the face of ineffectiveness and serious toxicities, demands review of the standards of local medical practice. The limited scope of regulatory approval of deferiprone, worldwide, should restrict its exposure to the few patients genuinely unable to tolerate the two effective, first-line therapies.

]]>
<![CDATA[Differential completeness of spontaneous adverse event reports among hospitals/clinics, pharmacies, consumers, and pharmaceutical companies in South Korea]]> https://www.researchpad.co/article/5c6f14f7d5eed0c48467abbe

The differential pattern and characteristics of completeness in adverse event (AE) reports generated by hospitals/clinics, pharmacies, consumers, and pharmaceutical companies remain unknown. Thus, we identified the characteristics of complete AE reports, compared with those of incomplete AE reports, using a completeness score. We used the Korea Institute of Drug Safety and Risk Management-Korea Adverse Event Reporting System Database (KIDS-KD) between January 1, 2016 and December 31, 2016. The completeness score was determined out of a total of 100 points, based on the presence of information on temporal relationships, age and sex of patients, AE progress, name of reported medication, reporting group by profession, causality assessment, and informational text. AE reports were organized into four groups based on affiliation: hospitals/clinics, pharmacies, consumers, and pharmaceutical companies. Affiliations that had median completeness scores greater than 80 points were classified as ‘well-documented’, and these reports were further analyzed by logistic regression to estimate adjusted odds ratios and 95% confidence intervals. We examined 228,848 individual reports and 735,745 drug-AE combinations. The median completeness scores were highest for hospitals/clinics (95 points), followed by consumers (85), pharmacies (75), and pharmaceutical companies (72). Reports with causality assessments of ‘certain’, ‘probable’, or ‘possible’ were more likely to be ‘well-documented’ than reports with causality assessments of ‘unlikely’. Serious AE reports were positively associated with being ‘well-documented’ and negatively associated with hospitals/clinics.

]]>
<![CDATA[Real-life experience of lusutrombopag for cirrhotic patients with low platelet counts being prepared for invasive procedures]]> https://www.researchpad.co/article/5c70675dd5eed0c4847c6f1c

Background and aims

The present study aimed to report our real-life experience of the thrombopoietin (TPO) receptor agonist lusutrombopag for cirrhotic patients with low platelet counts.

Methods

We studied platelet counts in 1,760 cirrhotic patients undergoing invasive procedures at our hospital between January 2014 and December 2017. In addition, we studied 25 patients who were administered lusutrombopag before invasive procedures between June 2017 and January 2018. The effectiveness of lusutrombopag in raising platelet counts and avoiding transfusion, as well as treatment-related adverse events, was analyzed.

Results

In the 1,760 cirrhotic patients who did not receive lusutrombopag prior to invasive procedures, the proportions of patients whose platelet counts were <50,000/μL and who needed platelet transfusions were 66% (n = 27/41) for radiofrequency ablation, 43% (n = 21/49) for transarterial chemoembolization, and 55% (n = 21/38) for endoscopic injection sclerotherapy / endoscopic variceal ligation, respectively. In the 25 cirrhotic patients treated with lusutrombopag prior to the invasive procedures, platelet counts significantly increased compared with baseline (82,000 ± 26,000 vs. 41,000 ± 11,000/μL) (p < 0.01). Of these 25 patients, only 4 (16%) needed platelet transfusion before the invasive procedures. The proportion of patients with low platelet counts who needed platelet transfusions was significantly lower among patients treated with lusutrombopag than among those not treated with lusutrombopag (16% (4/25) vs. 54% (69/128), p = 0.001). Platelet counts after lusutrombopag treatment and prior to invasive procedures were lower in patients with a baseline platelet count ≤30,000/μL (n = 8) than in those with a baseline platelet count >30,000/μL (n = 17) (50,000 ± 20,000 vs. 86,000 ± 26,000/μL, p = 0.002). Patients with a baseline platelet count ≤30,000/μL and a spleen index (calculated by multiplying the transverse diameter by the vertical diameter measured by ultrasonography) ≥40 cm2 (n = 3) had a lower response rate to lusutrombopag than those with a spleen index <40 cm2 (n = 5) (0% vs. 100%, p = 0.02). No hemorrhagic complications were observed. Recurrence of portal thrombosis was observed, and thrombolysis therapy required, in one patient who had a prior history of thrombosis.

Conclusions

Lusutrombopag is an effective and safe drug for thrombocytopenia in cirrhotic patients, and can reduce the frequency of platelet transfusions.

]]>
<![CDATA[Thoracic spine manipulation for the management of mechanical neck pain: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c6dc9ced5eed0c48452a202

Objective

To investigate the role of thoracic spine manipulation (TSM) on pain and disability in the management of mechanical neck pain (MNP).

Data sources

Electronic databases PubMed, CINAHL, PEDro, Embase, AMED, the Cochrane Library, and clinicaltrials.gov were searched in January 2018.

Study selection

Eligible studies were completed RCTs, written in English, had at least 2 groups with one group receiving TSM, had at least one measure of pain or disability, and included patients with MNP of any duration. The search identified 1717 potential articles, with 14 studies meeting inclusion criteria.

Study appraisal and synthesis methods

Methodological quality was evaluated independently by two authors using the guidelines published by the Cochrane Collaboration. Pooled effects were estimated using a random-effects model with inverse-variance methods to calculate mean differences (MD) and 95% confidence intervals for pain (VAS 0–100 mm, NPRS 0–10 pts; 0 = no pain) and disability (NDI and NPQ 0–100%; 0 = no disability).
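The pooled mean differences reported in the results come from a random-effects inverse-variance model; one standard implementation is the DerSimonian-Laird estimator, sketched here on hypothetical study-level data (the review's exact estimator and software are not stated in the abstract):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects inverse-variance pooling (DerSimonian-Laird).

    effects: per-study mean differences; variances: their sampling variances.
    Returns (pooled effect, 95% CI lower, 95% CI upper).
    """
    w = [1 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-study variance tau^2.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical mean pain differences (VAS 0-100) and variances from 3 trials.
mds = [-15.0, -10.0, -16.0]
variances = [9.0, 16.0, 25.0]
pooled, lo, hi = dersimonian_laird(mds, variances)
print(round(pooled, 1))  # -13.7
```

When the heterogeneity statistic Q is below its degrees of freedom, tau² is clipped to zero and the estimate coincides with the fixed-effect result, as in this toy example.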

Results

Across the included studies, there was an increased risk of bias due to inadequate provider and participant blinding. The GRADE approach demonstrated an overall level of evidence ranging from very low to moderate. Meta-analysis that compared TSM to thoracic or cervical mobilization revealed a significant effect favoring the TSM group for pain (MD -13.63; 95% CI: -21.79, -5.46) and disability (MD -9.93; 95% CI: -14.38, -5.48). Meta-analysis that compared TSM to standard care revealed a significant effect favoring the TSM group for pain (MD -13.21; 95% CI: -21.87, -4.55) and disability (MD -11.36; 95% CI: -18.93, -3.78) at short-term follow-up, and a significant effect for disability (MD -4.75; 95% CI: -6.54, -2.95) at long-term follow-up. Meta-analysis that compared TSM to cervical spine manipulation revealed a non-significant effect (MD 3.43; 95% CI: -7.26, 14.11) for pain, without a distinction between immediate and short-term follow-up.

Limitations

The greatest limitation of this systematic review was the heterogeneity among the included studies, which made it difficult to assess the true clinical benefit as well as the overall quality of the evidence.

Conclusions

TSM has been shown to be more beneficial than thoracic mobilization, cervical mobilization, and standard care in the short term, but no better than cervical manipulation or placebo thoracic spine manipulation, at improving pain and disability.

Trial registration

PROSPERO CRD42017068287

]]>
<![CDATA[Six-month follow up of a randomized clinical trial-phase I study in Indonesian adults and children: Safety and immunogenicity of Salmonella typhi polysaccharide-diphtheria toxoid (Vi-DT) conjugate vaccine]]> https://www.researchpad.co/article/5c6dc9b4d5eed0c48452a04e

Introduction

There is a high global incidence of typhoid fever, which causes an estimated 200,000 deaths annually. Typhoid fever also affects younger children, particularly in resource-limited settings in endemic countries. Typhoid vaccination is an important prevention tool against typhoid fever. However, the available polysaccharide typhoid vaccines are not recommended for children under 2 years of age. A new typhoid conjugate Vi-diphtheria toxoid (Vi-DT) vaccine has been developed for infant immunization. We aimed to define the safety and immunogenicity of the Vi-DT vaccine among adults and children in Indonesia.

Methods

An observational, blinded, comparative, randomized, phase I safety study in two age de-escalating cohorts was conducted in East Jakarta, Indonesia, from April 2017 to February 2018. We enrolled 100 healthy subjects in 2 age groups: adults (18–40 years old) and children (2–5 years old). These groups were randomized into study groups (Vi-DT vaccine) and comparator groups (Vi-polysaccharide (Vi-PS) vaccine and another additional vaccine), with doses administered 4 weeks apart. Subjects were followed for up to six months.

Result

One hundred healthy adult and child subjects completed the study. The Vi-DT and Vi-PS vaccines showed no difference in the intensity of immediate local and systemic events within 30 minutes post-vaccination. Overall, pain was the most common local reaction, and muscle pain was the most common systemic reaction in the first 72 hours. No serious adverse events were deemed related to vaccine administration. The first and second doses of the Vi-DT vaccine induced seroconversion and higher geometric mean titers (GMT) in all subjects compared to baseline. However, in terms of GMT, the second dose of Vi-DT did not induce a booster response.

Conclusion

The Vi-DT vaccine is safe and immunogenic in adults and children older than two years. A single dose of the vaccine is able to produce seroconversion and high GMT in all individuals.

]]>
<![CDATA[A comparison of the Muenster, SIOP Boston, Brock, Chang and CTCAEv4.03 ototoxicity grading scales applied to 3,799 audiograms of childhood cancer patients treated with platinum-based chemotherapy]]> https://www.researchpad.co/article/5c6f1527d5eed0c48467ae60

Childhood cancer patients treated with platinum agents often develop hearing loss, and the degree is classified according to different scales globally. Our objective was to compare concordance between five well-known ototoxicity scales used for childhood cancer patients. Audiometric test results (n = 654 patients) were evaluated longitudinally and graded according to the Brock, Chang, International Society of Pediatric Oncology (SIOP) Boston, and Muenster scales and the U.S. National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE) version 4.03. Adverse effects of grade 2, 3, and 4 are considered to reflect a degree of hearing loss sufficient to interfere with day-to-day communication (≥ Chang grade 2a; ≥ Muenster grade 2b). We term this “deleterious hearing loss”. A total of 3,799 audiograms were evaluated. The prevalence of deleterious hearing loss according to the last available audiogram of each patient was 59.3% (388/654) according to Muenster, 48.2% (315/653) according to SIOP, 40.5% (265/652) according to Brock, 40.3% (263/652) according to Chang, and 57.5% (300/522) according to CTCAEv4.03. Overall concordance between the scales ranged from κ = 0.636 (Muenster vs. Chang) to κ = 0.975 (Brock vs. Chang). Muenster detected hearing loss earliest, followed by Chang, SIOP, and Brock. Generally good concordance between the scales was observed, but there is still diversity in definitions of functional outcomes, such as differences in the distribution of hearing-loss severity levels, and some scales additionally include intermediate grades for losses <40 dB. Regardless of the scale used, hearing function decreases over time; therefore, close monitoring of hearing function at baseline and with each cycle of platinum therapy should be conducted.
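The concordance values above are Cohen's kappa statistics; a sketch of unweighted kappa for two scales grading the same audiograms, using hypothetical grades (the study may have used a weighted variant):

```python
from collections import Counter

def cohens_kappa(grades_a, grades_b):
    """Unweighted Cohen's kappa for two raters/scales on the same items."""
    n = len(grades_a)
    observed = sum(a == b for a, b in zip(grades_a, grades_b)) / n
    freq_a = Counter(grades_a)
    freq_b = Counter(grades_b)
    # Chance-expected agreement from the marginal grade frequencies.
    expected = sum(freq_a[g] * freq_b[g] for g in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical ototoxicity grades (0-4) from two scales on 8 audiograms.
scale_a = [0, 1, 2, 2, 3, 0, 1, 4]
scale_b = [0, 1, 2, 3, 3, 0, 1, 4]
print(round(cohens_kappa(scale_a, scale_b), 2))  # 0.84
```

Kappa of 1.0 means perfect agreement, 0 means chance-level agreement; the study's observed range (0.636–0.975) sits in the "good to near-perfect" band.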

]]>
<![CDATA[The incidence of post-intubation hypertension and association with repeated intubation attempts in the emergency department]]> https://www.researchpad.co/article/5c6b266fd5eed0c484289a8b

Background

Studies in non-emergency department (ED) settings have reported relationships between post-intubation hypertension and poor patient outcomes. While ED-based studies have examined post-intubation hypotension and its sequelae, little is known about post-intubation hypertension and its risk factors in the ED setting. In this context, we aimed to identify the incidence of post-intubation hypertension in the ED and to test the hypothesis that repeated intubation attempts are associated with an increased risk of post-intubation hypertension.

Methods

This study is a secondary analysis of the data from a multicenter prospective observational study of emergency intubations in 15 EDs from 2012 through 2016. The analytic cohort comprised all adult non-cardiac-arrest patients undergoing orotracheal intubation without pre-intubation hypotension. The primary exposure was the repeated intubation attempts, defined as ≥2 laryngoscopic attempts. The outcome was post-intubation hypertension defined as an increase in systolic blood pressure (sBP) of >20% along with a post-intubation sBP of >160 mmHg. To investigate the association of repeated intubation attempts with the risk of post-intubation hypertension, we fit multivariable logistic regression models adjusting for ten potential confounders and patient clustering within the EDs.

Results

Of 3,097 patients, the median age was 69 years, 1,977 (64.0%) were men, and 991 (32.0%) underwent repeated intubation attempts. Post-intubation hypertension was observed in 276 (8.9%). In the unadjusted model, the incidence of post-intubation hypertension did not differ between the patients with single intubation attempt and those with repeated attempts (8.5% versus 9.8%, unadjusted P = 0.24). By contrast, after adjusting for potential confounders and patient clustering in the random-effects model, the patients who underwent repeated intubation attempts had a significantly higher risk of post-intubation hypertension (OR, 1.56; 95% CI, 1.11–2.18; adjusted P = 0.01).

Conclusions

We found that 8.9% of patients developed post-intubation hypertension and that repeated intubation attempts were associated with a significantly higher risk of post-intubation hypertension in the ED.

]]>
<![CDATA[How many to sample? Statistical guidelines for monitoring animal welfare outcomes]]> https://www.researchpad.co/article/5c5b524fd5eed0c4842bc630

There is increasing scrutiny of the animal welfare impacts of all animal use activities, including agriculture, the keeping of companion animals, racing and entertainment, research and laboratory use, and wildlife management programs. A common objective of animal welfare monitoring is to quantify the frequency of adverse animal events (e.g., injuries or mortalities). The frequency of such events can be used to provide pass/fail grades for animal use activities relative to a defined threshold and to identify areas for improvement through research. A critical question in these situations is: how many animals should be sampled? There are, however, few guidelines available for data collection or analysis, and consequently sample sizes can be highly variable. To address this question, we first evaluated the effect of sample size on precision and statistical power in reporting the frequency of adverse animal welfare outcomes. We next used these findings to assess the precision of published animal welfare investigations for a range of contentious animal use activities, including livestock transport, horse racing, and wildlife harvesting and capture. Finally, we evaluated the sample sizes required for comparing observed outcomes with specified standards through hypothesis testing. Our simulations revealed that the sample sizes required for reasonable levels of precision (i.e., proportional distance to the upper confidence interval limit (δ) of ≤ 0.50) are greater than those that have been commonly used for animal welfare assessments (i.e., >300). Larger sample sizes are required for adverse events with low frequency (i.e., <5%). For comparison with a required threshold standard, even larger sample sizes are required. We present guidelines, and an online calculator, for minimum sample sizes for use in future animal welfare assessments of animal management and research programs.
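The precision criterion above can be made concrete. Assuming δ is the proportional distance from the observed proportion to the upper limit of a Wilson score interval (the paper's exact interval method is an assumption here), the minimum sample size satisfying δ ≤ 0.50 can be found by direct search:

```python
import math

def wilson_upper(p_hat, n, z=1.96):
    """Upper limit of the Wilson score 95% CI for a proportion."""
    denom = 1 + z ** 2 / n
    centre = p_hat + z ** 2 / (2 * n)
    spread = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2))
    return (centre + spread) / denom

def min_n_for_precision(p_hat, delta_max=0.5, step=10):
    """Smallest n (searched in steps) with proportional distance
    to the upper CI limit, (upper - p_hat) / p_hat, <= delta_max."""
    n = step
    while (wilson_upper(p_hat, n) - p_hat) / p_hat > delta_max:
        n += step
    return n

# Rarer adverse events need far larger samples for the same precision,
# matching the paper's observation about low-frequency events (<5%).
print(min_n_for_precision(0.10) < min_n_for_precision(0.02))  # True
```

This mirrors the abstract's conclusion qualitatively: for a 2% event frequency, the required sample is several times larger than for a 10% frequency at the same δ.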

]]>
<![CDATA[A meta-analysis of anti-interleukin-13 monoclonal antibodies for uncontrolled asthma]]> https://www.researchpad.co/article/5c5ca306d5eed0c48441f014

A growing number of clinical trials have assessed the clinical benefit of anti-interleukin (IL)-13 monoclonal antibodies for uncontrolled asthma. The aim of this study was to evaluate the efficacy and safety of anti-IL-13 monoclonal antibodies for uncontrolled asthma. Major databases were searched for randomized controlled trials comparing anti-IL-13 treatment with a placebo in uncontrolled asthma. Outcomes, including asthma exacerbation rate, forced expiratory volume in 1 second (FEV1), Asthma Quality of Life Questionnaire (AQLQ) scores, rescue medication use, and adverse events, were extracted from included studies for systematic review and meta-analysis. Five studies involving 3476 patients and two anti-IL-13 antibodies (lebrikizumab and tralokinumab) were included in this meta-analysis. Compared to placebo, anti-IL-13 treatments were associated with significant improvements in asthma exacerbation rate, FEV1, and AQLQ scores, and with a reduction in rescue medication use. Adverse events and serious adverse events were similar between the two groups. Subgroup analysis showed that patients with high periostin levels had a lower risk of asthma exacerbation after receiving anti-IL-13 treatment. Our study suggests that anti-IL-13 monoclonal antibodies could improve the management of uncontrolled asthma. Periostin may be a good biomarker for identifying the specific subgroup likely to respond better to anti-IL-13 treatments. Given that blocking IL-13 alone may not be enough to achieve asthma control, owing to the overlapping pathophysiological roles of IL-13 and IL-4 in inflammatory pathways, combined blockade of IL-13 and IL-4 with monoclonal antibodies may be more promising.

]]>
<![CDATA[A test of positive suggestions about side effects as a way of enhancing the analgesic response to NSAIDs]]> https://www.researchpad.co/article/5c37b79dd5eed0c48449066e

Side effects are frequent in pharmacological pain management, potentially preceding analgesia and limiting drug tolerability. Discussing side effects is part of informed consent, yet can favor nocebo effects. This study aimed to test whether a positive suggestion regarding side effects, framing them as reminders that the medication has been absorbed, might favor analgesia in a clinical interaction model. Sixty-six healthy males participated in a study “to validate pupillometry as an objective measure of analgesia”. Participants were unknowingly randomized double-blind to positive vs. control information about side effects embedded in a video about the study drugs. Sequences of moderately painful heat stimuli applied before and after treatment with diclofenac and atropine served to evaluate analgesia. Atropine was deceptively presented as a co-analgesic but was used to induce side effects. Adverse events (AEs) were collected with the General Assessment of Side Effects (GASE) questionnaire prior to the second induced pain sequence. Debriefing fully informed participants of the purpose of the study and showed them the two videos. The combination of medications led to significant analgesia, without a between-group difference. Positive information about side effects increased the attribution of AEs to the treatment compared with the control information. The total GASE score was correlated with analgesia, i.e., the more AEs reported, the stronger the analgesia. Interestingly, there was a significant between-group difference in this correlation: the GASE score and analgesia correlated only in the positive information group. This provides evidence for a selective link between AEs and pain relief in the group who received the suggestion that AEs could be taken as a sign “that help was on the way”. During debriefing, 65% of participants said they would prefer to receive the positive message in a clinical context. Although the present results cannot be translated immediately to clinical pain conditions, they do indicate the importance of testing this type of modulation in a clinical context.

]]>
<![CDATA[Efficacy, safety, and resistance profile of osimertinib in T790M mutation-positive non-small cell lung cancer in real-world practice]]> https://www.researchpad.co/article/5c3fa577d5eed0c484ca4b0e

The efficacy and safety of osimertinib were demonstrated in clinical trials; however, real-world clinical data, particularly the resistance profile, are limited. Here, we investigated the efficacy, safety, and resistance profile of osimertinib in real-world practice. We reviewed medical records of T790M mutation-positive lung cancer patients who started osimertinib between February 2016 and June 2017. Molecular pathologic data of biopsy samples obtained after acquisition of resistance to osimertinib were also analyzed. The study included 23 patients with a median age of 59 years. The median follow-up duration was 11.9 months (IQR, 4.7–15.8). Objective response was achieved in 17 (73.9%) patients, and the disease was controlled in 22 (95.7%) patients. Median progression-free survival (PFS) was 7.4 months (95% CI, 3.6–11.0). Adverse events were minimal except for one case of pneumonitis. Of 14 patients experiencing disease progression, 10 underwent re-biopsy. The T790M mutation disappeared in seven patients (70%), and one showed wild-type conversion. PFS was shorter in the T790M-loss group than in the T790M-persistent group (4.4 vs. 7.7 months). Two patients with small cell transformation responded well to subsequent chemotherapy. One patient developed a C797S mutation that became undetectable after two cycles of gemcitabine and cisplatin followed by six cycles of pembrolizumab, after which the patient responded well to osimertinib. In conclusion, osimertinib showed favorable efficacy and safety in real-world practice comparable to those observed in clinical trials. Repeat biopsy after the acquisition of resistance to osimertinib is helpful to direct further treatment strategies.

]]>
<![CDATA[Risk interval analysis of emergency room visits following colonoscopy in patients with inflammatory bowel disease]]> https://www.researchpad.co/article/5c3fa587d5eed0c484ca5700

Background and aims

Prior studies suggest that colonoscopy may exacerbate inflammatory bowel disease (IBD) symptoms. Thus, our study aimed to determine risk of emergency room (ER) visits associated with colonoscopy among IBD patients and evaluate potential modifiers of this risk.

Methods

The study population included IBD patients in the Multi-Payer Claims Database who were >20 years old and had a colonoscopy from 2007–2010. We used a self-controlled risk interval design and mixed-effects Poisson regression models to calculate risk ratios (RR) and 95% confidence intervals (CI) comparing the incidence of ER visits in the 1–4 weeks following colonoscopy (risk interval) to the incidence of ER visits in the 7–10 weeks after colonoscopy (control interval). We also conducted stratified analyses by patient characteristics, bowel preparation type, and medication.
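The self-controlled comparison described above can be sketched as a simple rate ratio of events in the post-colonoscopy risk window versus the later control window. This is a simplified stand-in for the mixed-effects Poisson regression the study actually used, and the event counts in the test are hypothetical, not the study's data.

```python
import math


def scri_rate_ratio(risk_events, control_events, risk_days=28, control_days=28):
    """Self-controlled risk-interval rate ratio with a Wald-type 95% CI.

    Each patient contributes event counts in a post-exposure risk window
    and a later control window of equal length (here, 4 weeks each); the
    ratio of event rates estimates the risk ratio (RR).
    """
    rr = (risk_events / risk_days) / (control_events / control_days)
    se = math.sqrt(1 / risk_events + 1 / control_events)  # SE of log RR
    lo, hi = (math.exp(math.log(rr) + s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi
```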

Results

There were 212,205 IBD patients with at least 1 colonoscopy from 2007–2010, and 3,699 had an ER visit during the risk and/or control interval. The risk of an ER visit was higher in the 4-week risk interval following colonoscopy compared to the control interval (RR = 1.24; 95% CI: 1.17–1.32). The effect was strongest in those <41 years old (RR = 1.60; 95% CI: 1.21–2.11), in women (RR = 1.32; 95% CI: 1.21–1.44), and in those with sodium phosphate bowel preparation (RR = 2.09; 95% CI: 1.02–4.29). Patients using immunomodulators had no increased risk of ER visits (RR = 0.75; 95% CI: 0.35–1.59).

Conclusions

Our results suggest that there is an increased risk of ER visits following colonoscopy among IBD patients, but that immunomodulators and mild bowel preparation agents may mitigate this risk.

]]>
<![CDATA[PathFX provides mechanistic insights into drug efficacy and safety for regulatory review and therapeutic development]]> https://www.researchpad.co/article/5c141eabd5eed0c484d27adc

Failure to demonstrate efficacy and safety issues are important reasons that drugs do not reach the market. An incomplete understanding of how drugs exert their effects hinders regulatory and pharmaceutical industry projections of a drug’s benefits and risks. Signaling pathways mediate drug response, and while many signaling molecules have been characterized for their contribution to disease or their role in drug side effects, our knowledge of these pathways is incomplete. To better understand all signaling molecules involved in drug response and the phenotype associations of these molecules, we created PathFX, a novel non-commercial method for identifying these pathways and drug-related phenotypes. We benchmarked PathFX by identifying drugs’ marketed disease indications and reported a sensitivity of 41%, a 2.7-fold improvement over similar approaches. We then used PathFX to strengthen signals for drug-adverse event pairs occurring in the FDA Adverse Event Reporting System (FAERS) and also identified opportunities for drug repurposing for new diseases based on interaction paths that associated a marketed drug with those diseases. By discovering molecular interaction pathways, PathFX improved our understanding of drug associations with safety and efficacy phenotypes. The algorithm may provide a new means to improve regulatory and therapeutic development decisions.

]]>
<![CDATA[Modulation of tactile perception by Virtual Reality distraction: The role of individual and VR-related factors]]> https://www.researchpad.co/article/5c0ed772d5eed0c484f14165

Background

Virtual reality (VR) has been shown to be an effective distraction method in health care. However, questions remain regarding individual and VR-related factors that may modulate the effect of VR.

Purpose

To explore the effect of VR distraction on tactile perception thresholds in healthy volunteers, in relation to personal characteristics and interactivity of VR applications.

Methods

A randomized three-way crossover study was conducted to investigate the effects of active and passive VR applications in 50 healthy participants. The main outcome measures were monofilament detection thresholds (MDT) and electrical detection thresholds (EDT). Personal characteristics (e.g., age, gender, susceptibility to immersion) and immersion in the VR conditions were analyzed for their effect on VR-induced threshold differences.

Results

The use of VR caused a significant increase in both MDT and EDT compared to the control condition (MDT: F (2, 76) = 20.174, p < 0.001; EDT F (2, 76) = 6.907, p = 0.002). Furthermore, a significant difference in favour of active VR compared to passive VR was found in MDT (p = 0.012), but not in EDT. No significant gender effect was found. There was a significant positive correlation between age and active VR distraction (r = 0.333, p = 0.018). Immersion in the VR world was positively correlated with the effect of VR, whereas visualization and daydreaming were negatively correlated with VR effects.

Conclusion

VR increased tactile perception thresholds, with active VR having the largest effect. Results indicate that the efficacy of VR may increase with increasing age. Gender did not affect VR susceptibility.

]]>