ResearchPad - albumins https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Dialysis timing may be deferred toward very late initiation: An observational study]]> https://www.researchpad.co/article/elastic_article_14499 The optimal timing to initiate dialysis among patients with an estimated glomerular filtration rate (eGFR) of <5 mL/min/1.73 m2 is unknown. We hypothesized that dialysis initiation can be deferred in this population even with a high uremic burden. A case-crossover study with case (0–30 days before dialysis initiation [DI]) and control (90–120 days before DI) periods was conducted in 1,079 hemodialysis patients aged 18–90 years at China Medical University Hospital between 2006 and 2015. The uremic burden was quantified by how many of 7 uremic indicators reached predefined thresholds in the case period, namely hemoglobin, serum albumin, blood urea nitrogen, serum creatinine, potassium, phosphorus, and bicarbonate. Dialysis timing was classified as standard (met 0–2 uremic indicators), late (3–5 indicators), and very late (6–7 indicators). Median eGFR-DI of the 1,079 patients was 3.4 mL/min/1.73 m2 and was 2.7 mL/min/1.73 m2 in patients with very late initiation. The median follow-up duration was 2.42 years. Antibiotics, diuretics, antihypertensive medications, and non-steroidal anti-inflammatory drugs (NSAIDs) were more prevalently used during the case period. The fully adjusted hazard ratios of all-cause mortality for the late and very late groups were 0.97 (95% confidence interval 0.76–1.24) and 0.83 (0.61–1.15) compared with the standard group. It is safe to defer dialysis initiation among patients with chronic kidney disease (CKD) having an eGFR of <5 mL/min/1.73 m2, even when patients have multiple biochemical uremic burdens. Coordinated efforts in acute infection prevention, optimal fluid management, and prevention of accidental exposure to NSAIDs are crucial to prolong dialysis-free survival.

]]>
<![CDATA[Placental transfer of Letermovir &amp; Maribavir in the <i>ex vivo</i> human cotyledon perfusion model. New perspectives for <i>in utero</i> treatment of congenital cytomegalovirus infection]]> https://www.researchpad.co/article/elastic_article_11236 Congenital cytomegalovirus infection can lead to severe sequelae. When fetal infection is confirmed, we hypothesize that fetal treatment could improve the outcome. Maternal oral administration of an effective drug crossing the placenta could allow fetal treatment. Letermovir (LMV) and Maribavir (MBV) are new CMV antivirals and potential candidates for fetal treatment.

Methods

The objective was to investigate the placental transfer of LMV and MBV in the ex vivo model of the human perfused cotyledon. Term placentas were perfused, in an open-circuit model, with LMV or MBV at concentrations in the range of clinical peak plasma concentrations. Concentrations were measured using ultraperformance liquid chromatography coupled with tandem mass spectrometry. The mean fetal transfer rate (FTR) (fetal (FC)/maternal concentration), clearance index (CLI), and accumulation index (AI) (retention of each drug in the cotyledon tissue) were measured. Mean FCs were compared with the half maximal effective concentrations of the drugs (EC50(LMV) and EC50(MBV)).

Results

For LMV, the mean FC (± standard deviation) was 1.1 ± 0.2 mg/L, 1,000-fold above the EC50(LMV). Mean FTR, CLI and AI were 9 ± 1%, 35 ± 6% and 4 ± 2%, respectively. For MBV, the mean FC was 1.4 ± 0.2 mg/L, 28-fold above the EC50(MBV). Mean FTR, CLI and AI were 10 ± 1%, 50 ± 7% and 2 ± 1%, respectively.

Conclusions

Drug concentrations on the fetal side should be in the range required for in utero treatment of fetuses infected with CMV, as the mean FC was above the EC50 for both molecules.
]]> <![CDATA[A modified arginine-depleting enzyme NEI-01 inhibits growth of pancreatic cancer cells]]> https://www.researchpad.co/article/elastic_article_11227 Arginine deprivation cancer therapy targets certain types of malignancies, with positive results in many studies and clinical trials. NEI-01 was designed as a novel arginine-depleting enzyme comprising an albumin binding domain capable of binding to human serum albumin to lengthen its half-life. In the present work, NEI-01 is shown to bind to serum albumin from various species, including mouse, rat, and human. A single intraperitoneal administration of NEI-01 to mice reduced plasma arginine to undetectable levels for at least 9 days. NEI-01 treatment specifically inhibited the viability of the MIA PaCa-2 and PANC-1 cancer cell lines, which are ASS1 negative. In a human pancreatic mouse xenograft model, NEI-01 treatment significantly reduced tumor volume and weight. Our data provide proof of principle for a cancer treatment strategy using NEI-01.

]]>
<![CDATA[Intra-individual variation of particles in exhaled air and of the contents of Surfactant protein A and albumin]]> https://www.researchpad.co/article/N3daed577-6f93-4f19-9dc8-54ce3f8d7d6e

Introduction

Particles in exhaled air (PEx) provide samples of respiratory tract lining fluid from small airways containing, for example, Surfactant protein A (SP-A) and albumin, potential biomarkers of small airway disease. We hypothesized that there are differences between morning, noon, and afternoon measurements and that the variability of repeated measurements is larger between days than within days.

Methods

PEx was obtained in sixteen healthy non-smoking adults on 11 occasions, within one day and between days. SP-A and albumin were quantified by ELISA. The coefficient of repeatability (CR), intraclass correlation coefficient (ICC), and coefficient of variation (CV) were used to assess the variation of repeated measurements.

Results

SP-A and albumin increased significantly from morning towards noon and afternoon by 13% and 25% on average, respectively, whereas PEx number concentration and particle mean mass did not differ significantly between morning, noon, and afternoon. Between-day CRs were not larger than within-day CRs.

Conclusions

Time of day influences the contents of SP-A and albumin in exhaled particles. The variation of repeated measurements was rather high but was not influenced by the time interval between measurements.

]]>
<![CDATA[An antibody-free sample pretreatment method for osteopontin combined with MALDI-TOF MS/MS analysis]]> https://www.researchpad.co/article/5c990284d5eed0c484b9802d

Osteopontin is an osteoblast-secreted protein with an aspartic acid-rich, highly phosphorylated, and glycosylated structure. Osteopontin binds readily to integrins, tumor cells, extracellular matrix, and calcium, and is related to bone diseases, various cancers, and inflammation. Here, DEAE-Cibacron blue 3GA was used to extract recombinant osteopontin from human plasma and to deplete abundant plasma proteins with an antibody-free method. Using selected buffer systems, osteopontin and human serum albumin could be bound to DEAE-Cibacron blue 3GA, while immunoglobulin G was excluded. The bound osteopontin could then be separated from albumin by using different sequential elution buffers. By this method, 1 μg/mL recombinant osteopontin could be separated from the major part of the most abundant proteins in human plasma. After trypsin digestion, the extracted osteopontin could be successfully detected and identified by MALDI-TOF MS/MS using the m/z 1854.898 peptide and its fragments.

]]>
<![CDATA[Determinants of corrosion resistance of Ti-6Al-4V alloy dental implants in an In Vitro model of peri-implant inflammation]]> https://www.researchpad.co/article/5c5ca287d5eed0c48441e57d

Background

Titanium (Ti) and its alloys possess high biocompatibility and corrosion resistance due to the ability of Ti to form a passive oxide film, i.e. TiO2, immediately upon contact with oxygen. This passive layer is considered stable during function in the oral cavity; however, emerging evidence associates inflammatory peri-implantitis with vast increases in Ti corrosion products around diseased implants as compared to healthy ones. Thus, it is imperative to identify which factors in the peri-implant micro-environment may reduce Ti corrosion resistance.

Methods

The aim of this work was to simulate peri-implant inflammatory conditions in vitro to determine which factors affect the corrosion susceptibility of Ti-6Al-4V dental implants. The effects of hydrogen peroxide (a surrogate for the reactive oxygen species, ROS, found during inflammation), albumin (a protein typical of physiological fluids), and deaeration (to simulate the reduced pO2 during inflammation) were investigated in an acidic environment (pH 3), which is typical of inflammatory conditions. The corrosion resistance of clinically relevant acid-etched Ti-6Al-4V surfaces was investigated by electrochemical techniques: Open Circuit Potential, Electrochemical Impedance Spectroscopy, and Anodic Polarization.

Results

Electrochemical tests confirmed that the most aggressive conditions for the Ti-6Al-4V alloy were those typical of occluded cells, i.e. oxidizing conditions (H2O2) in the presence of protein, with deaeration of the physiological medium.

Conclusions

Our results provide evidence that titanium’s corrosion resistance can be reduced by intense inflammatory conditions. This observation indicates that the micro-environment to which the implant is exposed during peri-implant inflammation is highly aggressive and may lead to TiO2 passive layer attack. Further investigation of the effect of these aggressive conditions on titanium dissolution is warranted.

]]>
<![CDATA[Decreased total iron binding capacity upon intensive care unit admission predicts red blood cell transfusion in critically ill patients]]> https://www.researchpad.co/article/5c521820d5eed0c484797475

Introduction

Red blood cell (RBC) transfusion is associated with poor clinical outcomes in critically ill patients. We investigated the predictive value of biomarkers at intensive care unit (ICU) admission for RBC transfusion within 28 days.

Methods

Critically ill patients (n = 175) admitted to our ICU with organ dysfunction and an expected stay of ≥ 48 hours, without hemorrhage, were prospectively studied (derivation cohort, n = 121; validation cohort, n = 54). Serum levels of 12 biomarkers (hemoglobin, creatinine, albumin, interleukin-6 [IL-6], erythropoietin, Fe, total iron binding capacity [TIBC], transferrin, ferritin, transferrin saturation, folate, and vitamin B12) were measured upon ICU admission and on days 7, 14, 21, and 28.

Results

Among the 12 biomarkers measured upon ICU admission, levels of hemoglobin, albumin, IL-6, TIBC, transferrin, and ferritin differed significantly between the transfusion and non-transfusion groups. Of these 6 biomarkers, TIBC upon ICU admission had the highest area under the curve value (0.835 [95% confidence interval 0.765–0.906]) for predicting RBC transfusion (cut-off value = 234.5 μg/dL; sensitivity = 0.906, specificity = 0.632). This result was confirmed in the validation cohort, in which sensitivity and specificity were 0.888 and 0.694, respectively. Measurement of these biomarkers every seven days revealed that albumin, TIBC, and transferrin remained statistically different between groups throughout hospitalization until 28 days. In the validation cohort, patients in the transfusion group had significantly higher serum hepcidin levels than those in the non-transfusion group (P = 0.004). In addition, joint analysis across the derivation and validation cohorts revealed that serum IL-6 levels were higher in the transfusion group (P = 0.0014).
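As a rough illustration of how a cut-off like the reported 234.5 μg/dL yields a sensitivity and specificity, the sketch below (a hypothetical helper, not code from the study) counts a transfusion as "predicted" when admission TIBC falls at or below the cut-off and derives both metrics from the resulting confusion counts:

```python
def sens_spec_at_cutoff(tibc_values, transfused, cutoff=234.5):
    """Sensitivity/specificity for predicting RBC transfusion when TIBC at
    ICU admission is at or below the cutoff (lower TIBC -> transfusion
    predicted). `tibc_values` are in ug/dL; `transfused` is a parallel
    sequence of booleans. Illustrative only; not taken from the paper."""
    tp = fn = tn = fp = 0
    for tibc, was_transfused in zip(tibc_values, transfused):
        predicted = tibc <= cutoff
        if was_transfused:
            tp += predicted       # correctly flagged
            fn += not predicted   # missed
        else:
            fp += predicted       # false alarm
            tn += not predicted   # correctly cleared
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

With the study's reported sensitivity of 0.906 and specificity of 0.632, roughly 9 in 10 transfused patients would fall at or below the cut-off, while about a third of non-transfused patients would also be flagged.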

Conclusion

Decreased TIBC upon ICU admission has high predictive value for RBC transfusion unrelated to hemorrhage within 28 days.

]]>
<![CDATA[Evaluation of Nutrition Risk Screening Score 2002 (NRS) assessment in hospitalized chronic kidney disease patient]]> https://www.researchpad.co/article/5c536bd8d5eed0c484a49346

Background

Although chronic kidney disease (CKD) patients are particularly prone to malnutrition, systematic nutritional screening is rarely routinely performed during hospitalization. The primary aim of this study was to determine the prevalence of malnutrition (as captured by the nutritional screening score NRS) in hospitalized CKD patients and explore the impact of malnutrition on hospital mortality.

Methods

All patients admitted to the tertiary nephrology department of the University Hospital of Bern Inselspital over a period of 12 months were included in this observational study. The risk for malnutrition was assessed within 24 h of admission by the NRS. Demographic, clinical, and outcome data were extracted from the patient database. The primary outcome was in-hospital mortality. The secondary outcomes were length of hospitalization and hospitalization costs. Multilevel mixed-effect logistic regression analysis was performed to determine the association between in-hospital mortality and risk of malnutrition (NRS score ≥3).

Results

We included 696 eligible hospitalizations of 489 CKD patients. Hospitalized patients had a median age of 64 years (interquartile range (IQR), 52–72), and 35.6% were at risk of malnutrition (NRS≥3). After adjustment for the identified confounders (case weight, Barthel index, and CKD stage), multivariate analysis confirmed an independent and significant association between NRS≥3 and higher in-hospital mortality [OR 2.92 (95% CI: 1.33–6.39), P<0.001]. Furthermore, in multivariate analysis the risk of malnutrition was associated with longer length of hospitalization [geometric mean ratio: 1.8 (95% CI: 1.5–2.0), p<0.001] and with increased hospitalization costs [geometric mean ratio: 1.7 (95% CI: 1.5–1.9), p<0.001].

Conclusions

Malnutrition, as captured by NRS≥3, is highly prevalent among hospitalized CKD patients and is associated with prolonged hospital stay and increased in-hospital mortality.

]]>
<![CDATA[Prevalence and treatment of gout among patients with chronic kidney disease in the Irish health system: A national study]]> https://www.researchpad.co/article/5c644870d5eed0c484c2e6d4

Background

Gout is a common inflammatory arthritis associated with adverse clinical outcomes, and undertreatment is common in the general population. The aim of this study was to determine the prevalence of gout and its treatment among patients with chronic kidney disease (CKD).

Methods

We conducted a multi-centre cross-sectional study of patients (n = 522) who attended specialist nephrology clinics in Ireland. A standardized data collection tool recorded clinical characteristics and medication use at clinic visits, and kidney function was assessed with standardised creatinine measurements and estimated glomerular filtration rate (eGFR). The prevalence of gout and the corresponding use of urate-lowering therapies (ULT) were determined. Multivariate logistic regression explored correlates of gout, expressed as odds ratios (OR) with 95% confidence intervals (CI), adjusting for demographic and clinical characteristics.

Results

The overall prevalence of gout was 16.6% and increased significantly from 7.5% in stage 1–2 CKD to 22.8% in stage 4–5 CKD, P < 0.005. Prevalence increased with age (P < 0.005) and was higher in men than women (19.1% versus 10.3%, P < 0.005). Overall, 67.9% of gout patients with CKD were treated with ULT, and the percentage increased with advancing stage of CKD, from 55.6% in stage 1–2 to 77.4% in stage 4–5, P < 0.005. Multivariable modelling identified male sex (vs female), OR 1.95 (0.95–4.03); lower serum albumin, OR 1.09 (1.02–1.16) per 1 g/L lower; poorer kidney function, OR 1.11 (1.01–1.22) per 5 mL/min/1.73 m2 lower; and rising parathyroid hormone levels, OR 1.38 (1.08–1.77) per 50 pg/mL higher, as disease correlates.

Conclusions

Gout is common in CKD in the Irish health system and increases with worsening kidney function. Over two thirds of patients with gout were receiving ULT, rising to 77% of patients with advanced CKD. Greater awareness of gout in CKD is needed, and the effectiveness of treatment strategies should be vigorously monitored to improve patient outcomes.

]]>
<![CDATA[The association of the difference in hemoglobin levels before and after hemodialysis with the risk of 1-year mortality in patients undergoing hemodialysis. Results from a nationwide cohort study of the Japanese Renal Data Registry]]> https://www.researchpad.co/article/5c40f767d5eed0c4843860a0

Background

Few clinical studies have directly examined the associations of hemoglobin (Hb) levels after hemodialysis (HD) and of the difference in Hb levels before and after HD (ΔHb) with patient outcomes. The present study aimed to determine ΔHb and post-HD Hb levels with nationwide data and to examine their associations with all-cause mortality in patients undergoing HD.

Methods

This study is based on data from 2008 and 2009 recorded in the Japanese Renal Data Registry. The study endpoint was all-cause mortality within 1 year. The ΔHb and post-HD Hb level were analysed as categorical variables using Cox regression for 1-year mortality, adjusting for potential confounders.

Results

The median ΔHb was 1.0 g/dl, and the median post-HD Hb level was 11.3 g/dl. The median pre-HD Hb level was 10.4 g/dl. The risk of mortality was lower with a ΔHb of 0 to 1.0 g/dl (adjusted hazard ratio [aHR], 0.90; 95% confidence interval [CI], 0.70–1.01) or > 1.0 g/dl (aHR, 0.73; 95% CI, 0.64–0.84) than with a ΔHb < 0 g/dl. The risk for mortality was also lower with a post-HD Hb of 10 to 11 g/dl (aHR, 0.82; 95% CI, 0.73–0.92), 11 to 12 g/dl (aHR, 0.77; 95% CI, 0.68–0.87), or > 12 g/dl (aHR, 0.77; 95% CI, 0.68–0.87) than with a post-HD Hb < 10 g/dl.

Conclusions

Both a low ΔHb and a low post-HD Hb level were associated with a higher risk of 1-year mortality.

]]>
<![CDATA[Purification of human butyrylcholinesterase from frozen Cohn fraction IV-4 by ion exchange and Hupresin affinity chromatography]]> https://www.researchpad.co/article/5c3fa5afd5eed0c484ca75ae

Human butyrylcholinesterase (HuBChE) is being developed as a therapeutic for protection from the toxicity of nerve agents. An enriched source of HuBChE is Cohn fraction IV-4 from pooled human plasma. For the past 40 years, purification of HuBChE has included affinity chromatography on procainamide-Sepharose. The present report supports a new affinity sorbent, Hupresin, for purification of HuBChE from Cohn fraction IV-4. Nine batches of 70–80 kg frozen Cohn fraction were extracted with water, filtered, and chromatographed on 30 L of Q-Ceramic ion exchange sorbent at pH 4.5. The 4% pure Q-eluent was pumped onto 4.2 L Hupresin, where contaminants were washed off with 0.3 M NaCl in 20 mM sodium phosphate pH 8.0, before 99% pure HuBChE was eluted with 0.1 M tetramethylammonium bromide. The average yield was 1.5 g of HuBChE from 80 kg Cohn paste. Recovery of HuBChE was reduced by 90% when the paste was stored at -20°C for 1 year, and by 100% when stored at 4°C for 24 h. No reduction in HuBChE recovery occurred when paste was stored at -80°C for 3 months or 3 years. Hupresin and procainamide-Sepharose were equally effective at purifying HuBChE from Cohn fraction. HuBChE in Cohn fraction required 1000-fold purification to attain 99% purity, but 15,000-fold purification when the starting material was plasma. HuBChE (P06276) purified from Cohn fraction was a 340 kDa tetramer of 4 identical N-glycated subunits, stable for years in solution or as a lyophilized product.

]]>
<![CDATA[Bioprinted liver provides early insight into the role of Kupffer cells in TGF-β1 and methotrexate-induced fibrogenesis]]> https://www.researchpad.co/article/5c3667efd5eed0c4841a69e6

Hepatic fibrosis develops from a series of complex interactions among resident and recruited cells making it a challenge to replicate using standard in vitro approaches. While studies have demonstrated the importance of macrophages in fibrogenesis, the role of Kupffer cells (KCs) in modulating the initial response remains elusive. Previous work demonstrated utility of 3D bioprinted liver to recapitulate basic fibrogenic features following treatment with fibrosis-associated agents. In the present study, culture conditions were modified to recapitulate a gradual accumulation of collagen within the tissues over an extended exposure timeframe. Under these conditions, KCs were added to the model to examine their impact on the injury/fibrogenic response following cytokine and drug stimuli. A 28-day exposure to 10 ng/mL TGF-β1 and 0.209 μM methotrexate (MTX) resulted in sustained LDH release which was attenuated when KCs were incorporated in the model. Assessment of miR-122 confirmed early hepatocyte injury in response to TGF-β1 that appeared delayed in the presence of KCs, whereas MTX-induced increases in miR-122 were observed when KCs were incorporated in the model. Although the collagen responses were mild under the conditions tested to mimic early fibrotic injury, a global reduction in cytokines was observed in the KC-modified tissue model following treatment. Furthermore, gene expression profiling suggests KCs have a significant impact on baseline tissue function over time and an important modulatory role dependent on the context of injury. Although the number of differentially expressed genes across treatments was comparable, pathway enrichment suggests distinct, KC- and time-dependent changes in the transcriptome for each agent. As such, the incorporation of KCs and impact on baseline tissue homeostasis may be important in recapitulating temporal dynamics of the fibrogenic response to different agents.

]]>
<![CDATA[Nutritional risk index as a predictor of mortality in acutely decompensated heart failure]]> https://www.researchpad.co/article/5c1d5b9ed5eed0c4846ec2b2

Background

We investigated the role of nutritional risk index (NRI) in predicting 1-year mortality in patients with acute decompensated heart failure (ADHF).

Methods

Among the 5,625 cohort patients enrolled in the Korean Acute Heart Failure (KorAHF) Registry, a total of 5,265 patients for whom the NRI could be calculated [NRI = (1.519 x serum albumin [g/dl]) + (41.7 x weight [kg]/ideal body weight [kg])] were enrolled. The patients were divided into 4 groups according to NRI quartile: Q1 <89 (n = 1121, 69.9 ± 14.5 years, 632 males), Q2 89–95 (n = 1234, 69.7 ± 14.4 years, 677 males), Q3 95–100 (n = 1199, 68.8 ± 14.0 years, 849 males), and Q4 >100 (n = 1711, 65.6 ± 14.5 years, 779 males). The primary end-point was all-cause mortality at 1-year clinical follow-up.
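The NRI formula and quartile cut-offs above translate directly into code. The sketch below uses the formula and cut-offs exactly as stated in the abstract (with albumin in the units given there); the handling of values falling exactly on a boundary is an assumption, as the abstract does not specify it:

```python
def nutritional_risk_index(serum_albumin, weight_kg, ideal_weight_kg):
    """NRI as defined in the abstract:
    NRI = (1.519 x serum albumin) + (41.7 x weight / ideal body weight).
    Albumin units are taken as printed in the abstract."""
    return 1.519 * serum_albumin + 41.7 * (weight_kg / ideal_weight_kg)

def nri_quartile(nri):
    """KorAHF quartile cut-offs: Q1 <89, Q2 89-95, Q3 95-100, Q4 >100.
    Boundary handling (e.g. exactly 95) is an assumption."""
    if nri < 89:
        return "Q1"
    elif nri < 95:
        return "Q2"
    elif nri <= 100:
        return "Q3"
    return "Q4"
```

In the study design, a lower quartile (e.g. `nri_quartile(85.0)` returning "Q1") marks the group with the poorest nutritional status.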

Results

The 1-year mortality increased significantly as the NRI quartile decreased, and the lowest NRI quartile was associated with the highest 1-year mortality (Q1: 27.5% vs. Q2: 20.9% vs. Q3: 12.9% vs. Q4: 8.7%, linear p <0.001). On Kaplan-Meier survival analysis, significant inter-quartile differences were observed (p <0.001 for all). In multivariate analysis using Cox proportional hazards regression, the lowest NRI quartile was an independent predictor of 1-year mortality in patients with ADHF.

Conclusions

Poor nutritional status, as assessed by NRI and its quartile grading, was associated with 1-year mortality in Korean patients with ADHF. The assessment of nutritional status by NRI may provide additional prognostic information and thus be useful in the risk stratification of patients with ADHF.

]]>
<![CDATA[The predictive value of quantitative nucleic acid amplification detection of Clostridium difficile toxin gene for faecal sample toxin status and patient outcome]]> https://www.researchpad.co/article/5c117bd0d5eed0c48469a858

Background

Laboratory diagnosis of Clostridium difficile infection (CDI) remains unsettled, despite updated guidelines. We investigated the potential utility of quantitative data from a nucleic acid amplification test (NAAT) for C. difficile toxin gene (tg) for patient management.

Methods

Using data from the largest C. difficile diagnostic study to date (8853 diarrhoeal samples from 7335 patients), we determined the predictive value of a low C. difficile tgNAAT (Cepheid Xpert C.diff) cycle threshold (CT) value for patient toxin-positive status, CDI severity, mortality, and CDI recurrence. Reference methods for CDI diagnosis were the cytotoxicity assay (CTA) and cytotoxigenic culture (CTC).

Results

Of 1281 tgNAAT-positive faecal samples, 713 and 917 were CTA and CTC positive, respectively. The median tgNAAT CT was 25.5 for patients who died vs 27.5 for survivors (p = 0.021); 24.9 for toxin-positive vs 31.6 for toxin-negative samples (p<0.001); and 25.6 for patients with a recurrence episode vs 27.3 for those without (p = 0.111). Following optimal cut-off determination, low CT was defined as ≤25 and was significantly associated with a toxin-positive result (P<0.001, positive predictive value 83.9%), presence of PCR-ribotype 027 (P = 0.025), and mortality (P = 0.032). Recurrence was not associated with low CT (p = 0.111).

Conclusions

A low tgNAAT CT could indicate patients who are CTA positive, have more severe infection, and are at increased risk of mortality and possibly recurrence. Although the limited specificity of tgNAAT means it cannot be used as a standalone test, it could support a more timely diagnosis and optimise the management of these at-risk patients.

]]>
<![CDATA[Prognostic value of platelet count and lymphocyte to monocyte ratio combination in stage IV non-small cell lung cancer with malignant pleural effusion]]> https://www.researchpad.co/article/5b60039f463d7e38dd0d05b2

Introduction

A combination of platelet count and lymphocyte-to-monocyte ratio (LMR) (abbreviated as COP-LMR) has recently been evaluated as a systemic inflammatory marker for prognostication in lung cancer. While a previous study evaluated the prognostic value of COP-LMR in NSCLC patients who underwent curative resection, the combination of these two markers has not yet been evaluated in advanced NSCLC.

Objectives

In this study, we evaluated the prognostic value of COP-LMR in stage IV NSCLC patients with malignant pleural effusion (MPE) under active anticancer treatment.

Methods

Between January 2012 and July 2016, 217 patients with stage IV NSCLC and MPE undergoing active anticancer treatment were selected for evaluation. Patients with both a low LMR (< 2.47) and an elevated platelet count (> 30.0 × 104 mm-3) were assigned to COP-LMR group 2; patients with one of the two parameters were assigned to group 1; and patients with neither were assigned to group 0.
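The grouping rule above is simply a count of how many of the two risk parameters are present, which can be sketched as follows (an illustrative one-liner using the abstract's cut-offs, not code from the study):

```python
def cop_lmr_group(lmr, platelet_1e4_per_mm3):
    """COP-LMR group per the abstract's cut-offs: one point each for a
    low LMR (< 2.47) and an elevated platelet count (> 30.0 x 10^4 mm^-3),
    giving group 0, 1, or 2."""
    return int(lmr < 2.47) + int(platelet_1e4_per_mm3 > 30.0)
```

A patient with LMR 2.0 and platelets 35.0 × 10^4 mm^-3 would thus fall in group 2, the category the study associates with the shortest survival.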

Results

Median overall survival (OS) (P < 0.001), progression-free survival (PFS) (P < 0.001), and histological features (P = 0.003) differed significantly among the COP-LMR groups. For COP-LMR groups 0, 1, and 2, median survival times were 35.9, 14.7, and 7.4 months, respectively, while median progression-free times were 19.2, 13.3, and 7.4 months, respectively. Older age, male sex, low albumin, high CRP, and high COP-LMR (0 vs 1, P = 0.021, hazard ratio (HR): 1.822, 95% confidence interval (CI): 1.096–3.027; 0 vs 2, P = 0.003, HR: 2.464, 95% CI: 1.373–4.421) were independent predictors of shorter OS. Age, sex, histology, albumin, and CRP had no significant influence on PFS. High COP-LMR was a significant predictor of shorter PFS (0 vs 1, P = 0.116; 0 vs 2, P = 0.007, HR: 1.902, 95% CI: 1.194–3.028).

Conclusions

A combination of pretreatment LMR and platelet levels can be used to predict short survival in stage IV NSCLC patients undergoing active anticancer treatment.

]]>
<![CDATA[Individual patient variability with the application of the kidney failure risk equation in advanced chronic kidney disease]]> https://www.researchpad.co/article/5b297b81463d7e06147f8d71

The Kidney Failure Risk Equation (KFRE) predicts the need for dialysis or transplantation using age, sex, estimated glomerular filtration rate (eGFR), and urine albumin to creatinine ratio (ACR). The eGFR and ACR have known biological and analytical variability. We examined the effect of biological and analytical variability of eGFR and ACR on the 2-year KFRE predicted kidney failure probabilities using a single measure and the average of repeat measures of simulated eGFR and ACR. Previously reported values of the coefficient of variation (CV) for ACR and eGFR were used to calculate day-to-day variability. Variation was also examined with outpatient laboratory data from patients with an eGFR between 15 and 50 mL/min/1.73 m2. A web application was developed to calculate and model day-to-day variation in risk. The biological and analytical variability related to ACR and eGFR leads to variation in the predicted probability of kidney failure. A male patient aged 50, with ACR 30 mg/mmol and eGFR 25, had a day-to-day variation in risk of 7% (KFRE point estimate: 17%, variability range 14% to 21%). The addition of inter-laboratory variation due to different instrumentation increased the variability to 9% (KFRE point estimate 17%, variability range 13% to 22%). Averaging repeated measures of eGFR and ACR significantly decreased the variability (KFRE point estimate 17%, variability range 15% to 19%). These findings were consistent with the outpatient laboratory data, which showed that most patients had a KFRE 2-year risk variability of ≤ 5% (79% of patients). Approximately 13% of patients had variability of 5–10% and 8% had variability > 10%. The mean age (SD) of this cohort was 64 (15) years, 36% were female, the mean (SD) eGFR was 32 (10) ml/min/1.73m2, and the median (IQR) ACR was 22.7 (110). Biological and analytical variation intrinsic to the eGFR and ACR may lead to a substantial degree of variability that decreases with repeat measures.
Use of a web application may help physicians and patients understand an individual patient's risk variability and communicate risk (https://mccudden.shinyapps.io/kfre_app/). The web application allows the user to alter age, gender, eGFR, ACR, and CV (for both eGFR and ACR), as well as the units of measurement for ACR (g/mol versus mg/g).
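The KFRE coefficients themselves are not given in the abstract, so the sketch below simulates only the CV-driven measurement variability of a single input (eGFR, with an assumed true value of 25 and an assumed 10% CV); the point it illustrates is the abstract's finding that averaging repeat measures shrinks the spread, roughly by the square root of the number of repeats, and that reduction carries through to any risk equation computed from the measurements:

```python
import random
import statistics

def spread_of_averaged_measures(true_value, cv, n_repeats,
                                n_trials=10000, seed=42):
    """Simulate day-to-day variability: each observed value is drawn from a
    normal distribution with SD = CV * true_value. Returns the standard
    deviation, across `n_trials` simulated patients, of the mean of
    `n_repeats` measurements. Illustrative only; the KFRE itself is not
    reproduced here."""
    rng = random.Random(seed)
    sd = cv * true_value
    means = [
        statistics.fmean(rng.gauss(true_value, sd) for _ in range(n_repeats))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)

# Assumed inputs: true eGFR 25 mL/min/1.73 m2, 10% day-to-day CV.
single = spread_of_averaged_measures(25.0, 0.10, n_repeats=1)
triple = spread_of_averaged_measures(25.0, 0.10, n_repeats=3)
```

Here `triple` comes out close to `single / sqrt(3)`, mirroring the narrowing of the variability range (14–21% down to 15–19%) the study observed when repeat measures were averaged.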

]]>
<![CDATA[Association between Hypoalbuminaemia and Mortality in Patients with Community-Acquired Bacteraemia Is Primarily Related to Acute Disorders]]> https://www.researchpad.co/article/5989da6aab0ee8fa60b92c29

We sought to investigate whether hypoalbuminaemia was mainly caused by acute or chronic factors in patients with community-acquired bacteraemia. In this population-based study, we considered 1844 adult cases of community-acquired bacteraemia that occurred in Funen, Denmark between 2000 and 2008. We used a stepwise prognostic predisposition-insult-response-organ dysfunction (PIRO) logistic regression model, initially including age and comorbidity, then adding bacterial species, and finally sepsis severity. The models were furthermore analysed using receiver operating characteristic (ROC) curves. Outcomes comprised mortality incidence on days 0–30 and 31–365 after the bacteraemia episode. Each step was performed with and without the baseline albumin level measured on the date of bacteraemia. In 422 patients, the latest albumin measurement taken 8–30 days before the date of bacteraemia was also used in the analysis together with the baseline albumin level. For each decrease of 1 g/L in plasma albumin level, the odds ratios (95% confidence intervals) of mortality in the period of 0–30 days after bacteraemia were 0.86 (0.84–0.88) in both the predisposition (P) and predisposition-insult (PI) models and 0.87 (0.85–0.89) in the full PIRO model. The AUC values for mortality in the period of 0–30 days in the model comprising only predisposition factors were 0.78 and 0.66 with and without albumin level added as a factor, respectively. The AUC values in the full PIRO model were 0.81 and 0.73 with and without consideration of albumin levels, respectively. A higher proportion of patients died within 30 days if the albumin level decreased between the measurement taken 8–30 days before bacteraemia and the bacteraemia date. A single plasma albumin measurement on the bacteraemia date was a better prognostic predictor of short-term mortality than the sepsis severity score.

]]>
<![CDATA[Patterns of intravenous fluid resuscitation use in adult intensive care patients between 2007 and 2014: An international cross-sectional study]]> https://www.researchpad.co/article/5989db5aab0ee8fa60bdf78c

Background

In 2007, the Saline versus Albumin Fluid Evaluation—Translation of Research Into Practice Study (SAFE-TRIPS) reported that 0.9% sodium chloride (saline) and hydroxyethyl starch (HES) were the most commonly used resuscitation fluids in intensive care unit (ICU) patients. Evidence has emerged since 2007 that these fluids are associated with adverse patient-centred outcomes. Given the evidence published since 2007, we sought to determine the types of resuscitation fluid currently used in clinical practice and the predictors of fluid choice, and whether these have changed between 2007 and 2014.

Methods

In 2014, an international, cross-sectional study was conducted (Fluid-TRIPS) to document current patterns of intravenous resuscitation fluid use and determine factors associated with fluid choice. We examined univariate and multivariate associations between patient and prescriber characteristics, geographical region and fluid type. Additionally, we report secular trends of resuscitation fluid use in a cohort of ICUs that participated in both the 2007 and 2014 studies. Regression analyses were conducted to determine changes in the administration of crystalloids or colloids between 2007 and 2014.

Findings

In 2014, a total of 426 ICUs in 27 countries participated. Over the 24-hour study day, 1456/6707 (21.7%) patients received resuscitation fluid during 2716 resuscitation episodes. Crystalloids were administered to 1227/1456 (84.3%) patients during 2208/2716 (81.3%) episodes and colloids to 394/1456 (27.1%) patients during 581/2716 (21.4%) episodes. In multivariate analyses, practice varied significantly between geographical regions. Additionally, patients with a traumatic brain injury were less likely to receive colloid than patients with no trauma (adjusted OR 0.24; 95% CI 0.10 to 0.62; p = 0.003). Patients in the ICU for one or more days were more likely to receive colloid than patients in the ICU on their admission date (adjusted OR 1.75; 95% CI 1.27 to 2.41; p < 0.001).

For secular trends in fluid resuscitation, 84 ICUs in 17 countries contributed data. In 2007, 527/1663 (31.7%) patients received fluid resuscitation during 1167 episodes, compared to 491/1763 (27.9%) patients during 960 episodes in 2014. The use of crystalloids increased from 498/1167 (42.7%) in 2007 to 694/960 (72.3%) in 2014 (odds ratio (OR) 3.75, 95% confidence interval (CI) 2.95 to 4.77; p < 0.001), primarily due to a significant increase in the use of buffered salt solutions. The use of colloids decreased from 724/1167 (62.0%) in 2007 to 297/960 (30.9%) in 2014 (OR 0.29, 95% CI 0.19 to 0.43; p < 0.001), primarily due to a decrease in the use of HES, despite an overall increase in the use of albumin.

Conclusions

Clinical practices of intravenous fluid resuscitation have changed between 2007 and 2014. Geographical location remains a strong predictor of the type of fluid administered for fluid resuscitation. Overall, there is a preferential use of crystalloids, specifically buffered salt solutions, over colloids. There is now an imperative to conduct a trial determining the safety and efficacy of these fluids on patient-centred outcomes.

Trial registration

Clinicaltrials.gov: Fluid-Translation of research into practice study (Fluid-TRIPS) NCT02002013

]]>
<![CDATA[Improved Pancreatic Adenocarcinoma Diagnosis in Jaundiced and Non-Jaundiced Pancreatic Adenocarcinoma Patients through the Combination of Routine Clinical Markers Associated to Pancreatic Adenocarcinoma Pathophysiology]]> https://www.researchpad.co/article/5989dad8ab0ee8fa60bb8b64

Background

There is still no reliable biomarker for the diagnosis of pancreatic adenocarcinoma. Carbohydrate antigen 19-9 (CA 19-9) is a tumor marker recommended only for pancreatic adenocarcinoma follow-up. One of the clinical problems lies in distinguishing between this cancer and other benign pancreatic diseases such as chronic pancreatitis. In this study, we assessed the value of panels of serum molecules related to pancreatic cancer pathophysiology to determine whether, alone or in combination, they could help discriminate between these two pathologies.

Methods

CA 19-9, carcinoembryonic antigen (CEA), C-reactive protein, albumin, insulin-like growth factor-1 (IGF-1) and IGF binding protein-3 were measured using routine clinical analyzers in a cohort of 47 pancreatic adenocarcinoma patients, 20 chronic pancreatitis patients and 15 healthy controls.

Results

The combination of CA 19-9, IGF-1 and albumin resulted in a combined area under the curve (AUC) of 0.959, with 93.6% sensitivity and 95% specificity, much higher than CA 19-9 alone. An algorithm was defined to classify patients as having chronic pancreatitis or pancreatic cancer with the above specificity and sensitivity. In an independent validation group of 20 pancreatic adenocarcinoma and 13 chronic pancreatitis patients, the combination of the four molecules correctly classified all pancreatic adenocarcinoma patients and 12 of 13 chronic pancreatitis patients.

Conclusions

Although this panel of markers should be validated in larger cohorts, the high sensitivity and specificity values and the convenience of measuring these parameters in clinical laboratories show great promise for improving pancreatic adenocarcinoma diagnosis.

]]>
<![CDATA[Favipiravir pharmacokinetics in Ebola-Infected patients of the JIKI trial reveals concentrations lower than targeted]]> https://www.researchpad.co/article/5989db53ab0ee8fa60bdcf07

Background

In 2014–2015, we assessed favipiravir tolerance and efficacy in patients with Ebola virus (EBOV) disease (EVD) in Guinea (JIKI trial). Because the drug had never been used before for this indication, and because high concentrations of the drug were needed to achieve antiviral efficacy against EBOV, a pharmacokinetic model had been used to propose a relevant dosing regimen. Here we report the favipiravir plasma concentrations achieved in participants in the JIKI trial and put them in perspective with the model-based target concentrations.

Methods and findings

Pre-dose drug concentrations were collected at Day 2 and Day 4 of treatment in 66 patients of the JIKI trial and compared to those predicted by the model, taking into account each patient's individual characteristics. At Day 2, the observed concentrations were slightly lower than the model predictions adjusted for patient characteristics (median values of 46.1 versus 54.3 μg/mL for observed and predicted concentrations, respectively; p = 0.012). However, the concentrations dropped at Day 4, which was not anticipated by the model (median values of 25.9 and 64.4 μg/mL for observed and predicted concentrations, respectively; p < 10⁻⁶). There was no significant relationship between favipiravir concentrations and EBOV viral kinetics or mortality.

Conclusions

Favipiravir plasma concentrations in the JIKI trial failed to achieve the target exposure defined before the trial. Furthermore, the drug concentration dropped unexpectedly between Day 2 and Day 4. This drop could be due to severe sepsis conditions and/or to intrinsic properties of favipiravir metabolism. Dose-ranging studies should be performed in healthy volunteers to assess the concentrations and the tolerance that could be achieved with high doses.

Trial registration

ClinicalTrials.gov NCT02329054

]]>