ResearchPad - medical-dialysis https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Prevalence of anti-hepatitis E virus IgG antibodies in sera from hemodialysis patients in Tripoli, Lebanon]]> https://www.researchpad.co/article/elastic_article_15713 Hepatitis E virus (HEV) is an important global public health concern. Several studies have reported a higher HEV prevalence in patients undergoing regular hemodialysis (HD). In Lebanon, the epidemiology of HEV among HD patients has never been investigated. In this study, we examined the seroprevalence of HEV infection among 171 HD patients recruited from three hospital dialysis units in Tripoli, North Lebanon. The prevalence of anti-HEV IgG antibodies in participants' sera was evaluated using a commercial enzyme-linked immunosorbent assay (ELISA). The association of socio-demographic and clinical parameters with HEV infection was also evaluated. Overall, 96 women and 75 men were enrolled. Anti-HEV IgG antibodies were detected in 37 of 171 HD patients, a positivity rate of 21.63%. Among all examined variables, only patient age was significantly associated with seropositivity (P = 0.001). This first epidemiological study reveals a high seroprevalence of HEV infection among Lebanese HD patients. However, further evaluations that enroll larger samples and include control groups are required to identify the exact causative factors behind the high seropositivity rate in this population.

]]>
<![CDATA[The prevalence of hepatitis C virus in hemodialysis patients in Pakistan: A systematic review and meta-analysis]]> https://www.researchpad.co/article/elastic_article_14616 Hepatitis C virus (HCV) infection is one of the most common bloodborne viral infections reported in Pakistan. Frequent dialysis treatment exposes hemodialysis patients to a high risk of HCV infection. The main purpose of this paper is to quantify the prevalence of HCV in hemodialysis patients through a systematic review and meta-analysis.

Methods

We systematically searched PubMed, Medline, EMBASE, Pakistani Journals Online and Web of Science to identify studies published between 1 January 1995 and 30 October 2019 reporting on the prevalence of HCV infection in hemodialysis patients. Meta-analysis was performed using a random-effects model to obtain pooled estimates. A funnel plot was used in conjunction with Egger's regression test for asymmetry to assess publication bias. Meta-regression and subgroup analyses were used to identify potential sources of heterogeneity among the included studies. This review was registered on PROSPERO (registration number CRD42019345).

Results

Out of 248 potential studies, 19 studies involving 3446 hemodialysis patients were included in the meta-analysis. The pooled prevalence of HCV in hemodialysis patients in Pakistan was 32.33% (95% CI: 25.73–39.30; I2 = 94.3%, p < 0.01). The subgroup analysis showed that the prevalence of HCV among hemodialysis patients was highest in Punjab (37.52%; 95% CI: 26.66–49.03; I2 = 94.5, p < 0.01), compared with 34.42% (95% CI: 14.95–57.05; I2 = 91.3%, p < 0.01) in Baluchistan, 27.11% (95% CI: 15.81–40.12; I2 = 94.5, p < 0.01) in Sindh and 22.61% (95% CI: 17.45–28.2; I2 = 78.6, p < 0.0117) in Khyber Pakhtunkhwa.

Conclusions

In this study, we found a high prevalence (32.33%) of HCV infection in hemodialysis patients in Pakistan.
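The random-effects pooling described in the Methods can be sketched in a few lines. This is a minimal, self-contained illustration of DerSimonian–Laird pooling of proportions under a normal approximation; the per-study counts below are invented for demonstration and are not the 19 studies from this review.

```python
import math

# Hypothetical (events, sample size) pairs -- illustrative only.
studies = [(30, 100), (45, 120), (20, 90), (55, 140)]

def pooled_prevalence_dl(studies):
    """DerSimonian-Laird random-effects pooling of proportions.

    Uses the normal approximation with within-study variance p(1-p)/n.
    Returns (pooled proportion, 95% CI, I-squared in percent).
    """
    p = [e / n for e, n in studies]
    v = [pi * (1 - pi) / n for pi, (_, n) in zip(p, studies)]
    w = [1 / vi for vi in v]                      # inverse-variance (fixed-effect) weights
    p_fixed = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (pi - p_fixed) ** 2 for wi, pi in zip(w, p))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)
    # Random-effects weights fold tau^2 into each study's variance
    w_re = [1 / (vi + tau2) for vi in v]
    p_re = sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    i2 = max(0.0, (q - (len(studies) - 1)) / q) * 100 if q > 0 else 0.0
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se), i2
```

A production analysis would also need the funnel plot, Egger's test, and subgroup models reported above; library implementations (e.g. in statsmodels or R's metafor) are preferable to hand-rolled code for those.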
Clinically, hemodialysis patients require more attention and resources than the general population. Preventive interventions are urgently needed to decrease the high risk of HCV infection in hemodialysis patients in Pakistan. ]]> <![CDATA[Dialysis timing may be deferred toward very late initiation: An observational study]]> https://www.researchpad.co/article/elastic_article_14499 The optimal timing to initiate dialysis among patients with an estimated glomerular filtration rate (eGFR) of <5 mL/min/1.73 m2 is unknown. We hypothesized that dialysis initiation can be deferred in this population even with a high uremic burden. A case-crossover study with case (0–30 days before dialysis initiation [DI]) and control (90–120 days before DI) periods was conducted in 1,079 hemodialysis patients aged 18–90 years at China Medical University Hospital between 2006 and 2015. The uremic burden was quantified by the number of 7 uremic indicators that reached a predefined threshold in the case period, namely hemoglobin, serum albumin, blood urea nitrogen, serum creatinine, potassium, phosphorus, and bicarbonate. Dialysis timing was classified as standard (0–2 uremic indicators met), late (3–5 indicators), and very late (6–7 indicators). The median eGFR at DI of the 1,079 patients was 3.4 mL/min/1.73 m2, and was 2.7 mL/min/1.73 m2 in patients with very late initiation. The median follow-up duration was 2.42 years. Antibiotics, diuretics, antihypertensive medications, and non-steroidal anti-inflammatory drugs (NSAIDs) were more prevalently used during the case period. The fully adjusted hazard ratios of all-cause mortality for the late and very late groups were 0.97 (95% confidence interval 0.76–1.24) and 0.83 (0.61–1.15), respectively, compared with the standard group. It appears safe to defer dialysis initiation among patients with chronic kidney disease (CKD) who have an eGFR of <5 mL/min/1.73 m2, even when they carry multiple biochemical uremic burdens.
Coordinated efforts in acute infection prevention, optimal fluid management, and prevention of accidental exposure to NSAIDs are crucial to prolong dialysis-free survival.

]]>
<![CDATA[Long-term outcomes after extracorporeal membrane oxygenation in patients with dialysis-requiring acute kidney injury: A cohort study]]> https://www.researchpad.co/article/5c92b361d5eed0c4843a3f31

Background

Acute kidney injury (AKI) is a common complication of extracorporeal membrane oxygenation (ECMO) treatment. The aim of this study was to elucidate the long-term outcomes of adult patients with AKI who receive ECMO.

Materials and methods

The study analyzed encrypted datasets from Taiwan’s National Health Insurance Research Database. The data of 3251 patients who received first-time ECMO treatment between January 1, 2003, and December 31, 2013, were analyzed. Characteristics and outcomes were compared between patients who required dialysis for AKI (D-AKI) and those who did not in order to evaluate the impact of D-AKI on long-term mortality and major adverse kidney events.

Results

Of the 3251 patients, 54.1% had D-AKI. Compared with the patients without D-AKI, those with D-AKI had higher rates of all-cause mortality (52.3% vs. 33.3%; adjusted hazard ratio [aHR] 1.82, 95% confidence interval [CI] 1.53–2.17), chronic kidney disease (13.7% vs. 8.1%; adjusted subdistribution HR [aSHR] 1.66, 95% CI 1.16–2.38), and end-stage renal disease (5.2% vs. 0.5%; aSHR 14.28, 95% CI 4.67–43.62). Among patients who survived more than 90 days after discharge, long-term mortality was 22.0% (153/695) in those without D-AKI, 32.3% (91/282) in those with recovery D-AKI, and 50.0% (10/20) in those with nonrecovery D-AKI requiring long-term dialysis, demonstrating a significant trend (P for trend <0.001).

Conclusion

Dialysis-requiring AKI is associated with an increased risk of long-term mortality and major adverse kidney events in adult patients who receive ECMO.

]]>
<![CDATA[Severe hyperbilirubinemia is associated with higher risk of contrast-related acute kidney injury following contrast-enhanced computed tomography]]> https://www.researchpad.co/article/N624c57d4-8983-4ece-aecc-e0e7860066cf

Introduction

Contrast-induced acute kidney injury (CI-AKI) is associated with high risks of morbidity and mortality. Hyperbilirubinemia may confer some renal protection, but no clear protective cutoff value has been established. Related studies have typically enrolled limited numbers of patients, and only in the setting of vascular intervention.

Methods

We performed this study to elucidate CI-AKI in patients after contrast-enhanced computed tomography (CCT). The outcomes were CI-AKI, dialysis and mortality. Patients were divided into three groups based on their serum levels of total bilirubin: ≤1.2 mg/dl, 1.3–2.0 mg/dl, and >2.0 mg/dl.
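The three-way grouping above is a simple thresholding rule. A minimal sketch, assuming (as the band labels imply) that bilirubin values are reported to one decimal place, so the middle band starts immediately above 1.2:

```python
def bilirubin_group(total_bilirubin_mg_dl: float) -> str:
    """Assign a patient to one of the three serum total-bilirubin strata.

    Band edges follow the study's grouping: <=1.2, 1.3-2.0, and >2.0 mg/dl.
    """
    if total_bilirubin_mg_dl <= 1.2:
        return "<=1.2 mg/dl"
    if total_bilirubin_mg_dl <= 2.0:
        return "1.3-2.0 mg/dl"
    return ">2.0 mg/dl"
```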

Results

We enrolled a total of 9,496 patients who had received CCT. Serum total bilirubin >2.0 mg/dl was associated with CI-AKI. Patients undergoing dialysis had the highest incidence of CI-AKI (p<0.001). No difference was found between the groups with total bilirubin ≤1.2 and 1.3–2.0 mg/dl. Total bilirubin >2.0 mg/dl was associated with CI-AKI (OR = 1.89, 95% CI 1.53–2.33), dialysis (OR = 1.40, 95% CI 1.01–1.95) and mortality (OR = 1.63, 95% CI 1.38–1.93) after adjusting for laboratory data and all comorbidities (i.e., cerebrovascular disease, coronary artery disease, peripheral arterial disease, acute myocardial infarction, diabetes mellitus, hypertension, gastrointestinal bleeding, cirrhosis, peritonitis, ascites, hepatoma, shock lung and colon cancer). We concluded that a total bilirubin level >2 mg/dl is an independent risk factor for CI-AKI, dialysis and mortality after CCT. These patients also had high rates of cirrhosis or hepatoma.

Conclusion

This is the first study providing evidence that hyperbilirubinemia (total bilirubin >2.0 mg/dl) is an independent risk factor for CI-AKI, dialysis and mortality after receiving CCT. Most patients with total bilirubin >2.0 mg/dl had cirrhosis or hepatoma.

]]>
<![CDATA[Previously-initiated hemodialysis as prognostic factor for in-hospital mortality in pneumonia patients with stage 5 chronic kidney disease: Retrospective database study of Japanese hospitals]]> https://www.researchpad.co/article/5c818e8bd5eed0c484cc24db

Background

Some clinicians maintain patients with stage 5 chronic kidney disease (CKD) without hemodialysis for some time. This study investigated whether previously-initiated hemodialysis in stage 5 CKD patients is a prognostic factor for in-hospital mortality due to pneumonia.

Methods

Patient data were obtained from the multi-institutional diagnosis procedure combination database between April 1, 2012 and March 31, 2016. Included patients had records of pneumonia as both the trigger and major diagnosis, had records of end-stage renal disease (ESRD) or stage 5 CKD as a comorbidity or other diagnosis on admission, and were aged 18 years or older. The following factors were adjusted for: age, sex, body mass index, Barthel index, orientation disturbance, arterial oxygen saturation, systolic blood pressure, C-reactive protein level or the extent of consolidation on chest radiography, ambulance use, hospitalization within 90 days, and comorbidities upon admission. The primary outcome measure was all-cause in-hospital mortality, analyzed via multivariable logistic regression using four models. Model 1 was a complete case analysis with overlapping: each hospitalization was counted, even for the same patient. Model 2 was a complete case analysis without overlapping: only the first hospitalization per patient was counted. Model 3 was a multilevel analysis clustered by hospital code. Model 4 was created after multiple imputation of missing adjustment factors.

Results

A total of 907 hospitals and 7,726 patients were identified. Hemodialysis was significantly associated with lower in-hospital mortality in all models (odds ratio [OR] = 0.68, 95% confidence interval [CI]: 0.54–0.87 in Model 1; OR = 0.71, 95% CI: 0.55–0.91 in Model 2; OR = 0.67, 95% CI: 0.52–0.86 in Model 3; and OR = 0.68, 95% CI: 0.54–0.87 in Model 4).

Conclusion

Previously-initiated hemodialysis may be an independent prognostic factor for in-hospital mortality in pneumonia patients with end-stage renal disease. This should be borne in mind when considering the time of initiation of dialysis.

]]>
<![CDATA[The associations of fat tissue and muscle mass indices with all-cause mortality in patients undergoing hemodialysis]]> https://www.researchpad.co/article/5c6dc9bbd5eed0c48452a0c8

Protein-energy wasting, which involves loss of fat and muscle mass, is prevalent and is associated with mortality in hemodialysis (HD) patients. We investigated the associations of fat tissue and muscle mass indices with all-cause mortality in HD patients. The study included 162 patients undergoing HD. The fat tissue index (FTI) and skeletal muscle mass index (SMI), which represent the respective tissue masses normalized to height squared, were measured by bioimpedance analysis after dialysis. Patients were divided into the following four groups according to the medians of FTI and SMI values: group 1 (G1), lower FTI and lower SMI; G2, higher FTI and lower SMI; G3, lower FTI and higher SMI; and G4, higher FTI and higher SMI. The associations of the FTI, SMI, and body mass index (BMI) with all-cause mortality were evaluated. During a median follow-up of 2.5 years, 29 patients died. The 5-year survival rates were 48.6%, 76.1%, 95.7%, and 87.4% in the G1, G2, G3, and G4 groups, respectively (P = 0.0002). The adjusted hazard ratios were 0.34 (95% confidence interval [CI] 0.10–0.95, P = 0.040) for G2 vs. G1, 0.13 (95% CI 0.01–0.69, P = 0.013) for G3 vs. G1, and 0.25 (95% CI 0.07–0.72, P = 0.0092) for G4 vs. G1. With regard to model discrimination, adding both FTI and SMI to a model with established risk factors significantly increased the C-index compared with a model with BMI (0.763 vs. 0.740, P = 0.016). Higher FTI and/or higher SMI values were independently associated with reduced risks of all-cause mortality in HD patients. Moreover, the combination of the FTI and SMI may predict all-cause mortality more accurately than BMI. Therefore, these body composition indicators should be evaluated simultaneously in this population.

]]>
<![CDATA[Evaluation of hemostasis in patients with end-stage renal disease]]> https://www.researchpad.co/article/5c76fe68d5eed0c484e5b9fa

An increased bleeding risk has been reported for patients with end-stage renal disease. This study aims to analyze whether bleeding risk can be assessed by global tests of hemostasis. Standard laboratory tests and an extended evaluation of hemostasis by rotational thromboelastometry, platelet function analyzer (PFA) and multiple electrode aggregometry, as well as thrombin generation assays and measurement of fibrinolytic potential, were performed in 20 patients on hemodialysis, 10 patients on peritoneal dialysis, 10 patients with chronic kidney disease stage G5 (CKD5) and 10 healthy controls (HC). Hemoglobin was significantly lower in patients with end-stage renal disease versus HC (each p<0.01). Patients on peritoneal dialysis showed increased fibrinogen levels compared to HC (p<0.01), which were also reflected by FIBTEM results (each p<0.05). 41% of hemodialysis patients and 44% of CKD5 patients presented with a prolonged PFA-ADP test (p<0.05), whereas no patient on peritoneal dialysis and no HC showed this abnormality. Thrombin-generating potential was significantly lower in patients on hemodialysis, while clot lysis time revealed a hypofibrinolytic state in patients on hemodialysis and peritoneal dialysis compared to HC (p<0.001). In conclusion, patients with end-stage renal disease have complex hemostatic changes with both hyper- and hypocoagulable features, which depend on the use and type of dialysis. Hypercoagulable features include elevated fibrinogen levels and a hypofibrinolytic state, whereas hypocoagulable features include decreased thrombin-generating capacity and platelet dysfunction. Our results may contribute to a more rational approach to hemostatic management in these patients.

]]>
<![CDATA[Outcomes and challenges of a kidney transplant programme at Groote Schuur Hospital, Cape Town: A South African perspective]]> https://www.researchpad.co/article/5c57e6afd5eed0c484ef3b68

Introduction

Access to dialysis and transplantation in the developing world remains limited. Therefore, optimising renal allograft survival is essential. This study aimed to evaluate clinical outcomes and identify poor prognostic factors in the renal transplant programme at Groote Schuur Hospital [GSH], Cape Town.

Method

Data were collected on all patients who underwent a kidney transplant at GSH from 1 July 2010 to 30 June 2015. Analyses were performed to assess baseline characteristics, graft and patient survival, as well as predictors of poor outcome.

Results

A total of 198 patients were transplanted. The mean age was 38 ± 10.5 years, 127 (64.1%) were male, and 86 (43.4%) were of African ethnicity. Deceased donor organs were used for 130 (66.7%) patients and living donors for 65 (33.3%). There were >5 HLA mismatches in 58.9% of transplants. Sepsis was the commonest cause of death, and delayed graft function [DGF] occurred in 41 (21.4%) recipients. Patient survival was 90.4% at 1 year and 83.1% at 5 years. Graft survival was 89.4% at 1 year and 80.0% at 5 years. DGF (HR 2.83 (1.12–7.19), p = 0.028) and recipient age >40 years (HR 3.12 (1.26–7.77), p = 0.014) were predictors of death.

Conclusion

Despite the high infectious burden, stratified immunosuppression and limited tissue typing, this study reports encouraging results from a resource-constrained transplant programme in South Africa. Renal transplantation is critical to improving access to treatment of end-stage kidney disease where access to dialysis is limited.

]]>
<![CDATA[Reporting of “dialysis adequacy” as an outcome in randomised trials conducted in adults on haemodialysis]]> https://www.researchpad.co/article/5c633962d5eed0c484ae65b8

Background

Clinical trials are most informative for evidence-based decision-making when they consistently measure and report outcomes of relevance to stakeholders, especially patients, clinicians, and policy makers. However, the same terminology is sometimes interpreted differently by different stakeholders, which can lead to confusion during shared decision making. The construct dialysis adequacy is frequently used, suggesting it is an important outcome both for health care professionals and for patients.

Objective

To assess the scope and consistency of the construct dialysis adequacy as reported in randomised controlled trials in hemodialysis, and to evaluate whether these reports align with patients' insights and understanding of this construct.

Methods

To assess the scope and consistency of dialysis adequacy as used by professionals, we performed a systematic review, searching the Cochrane Central Register of Controlled Trials (CENTRAL) up to July 2017. We identified all randomised controlled trials (RCTs) including patients on hemodialysis and reporting dialysis adequacy, adequacy or adequacy of dialysis, and extracted and classified all reported outcomes. To explore patients' interpretation and understanding of the construct of adequacy, we conducted 11 semi-structured interviews with HD patients, analysed using thematic analysis. Belgian registration number B670201731001.

Findings

From the 31 included trials, we extracted and classified 98 outcome measures defined by the authors as adequacy of dialysis, of which 94 (95%) were biochemical, 3 (3%) were non-biochemical surrogates and 2 (2%) were patient-relevant. The three most commonly reported measures were all biochemical. None of the studies defined adequacy of dialysis as a patient-relevant outcome such as survival or quality of life. Patients had a substantially different understanding of the construct dialysis adequacy than the biochemical interpretation reported in the literature. Being alive, time spent on dialysis, fatigue and friendliness of staff were the most prominent themes that patients linked to the construct of dialysis adequacy.

Conclusion

Adequacy of dialysis as reported in the literature refers to biochemical outcome measures, most of which are not related to patient-relevant outcomes. For patients, adequate dialysis is dialysis that enables them to spend as much quality time in their lives as possible.

]]>
<![CDATA[Predictors of long-term prognosis in acute kidney injury survivors who require continuous renal replacement therapy after cardiovascular surgery]]> https://www.researchpad.co/article/5c5ca2cdd5eed0c48441eb4f

The long-term prognosis of patients with postoperative acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) after cardiovascular surgery is unclear. We aimed to investigate long-term renal outcomes and survival in these patients to determine the risk factors for negative outcomes. Long-term prognosis was examined in 144 hospital survivors. All patients were independent of renal replacement therapy at hospital discharge. The median age at operation was 72.0 years, and the median pre-operative estimated glomerular filtration rate (eGFR) was 39.5 mL/min/1.73 m2. The median follow-up duration was 1075 days. The endpoints were death, chronic maintenance dialysis dependence, and a composite of death and chronic dialysis. Predictors of death and dialysis were evaluated using Fine and Gray's competing-risks analysis. The cumulative incidence of death was 34.9%, and the chronic dialysis rate was 13.3% during the observation period. In the multivariate proportional hazards analysis, eGFR <30 mL/min/1.73 m2 at discharge was associated with the composite endpoint of death and dialysis [hazard ratio (HR), 2.1; 95% confidence interval (CI), 1.1–3.8; P = 0.02]. Hypertension (HR 8.7, 95% CI, 2.2–35.4; P = 0.002) and eGFR <30 mL/min/1.73 m2 at discharge (HR 26.4, 95% CI, 2.6–267.1; P = 0.006) were associated with dialysis. Advanced age (≥75 years) was predictive of death. Patients with severe CRRT-requiring AKI after cardiovascular surgery have increased risks of chronic dialysis and death. Patients with eGFR <30 mL/min/1.73 m2 at discharge should be monitored especially carefully by nephrologists due to the risk of chronic dialysis and death.

]]>
<![CDATA[Strategies to improve dietary, fluid, dialysis or medication adherence in patients with end stage kidney disease on dialysis: A systematic review and meta-analysis of randomized intervention trials]]> https://www.researchpad.co/article/5c59feacd5eed0c4841352cc

Background

In patients with end stage kidney disease (ESKD) on dialysis, treatment non-adherence is common and results in poor health outcomes. However, the clinical benefits of interventions to improve adherence in dialysis patients are difficult to evaluate, since trialled interventions and reported outcomes are highly heterogeneous. This review summarizes the existing literature on randomized controlled trials (RCTs) evaluating adherence interventions in ESKD patients, focusing on intervention category, outcome efficacy and persistence of benefit beyond the intervention period.

Methods

We performed electronic database searches in Medline, Embase and Cochrane CENTRAL up to 1 July 2018 for RCTs evaluating interventions to improve diet, fluid, medication or dialysis adherence in ESKD patients. Study characteristics, including category of intervention, outcomes, efficacy and follow-up, were assessed. Meta-analysis was used to compute pooled estimates of the effects on the most commonly reported outcome measures.

Results

From 1311 citations, we included 36 RCTs (13 cluster-randomized trials), recruiting a total of 3510 dialysis patients (mean age 55.1 ± 5.8 years, 58.1% male). The overall risk of bias was 'high' for 24 studies and of 'some concern' for 12. Most interventions (33 trials, 92%) addressed patient-related factors and included educational/cognitive (N = 11), behavioural/counselling (N = 4), or psychological/affective (N = 4) interventions, or a combination (N = 14) of the above. A majority (28/36) of RCTs showed improvement in some reported outcomes. Surrogate measures such as changes in phosphate (N = 19) and interdialytic weight gain (N = 15) were the most commonly reported outcomes, and both showed significant improvement in the meta-analysis. Sixteen trials reported follow-up (1–12 months) beyond the intervention, and in nine trials the benefits waned or were absent within 12 months post-intervention.

Conclusions

Interventions to improve treatment adherence result in modest short-term benefits in surrogate outcome measures in dialysis patients, but significant improvements in trial design and outcome reporting are warranted to identify strategies that would achieve meaningful and sustainable clinical benefits.

Limitations

Poor methodological quality of trials. Frequent use of surrogate outcome measures. Low certainty of evidence.

]]>
<![CDATA[Patency rates of arteriovenous fistulas created before versus after hemodialysis initiation]]> https://www.researchpad.co/article/5c58d614d5eed0c484031566

In an incident hemodialysis (HD) population, we aimed to investigate whether arteriovenous fistula (AVF) creation before HD initiation was associated with improved AVF patency compared with AVF creation after starting HD from a central venous catheter (CVC), and also to compare patient survival between these groups. Between January 2011 and December 2013, 524 incident HD patients whose first predialysis vascular access was an AVF (pre-HD group, n = 191) or who underwent AVF creation after starting HD from a CVC (on-HD group, n = 333) were included and analyzed retrospectively. The study outcomes were AVF patency and all-cause mortality (time to death). On Kaplan–Meier survival analysis, primary and secondary AVF patency rates did not differ significantly between the two groups (P = 0.812 and P = 0.586, respectively), although the overall survival rate was significantly higher in the pre-HD group than in the on-HD group (P = 0.013). On multivariate analysis, well-known patient factors were associated with decreased primary (older age and diabetes mellitus [DM]) and secondary (DM and peripheral arterial occlusive disease) AVF patency, whereas use of a CVC as the initial predialysis access (hazard ratio, 1.84; 95% confidence interval, 1.20–2.75; P = 0.005) was significantly associated with worse survival, in addition to well-known patient factors (older age, DM, and peripheral arterial occlusive disease). Worse survival in the on-HD group was likely confounded by selection bias because of the retrospective nature of our study. Therefore, the observed lower mortality associated with AVF creation before HD initiation is not fully attributable to CVC use but rather is affected by other patient-level prognostic factors. There were no CVC-related complications in the pre-HD group, whereas CVC-related complications were noted in 10.2% of the on-HD group.
In conclusion, among incident HD patients, initial AVF creation showed primary and secondary AVF patency rates similar to those of AVF creation from a CVC, but a lower mortality risk. We also observed that initial CVC use was an independent risk factor for worse survival. A fistula-first strategy may be the best option for incident HD patients who are good candidates for AVF creation.

]]>
<![CDATA[The association of the difference in hemoglobin levels before and after hemodialysis with the risk of 1-year mortality in patients undergoing hemodialysis. Results from a nationwide cohort study of the Japanese Renal Data Registry]]> https://www.researchpad.co/article/5c40f767d5eed0c4843860a0

Background

Few clinical studies have directly examined the associations of hemoglobin (Hb) levels after hemodialysis (HD) and of the difference in Hb levels before and after HD (ΔHb) with patient outcomes. The present study aimed to determine ΔHb and post-HD Hb levels with nationwide data and to examine their associations with all-cause mortality in patients undergoing HD.

Methods

This study is based on data from 2008 and 2009 recorded in the Japanese Renal Data Registry. The study endpoint was all-cause mortality within 1 year. The ΔHb and post-HD Hb level were analyzed as categorical variables using Cox regression for 1-year mortality, adjusting for potential confounders.

Results

The median ΔHb was 1.0 g/dl, and the median post-HD Hb level was 11.3 g/dl. The median pre-HD Hb level was 10.4 g/dl. The risk of mortality was lower with a ΔHb of 0 to 1.0 g/dl (adjusted hazard ratio [aHR], 0.90; 95% confidence interval [CI], 0.70–1.01) or > 1.0 g/dl (aHR, 0.73; 95% CI, 0.64–0.84) than with a ΔHb < 0 g/dl. The risk of mortality was also lower with a post-HD Hb of 10 to 11 g/dl (aHR, 0.82; 95% CI, 0.73–0.92), 11 to 12 g/dl (aHR, 0.77; 95% CI, 0.68–0.87), or > 12 g/dl (aHR, 0.77; 95% CI, 0.68–0.87) than with a post-HD Hb < 10 g/dl.

Conclusions

Both a low ΔHb and a low post-HD Hb level were associated with a higher risk of 1-year mortality.

]]>
<![CDATA[Changes in QTc interval in long-term hemodialysis patients]]> https://www.researchpad.co/article/5c37b7b1d5eed0c48449098e

Background

Cardiovascular diseases, including sudden cardiac death (SCD), are the leading cause of death in hemodialysis (HD) patients. A prolonged QT interval on the electrocardiogram (ECG) is a risk factor for SCD in HD patients. This study investigated whether the heart rate-corrected QT (QTc) interval becomes prolonged along with dialysis vintage.

Methods

A total of 102 HD patients were retrospectively studied. Their ECG data were analyzed at 1, 4, and 7 years after HD initiation. The control group comprised 68 age-matched individuals who had normal renal function and two available ECG reports at an interval of more than 4 years. QTc was measured according to the Bazett formula. The association between QTc interval and dialysis vintage was studied. Additionally, clinically relevant variables related to QTc duration at 1 year after HD initiation were assessed.
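The Bazett correction used here divides the measured QT interval by the square root of the RR interval in seconds. A minimal sketch:

```python
import math

def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Bazett formula: QTc = QT / sqrt(RR), with QT in ms and RR in seconds."""
    return qt_ms / math.sqrt(rr_s)

def rr_from_heart_rate(bpm: float) -> float:
    """RR interval in seconds for a given heart rate in beats per minute."""
    return 60.0 / bpm
```

At 60 bpm the RR interval is 1.0 s, so QTc equals the measured QT; at faster heart rates the correction lengthens QTc relative to the measured QT.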

Results

The average QTc interval at 4 and 7 years after HD initiation was significantly longer than that at 1 year after HD initiation (443 and 445 ms vs. 437 ms) (p<0.05). In contrast, the QTc interval in the control group was 425 ms in the first year and 426 ms after an average of 6 years; this difference was not significant, although both values were much shorter than those in the HD patients. Multivariate regression analysis of baseline variables revealed that corrected calcium levels (p = 0.041) and diabetes (p = 0.043) were independently associated with a longer QTc interval.

Conclusions

The QTc interval at 1 year after HD initiation was longer than in the control subjects and was prolonged over several years of HD treatment. Providing clinical management with a focus on QTc interval may be helpful for reducing the incidence of SCD in HD patients.

]]>
<![CDATA[Kidney-inspired algorithm with reduced functionality treatment for classification and time series prediction]]> https://www.researchpad.co/article/5c390bfed5eed0c48491f4af

Optimization of an artificial neural network model through the use of optimization algorithms is a common method employed to search for an optimum solution to a broad variety of real-world problems. One such optimization algorithm is the kidney-inspired algorithm (KA), which has recently been proposed in the literature. The algorithm mimics the four processes performed by the kidneys: filtration, reabsorption, secretion, and excretion. However, a human with reduced kidney function needs to undergo additional treatment to improve kidney performance. In the medical field, the glomerular filtration rate (GFR) test is used to check the health of the kidneys. The test estimates the amount of blood that passes through the glomeruli each minute. In this paper, we mimic this kidney function test, and the GFR result is used to select a suitable step to add to the basic KA process. This novel imitation is designed for both minimization and maximization problems. In the proposed method, a particular action is performed depending on whether the GFR test result is less than 15, falls between 15 and 60, or is more than 60. These additional processes are applied as required, with the aim of improving exploration of the search space and increasing the likelihood of the KA finding the optimum solution. The proposed method is tested on test functions, and its results are compared with those of the basic KA. Its performance on benchmark classification and time series prediction problems is also examined and compared with that of other available methods in the literature. In addition, the proposed method is applied to a real-world water quality prediction problem. The statistical analysis of all these applications showed that the proposed method has the ability to improve the optimization outcome.
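The three GFR bands described above amount to a simple dispatch rule applied on top of the basic KA loop. A hedged sketch (the returned labels are placeholders of our own; the abstract does not specify what each additional process is):

```python
def select_extra_step(gfr: float) -> str:
    """Choose which additional KA treatment step to apply, mimicking a GFR test.

    The bands mirror the abstract: <15, 15-60, and >60. The labels are
    illustrative stand-ins for whatever extra processes the method applies.
    """
    if gfr < 15:
        return "strong_extra_step"      # severely reduced function: strongest intervention
    if gfr <= 60:
        return "moderate_extra_step"    # moderately reduced function
    return "no_extra_step"              # healthy range: run the basic KA unchanged
```

The design point is that the extra exploration steps are applied only when the population's "kidney function" score indicates the basic KA is underperforming, rather than on every iteration.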

]]>
<![CDATA[Association between post-transplant serum uric acid levels and kidney transplantation outcomes]]> https://www.researchpad.co/article/5c1d5b90d5eed0c4846ebf53

Background

Serum uric acid (UA) level has been reported to be associated with chronic allograft nephropathy and graft failure in patients who undergo kidney transplantation (KT). However, the role of serum UA level in renal graft survival remains controversial.

Objective

This study aimed to investigate the effect of mean serum UA level during two different post-KT periods on long-term renal graft outcomes in a large population cohort in which living donor KT prevails.

Material and methods

A retrospective cohort study was performed using KT data prospectively collected at a single institution. Patients (n = 2,993) were divided into low-, normal-, and high-UA groups according to the mean serum UA level within the first year (1-YR) and 1–5 years (5-YR) after transplantation.

Results

In the 1-YR Cox proportional hazards analysis, the low- and high-UA groups had a significantly decreased and increased risk, respectively, for overall graft failure (OGF), death-censored graft failure (DCGF), and composite event (return to dialysis, retransplantation, death from graft dysfunction, and 40% decline in estimated glomerular filtration rate) compared with the normal-UA group. Similarly, in the 5-YR analysis, the low-UA group had a significantly reduced risk of DCGF compared with the normal-UA group, whereas the high-UA group had a significantly increased risk of all three graft outcomes. In a marginal structural model, hyperuricemia had a significant causal effect on worsening graft outcomes, with consideration of all confounding variables (OGF: hazard ratio [HR] 2.27, 95% confidence interval [CI] 1.33–3.78; DCGF: HR 2.38, 95% CI 1.09–4.9; composite event: HR 3.05, 95% CI 1.64–5.49).

Conclusions

A low-to-normal serum UA level within the first year and during 1–5 years after KT, rather than a high serum UA level, is an independent factor for better renal allograft outcomes during long-term follow-up.

]]>
<![CDATA[Predictive factors of Clostridioides difficile infection in hospitalized patients with new diarrhea: A retrospective cohort study]]> https://www.researchpad.co/article/5c117b5ad5eed0c484698cef

Introduction and objective

Diagnostic testing for Clostridioides difficile infection (CDI) by nucleic acid amplification test (NAAT) cannot distinguish between colonization and infection. A positive NAAT may therefore represent a false positive for infection, since diarrhea due to various aetiologies may occur in hospitalized patients. Our objective was to help answer the question: “does this medical inpatient with diarrhea have CDI?”

Design

We conducted a retrospective cohort study (n = 248) on the Clinical Teaching Units of the Royal Victoria Hospital (Montréal, Canada). Patients were included if they had a NAAT between January 2014 and September 2015 and their admission diagnosis was not CDI. CDI cases and non-CDI cases were compared, and independent predictors of CDI were determined by logistic regression.

Results

Several factors were independently associated with CDI, including: hemodialysis (OR: 13.5, 95% CI: 2.85–63.8), atrial fibrillation (OR: 3.70, 95% CI: 1.52–9.01), whether the patient received empiric treatment (OR: 3.01, 95% CI: 1.04–8.68), systemic antibiotic therapy prior to testing (OR: 4.23, 95% CI: 1.71–10.5), previous positive NAAT (OR: 3.70, 95% CI: 1.41–9.72), and a leukocyte count of 11×10⁹/L or higher (OR: 3.43, 95% CI: 1.42–8.26). The area under the curve was 0.80.
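To make concrete how independent odds ratios like those above feed a prediction calculator, the sketch below combines them on the log-odds scale and applies the logistic function. This is only an illustration, not the study's published calculator: the intercept (baseline log-odds) is a made-up placeholder, and the real model's coefficients would come from the fitted regression.

```python
import math

# Reported adjusted odds ratios from the abstract (point estimates only).
ODDS_RATIOS = {
    "hemodialysis": 13.5,
    "atrial_fibrillation": 3.70,
    "empiric_treatment": 3.01,
    "prior_antibiotics": 4.23,
    "previous_positive_naat": 3.70,
    "leukocytes_ge_11": 3.43,
}

def cdi_probability(present, intercept=-3.0):
    """Illustrative risk estimate; `intercept` is a hypothetical value."""
    log_odds = intercept + sum(math.log(ODDS_RATIOS[f]) for f in present)
    return 1 / (1 + math.exp(-log_odds))  # logistic transform
```

Each additional risk factor multiplies the odds by its OR, so the estimated probability rises monotonically as factors accumulate.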

Conclusion

For patients presenting with hospital-onset diarrhea, various parameters can help differentiate between CDI and other causes. A clinical prediction calculator derived from our cohort (http://individual.utoronto.ca/leet/cdiff.html) might assist clinicians in estimating the risk of CDI for inpatients; those with low pre-test probability may not require immediate testing, treatment, or prolonged isolation.

]]>
<![CDATA[Impact of hemodialysis on cardiovascular system assessed by pulse wave analysis]]> https://www.researchpad.co/article/5be5fb88d5eed0c484f6cf3f

Valuable information about the cardiovascular system can be derived from the shape of the aortic pulse wave, which results from the reciprocal interaction between the heart and the vasculature. Pressure profiles in the ascending aorta were obtained from peripheral waveforms recorded non-invasively (SphygmoCor, AtCor Medical, Australia) before, during and after hemodialysis sessions performed after 3-day and 2-day interdialytic intervals in 35 anuric, prevalent hemodialysis patients. Fluid status was assessed with the Body Composition Monitor (Fresenius Medical Care, Bad Homburg, Germany) and an online hematocrit monitoring device (CritLine, HemaMetrics, Utah). Systolic pressure and ejection duration decreased during dialysis. The augmentation index remained stable at 30 ± 13% throughout the hemodialysis session despite the decrease in augmented pressure and pulse height. The subendocardial viability ratio (SEVR) determined after 3-day and 2-day interdialytic intervals increased during the sessions by 43.8 ± 26.6% and 26.1 ± 25.4%, respectively. Hemodialysis performed after 3-day and 2-day interdialytic periods significantly reduced overhydration by 2.4 ± 1.0 L and 1.8 ± 1.2 L and blood volume by 16.3 ± 9.7% and 13.7 ± 8.9%, respectively. The intradialytic increase in SEVR correlated with ultrafiltration rate (R = 0.39, p < 0.01), reduction in overhydration (R = -0.57, p < 0.001) and blood volume drop (R = -0.38, p < 0.01). The strong correlation between the decrease in overhydration during hemodialysis and the increase in SEVR confirms that careful fluid management is crucial for proper cardiac function. Hemodialysis affected the cardiovascular system, with the parameters derived from pulse wave analysis (systolic and augmented pressures, pulse height, ejection duration, SEVR) differing significantly at the end of dialysis from those before the session.
Combining pulse wave analysis with the monitoring of overhydration provides new insight into the impact of hemodialysis on the cardiovascular system.

]]>
<![CDATA[Baroreflex function, haemodynamic responses to an orthostatic challenge, and falls in haemodialysis patients]]> https://www.researchpad.co/article/5c12cf76d5eed0c48491469a

Background

Stage 5 chronic kidney disease patients on haemodialysis (HD) often present with dizziness and pre-syncopal events as a result of the combined effect of HD therapy and cardiovascular disease. The dysregulation of blood pressure (BP) during orthostasis may be implicated in the aetiology of falls in these patients. Therefore, we explored the relationship between baroreflex function, the haemodynamic responses to a passive orthostatic challenge, and falls in HD patients.

Methods

Seventy-six HD patients were enrolled in this cross-sectional study. Participants were classified as “fallers” and “non-fallers” and completed a passive head-up tilt to 60° (HUT-60°) test on an automated tilt table. ECG signals, continuous and oscillometric BP measurements, and impedance cardiography were recorded. The following variables were derived from these measurements: heart rate (HR), stroke volume (SV), cardiac output (CO), total peripheral resistance (TPR), number of baroreceptor events, and baroreceptor effectiveness index (BEI).
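The baroreceptor event count and BEI mentioned above are conventionally derived with the sequence method: a "ramp" is a run of beats with monotonically rising (or falling) systolic BP, and it counts as a baroreflex event when the RR interval changes concordantly; BEI is events divided by ramps. The sketch below is a simplified, hedged version of that method (no lag handling or amplitude thresholds, and overlapping ramps are counted separately), not the study's actual implementation.

```python
# Hedged sketch of the sequence method for the baroreceptor
# effectiveness index (BEI): events / ramps. Simplified relative to
# standard implementations (no beat lag, no mmHg/ms thresholds).

def bei(sbp, rr, min_len=3):
    """Return (events, ramps, BEI) from beat-to-beat SBP and RR series."""
    ramps = events = 0
    n = min(len(sbp), len(rr))
    for i in range(n - min_len + 1):
        for sign in (1, -1):
            # monotone SBP ramp of min_len beats starting at beat i
            if all(sign * (sbp[i+k+1] - sbp[i+k]) > 0 for k in range(min_len - 1)):
                ramps += 1
                # concordant RR change over the same beats -> baroreflex event
                if all(sign * (rr[i+k+1] - rr[i+k]) > 0 for k in range(min_len - 1)):
                    events += 1
                break
    return events, ramps, (events / ramps if ramps else float("nan"))
```

With this definition, a BEI of 1.0 means every SBP ramp triggered a concordant RR response, while the lower BEI reported in fallers reflects ramps that went unanswered.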

Results

The forty-four participants who were classified as fallers (57.9%) had a lower number of baroreceptor events (6.5±8.5 vs 14±16.7, p = .027) and BEI (20.8±24.2% vs 33.4±23.3%, p = .025). In addition, fallers experienced a significantly larger drop in systolic (-6.4±10.9 vs -0.4±7.7 mmHg, p = .011) and diastolic (-2.7±7.3 vs 1.8±6 mmHg, p = .027) oscillometric BP from supine to HUT-60° compared with non-fallers. None of the variables taken for the analysis were significantly associated with falls in multivariate logistic regression analysis.

Conclusions

This cross-sectional comparison indicates that, at rest, HD patients with a positive history of falls present with a lower count of baroreceptor sequences and BEI.

Short-term BP regulation warrants further investigation, as BP drops during a passive orthostatic challenge may be implicated in the aetiology of falls in HD patients.

]]>