ResearchPad - organ-transplantation https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[A pilot study of ex-vivo MRI-PDFF of donor livers for assessment of steatosis and predicting early graft dysfunction]]> https://www.researchpad.co/article/elastic_article_14544 The utility of ex vivo magnetic resonance imaging proton density fat fraction (MRI-PDFF) in donor liver fat quantification is unknown. Purpose: To evaluate the diagnostic accuracy of ex vivo MRI-PDFF measurement of fat in deceased donor livers, using histology as the gold standard, and its utility in predicting early allograft dysfunction (EAD). Methods: We performed ex vivo 1.5 Tesla MRI-PDFF on 33 human deceased donor livers before implantation, en route to the operating room. After the exclusion of 4 images (technical errors), 29 MRI images were evaluable. Histology was evaluable in 27 of 29 patients. EAD was defined as a peak aminotransferase value >2000 IU/mL during the first week, or an INR ≥1.6 or bilirubin ≥10 mg/dL at day 7. Results: MRI-PDFF values showed a strong positive correlation with histology (Pearson’s correlation coefficient) when macro-steatosis was included (r = 0.78, 95% confidence interval 0.57‐0.89, p<0.0001). The correlation was even stronger when macro- plus micro-steatosis were included (r = 0.87, 95% confidence interval 0.72‐0.94, p<0.0001). EAD was noted in 7 (25%) subjects. The AUC (area under the curve) for macro-steatosis (histology) in predicting EAD was 73% (95% CI: 48–99), and for micro- plus macro-steatosis 76% (95% CI: 49–100). The AUC for PDFF values in predicting EAD was 67% (95% CI: 35–98).
Comparison of the ROC curves in a multivariate model revealed that adding MRI-PDFF values to macro-steatosis increased the model’s ability to predict EAD (AUC: 79%, 95% CI: 59–99), and that the addition of macro- plus micro-steatosis based on histology predicted EAD even better (AUC: 90%, 95% CI: 79–100, P = 0.054). Conclusion: In this pilot study, MRI-PDFF imaging showed potential utility in quantifying hepatic steatosis during ex-vivo donor liver evaluation and in predicting EAD related to severe allograft steatosis in the recipient. ]]> <![CDATA[Risk of Graft Failure in Kidney Recipients with Cured Post-Transplant Cancer]]> https://www.researchpad.co/article/elastic_article_13974

]]>
<![CDATA[Improvement of steatotic rat liver function with a defatting cocktail during <i>ex situ</i> normothermic machine perfusion is not directly related to liver fat content]]> https://www.researchpad.co/article/elastic_article_13803 There is a significant organ shortage in the field of liver transplantation, partly due to a high discard rate of steatotic livers from donors. These organs are known to function poorly if transplanted but make up a significant portion of the available pool of donated livers. This study demonstrates the ability to improve the function of steatotic rat livers using a combination of ex situ machine perfusion and a “defatting” drug cocktail. After 6 hours of perfusion, defatted livers demonstrated lower perfusate lactate levels and improved bile quality as demonstrated by higher bile bicarbonate and lower bile lactate. Furthermore, defatting was associated with decreased gene expression of pro-inflammatory cytokines and increased expression of enzymes involved in mitochondrial fatty acid oxidation. Rehabilitation of marginal or discarded steatotic livers using machine perfusion and tailored drug therapy can significantly increase the supply of donor livers for transplantation.

]]>
<![CDATA[Therapeutic Effects of Mesenchymal Stem Cells on a Stevens-Johnson Syndrome/Toxic Epidermal Necrolysis Model]]> https://www.researchpad.co/article/Nab345a18-2a00-473d-9dd2-5565f0d0bdc2

Stevens-Johnson syndrome and toxic epidermal necrolysis (SJS/TEN) are the most severe cutaneous drug hypersensitivity reactions, which are unpredictable adverse drug reactions. SJS/TEN is associated with significant mortality and morbidity; however, effective treatment is difficult. Mesenchymal stem cells (MSCs) are well-known for their anti-inflammatory and tissue regeneration properties. The purpose of the present study was to verify whether MSCs could be applied for the treatment of SJS/TEN. We developed an SJS/TEN mouse model using peripheral blood mononuclear cells from a lamotrigine-induced SJS patient. MSCs were injected into the model to verify the treatment effect. In SJS model mice treated with MSCs, ocular damage rarely occurred, and apoptosis rate was significantly lower. We demonstrated a therapeutic effect of MSCs on SJS/TEN, with these cells presenting a potential novel therapy for the management of this disorder.

]]>
<![CDATA[Drug induced pancreatitis: A systematic review of case reports to determine potential drug associations]]> https://www.researchpad.co/article/N8c4889da-2d62-467b-8b66-b271dff2cf8d

Objective

A current assessment of case reports of possible drug-induced pancreatitis is needed. We systematically reviewed the case report literature to identify drugs with potential associations with acute pancreatitis and the burden of evidence supporting these associations.

Methods

A protocol was developed a priori (PROSPERO CRD42017060473). We searched MEDLINE, Embase, the Cochrane Library, and additional sources to identify cases of drug-induced pancreatitis that met accepted diagnostic criteria of acute pancreatitis. Cases caused by multiple drugs or combination therapy were excluded. Established systematic review methods were used for screening and data extraction. A classification system for associated drugs was developed a priori based upon the number of cases, re-challenge, exclusion of non-drug causes of acute pancreatitis, and consistency of latency.

Results

Seven hundred and thirteen cases of potential drug-induced pancreatitis were identified, implicating 213 unique drugs. The evidence base was poor: exclusion of non-drug causes of acute pancreatitis was incomplete or poorly reported in all cases, 47% had at least one underlying condition predisposing to acute pancreatitis, and causality assessment was not conducted in 81%. Forty-five drugs (21%) were classified as having the highest level of evidence regarding their association with acute pancreatitis; causality was deemed to be probable or definite for 19 of these drugs (42%). Fifty-seven drugs (27%) had the lowest level of evidence regarding an association with acute pancreatitis, being implicated in single case reports without exclusion of other causes of acute pancreatitis.

Discussion

Much of the case report evidence upon which drug-induced pancreatitis associations are based is tenuous. A greater emphasis on excluding all non-drug causes of acute pancreatitis and on quality reporting would improve the evidence base. It should be recognized that reviews of case reports are valuable scoping tools but have limited strength to establish drug-induced pancreatitis associations.

Registration

CRD42017060473.

]]>
<![CDATA[Nocturnal hypercapnia with daytime normocapnia in patients with advanced pulmonary arterial hypertension awaiting lung transplantation]]> https://www.researchpad.co/article/Nacc6463a-eb28-4f4a-acf0-c81fc9df01f4

Background

Pulmonary arterial hypertension (PAH) is frequently complicated by sleep disordered breathing (SDB), and previous studies have largely focused on hypoxemic SDB. Even though nocturnal hypercapnia has been shown to exacerbate pulmonary hypertension, the clinical significance of nocturnal hypercapnia among PAH patients has scarcely been investigated.

Method

Seventeen patients with PAH were identified from 246 consecutive patients referred to Kyoto University Hospital for the evaluation of lung transplant registration from January 2010 to December 2017. Included in this study were 13 patients whose nocturnal transcutaneous carbon dioxide partial pressure (PtcCO2) monitoring data were available. Nocturnal hypercapnia was diagnosed according to the guidelines of the American Academy of Sleep Medicine. Associations of nocturnal PtcCO2 measurements with clinical features, the findings of right heart catheterization and pulmonary function parameters were evaluated.

Results

Nocturnal hypercapnia was diagnosed in six patients (46.2%), while no patient had daytime hypercapnia. Of note, nocturnal hypercapnia was found in 5 of 6 patients with idiopathic PAH (83.3%). Mean nocturnal PtcCO2 levels correlated negatively with the percentage of predicted total lung capacity (TLC), and positively with cardiac output and cardiac index.

Conclusion

Nocturnal hypercapnia was prevalent among advanced PAH patients awaiting lung transplantation and was associated with %TLC. Nocturnal hypercapnia was also associated with increased cardiac output, which might potentially worsen pulmonary hypertension, especially during sleep. Further studies are needed to investigate hemodynamics during sleep and to clarify whether nocturnal hypercapnia can be a therapeutic target for PAH patients.

]]>
<![CDATA[Regeneration of esophagus using a scaffold-free biomimetic structure created with bio-three-dimensional printing]]> https://www.researchpad.co/article/5c8c1978d5eed0c484b4d71e

Various strategies have been attempted to replace esophageal defects with natural or artificial substitutes using tissue engineering. However, these methods have not yet reached clinical application because of the high risks related to their immunogenicity or insufficient biocompatibility. In this study, we developed a scaffold-free structure with a mixture of cell types using bio-three-dimensional (3D) printing technology and assessed its characteristics in vitro and in vivo after transplantation into rats. Normal human dermal fibroblasts, human esophageal smooth muscle cells, human bone marrow-derived mesenchymal stem cells, and human umbilical vein endothelial cells were purchased and used as cell sources. After the preparation of multicellular spheroids, esophageal-like tube structures were prepared by bio-3D printing. The structures were matured in a bioreactor and transplanted into 10-12-week-old F344 male rats as esophageal grafts under general anesthesia. Mechanical and histochemical assessments of the structures were performed. Among the 4 types of structures evaluated, those with the larger proportion of mesenchymal stem cells tended to show greater strength and expansion on mechanical testing and highly expressed α-smooth muscle actin and vascular endothelial growth factor on immunohistochemistry. Therefore, the structure with the larger proportion of mesenchymal stem cells was selected for transplantation. The scaffold-free structures had sufficient strength for transplantation between the esophagus and stomach using silicone stents. The structures were maintained in vivo for 30 days after transplantation. Smooth muscle cells were maintained, and flat epithelium extended and covered the inner surface of the lumen. Food had also passed through the structure. These results suggested that the esophagus-like scaffold-free tubular structures created using bio-3D printing could hold promise as a substitute for the repair of esophageal defects.

]]>
<![CDATA[Influence of donor liver telomere and G-tail on clinical outcome after living donor liver transplantation]]> https://www.researchpad.co/article/5c8acccbd5eed0c484990005

It has been reported that donor age affects patient outcomes after liver transplantation, and that telomere length is associated with age. However, to our knowledge, the impact of donor age and donor liver telomere length in liver transplantation has not been well investigated. This study aimed to clarify the influence of the lengths of the telomere and G-tail from donor livers on the outcomes of living donors and recipients after living donor liver transplantation. The lengths of the telomere and G-tail derived from blood samples and liver tissues of 55 living donors were measured using the hybridization protection assay. The length of telomeres from blood samples was inversely correlated with age, whereas G-tail length from blood samples and telomere and G-tail lengths from liver tissues were not correlated with age. Age, telomere, and G-tail length from blood did not affect postoperative liver failure or early liver regeneration of donors. On the other hand, the longer the liver telomere, the poorer the liver regeneration tended to be, with a significant difference in donors who underwent right hemihepatectomy. We found that the survival rate of recipients who received liver grafts with longer telomeres was inferior to that of those who received liver grafts with shorter ones. An elderly donor, longer liver telomere, and higher Model for End-Stage Liver Disease score were identified as independent risk factors for recipient survival after transplantation. In conclusion, telomere shortening in healthy liver does not correlate with age, whereas longer liver telomeres negatively influence donor liver regeneration and recipient survival after living donor liver transplantation. These results can direct future studies and investigations on telomere shortening in the clinical and experimental transplant setting.

]]>
<![CDATA[Early predictors of outcomes of hospitalization for cirrhosis and assessment of the impact of race and ethnicity at safety-net hospitals]]> https://www.researchpad.co/article/5c897747d5eed0c4847d28e2

Background

Safety-net hospitals provide care for racially/ethnically diverse and disadvantaged urban populations. Their hospitalized patients with cirrhosis are relatively understudied and may be vulnerable to poor outcomes and racial/ethnic disparities.

Aims

To examine the outcomes of patients with cirrhosis hospitalized at regionally diverse safety-net hospitals and the impact of race/ethnicity.

Methods

A study of patients with cirrhosis hospitalized at 4 safety-net hospitals in 2012 was conducted. Demographic and clinical factors and outcomes were compared between centers and racial/ethnic groups. Study endpoints included mortality and 30-day readmission.

Results

In 2012, 733 of 1,212 patients with cirrhosis were hospitalized for liver-related indications (median age 55 years, 65% male). The cohort was racially diverse (43% White, 25% black, 22% Hispanic, 3% Asian) with cirrhosis related to alcohol and viral hepatitis in 635 (87%) patients. Patients were hospitalized mainly for ascites (35%), hepatic encephalopathy (20%) and gastrointestinal bleeding (GIB) (17%). Fifty-four (7%) patients died during hospitalization and 145 (21%) survivors were readmitted within 30 days. Mortality rates ranged from 4 to 15% by center (p = .007) and from 3 to 10% by race/ethnicity (p = .03), but 30-day readmission rates were similar. Mortality was associated with Model for End-stage Liver Disease (MELD), acute-on-chronic liver failure, hepatocellular carcinoma, sodium and white blood cell count. Thirty-day readmission was associated with MELD and Charlson Comorbidity Index >4, with lower risk for GIB. We did not observe geographic or racial/ethnic differences in hospital outcomes in the risk-adjusted analysis.

Conclusions

Hospital mortality and 30-day readmission in patients with cirrhosis at safety-net hospitals are associated with disease severity and comorbidities, with lower readmissions in patients admitted for GIB. Despite geographic and racial/ethnic differences in hospital mortality, these factors were not independently associated with mortality.

]]>
<![CDATA[Treatment with mononuclear cell populations improves post-infarction cardiac function but does not reduce arrhythmia susceptibility]]> https://www.researchpad.co/article/5c6f1496d5eed0c48467a361

Background

Clinical and experimental data provide evidence that transplantation of stem and progenitor cells in myocardial infarction could be beneficial, although the underlying mechanism has remained elusive. Ventricular tachyarrhythmia is the most frequent and potentially lethal complication of myocardial infarction, but the impact of mononuclear cells on the incidence of ventricular arrhythmia is still not clear.

Objective

We aimed to characterize the influence of splenic mononuclear cell populations on ventricular arrhythmia after myocardial infarction.

Methods

We assessed electrical vulnerability in vivo in mice with left ventricular cryoinfarction 14 days after injury and intramyocardial injection of specific subpopulations of mononuclear cells (MNCs) (CD11b-positive cells, Sca-1-positive cells, early endothelial progenitor cells (eEPCs)). As a positive control group, we used embryonic cardiomyocytes (eCMs). Epicardial mapping was performed for analysing conduction velocities in the border zone. Left ventricular function was quantified by echocardiography and left heart catheterization.

Results

In vivo pacing protocols induced ventricular tachycardia (VT) in 30% of non-infarcted mice. In contrast, monomorphic or polymorphic VT could be evoked in 94% of infarcted and vehicle-injected mice (p<0.01). Only transplantation of eCMs prevented post-infarction VT and improved conduction velocities in the border zone, in accordance with increased expression of connexin 43. Cryoinfarction resulted in a broad aggravation of left ventricular function. All transplanted cell types augmented left ventricular function to a similar extent.

Conclusions

Transplantation of different MNC populations after myocardial infarction improves left ventricular function similar to effects of eCMs. Prevention of inducible ventricular arrhythmia is only seen after transplantation of eCMs.

]]>
<![CDATA[Correlation between the native lung volume change and postoperative pulmonary function after single lung transplantation for lymphangioleiomyomatosis: Evaluation of lung volume by three-dimensional computed tomography volumetry]]> https://www.researchpad.co/article/5c6b26b2d5eed0c484289ec2

Purpose

Although native lung overinflation has been thought to occur in recipients of single lung transplantation for lymphangioleiomyomatosis because of the native lung's increased compliance, no study has reported the details of the change in native lung volume after single lung transplantation by three-dimensional computed tomography volumetry. The purpose of the present study was to evaluate lung volume after single lung transplantation for lymphangioleiomyomatosis by three-dimensional computed tomography volumetry and to investigate the correlation between the native lung volume change and postoperative pulmonary function.

Methods

We retrospectively reviewed the data of 17 patients who underwent single lung transplantation for lymphangioleiomyomatosis. We defined the ratio of the native lung volume to total lung volume (N/T ratio) as an indicator of overinflation of the native lung. To assess changes in the N/T ratio over time, we calculated the rate of change in the N/T ratio, standardized by the N/T ratio at 1 year after single lung transplantation: rate of change in N/T ratio (%) = {(N/T ratio at a given year)/(N/T ratio at 1 year) − 1} × 100.
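The rate-of-change computation described above can be sketched as a small helper. This is an illustration only; the function and variable names are ours, not the study's:

```python
def nt_ratio(native_volume, total_volume):
    """N/T ratio: native lung volume divided by total lung volume."""
    return native_volume / total_volume

def nt_rate_of_change(nt_at_year, nt_at_1_year):
    """Rate of change (%) in the N/T ratio, standardized to the 1-year value:
    {(N/T ratio at a given year) / (N/T ratio at 1 year) - 1} x 100."""
    return (nt_at_year / nt_at_1_year - 1) * 100
```

For example, an N/T ratio that rises from 0.50 at 1 year to 0.55 at a later visit corresponds to a rate of change of +10%.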

Results

We investigated the correlations between the N/T ratio and the pulmonary function test parameters at 1 year and 5 years; however, there was no significant correlation between them. On the other hand, there was a significant negative correlation between the rate of change in the N/T ratio and that in forced expiratory volume in 1 second %predicted (%FEV1) at 5 years after single lung transplantation.

Conclusion

Single lung transplantation recipients with lymphangioleiomyomatosis showed an increasing rate of change in the N/T ratio over the long-term course after lung transplantation, accompanied by a decrease in %FEV1. These findings suggest that such cases may develop overinflation of the native lung in the future.

]]>
<![CDATA[Combining liver stiffness with hyaluronic acid provides superior prognostic performance in chronic hepatitis C]]> https://www.researchpad.co/article/5c6b26b4d5eed0c484289ee1

Background

Non-invasive methods are the first choice for liver fibrosis evaluation in chronic liver diseases, but few studies investigate the ability of combined methods to predict outcomes.

Methods

591 chronic hepatitis C patients with baseline liver stiffness measurements (LSM) by FibroScan and hyaluronic acid measurements were identified retrospectively. The patients were grouped by baseline LSM: <10 kPa, 10–16.9 kPa, and 17–75 kPa. Primary outcomes were all-cause mortality and liver-related mortality, analyzed using Cox regression and competing risk regression models, respectively.

Results

Median follow-up was 46.1 months. Prevalence of cirrhosis at baseline was 107/591 (18.1%). Median LSM was 6.8 kPa (IQR 5.3–11.6); by group, 404/591 (68.4%) had an LSM <10 kPa, 100/591 (16.9%) an LSM of 10–16.9 kPa, and 87/591 (14.7%) an LSM of 17–75 kPa. There were 69 deaths, 27 from liver-related disease. 26 patients developed cirrhosis and 30 developed complications of cirrhosis. The mortality rate in the 17–75 kPa group was 9.7/100 person-years, compared to 2.2/100 person-years and 1.1/100 person-years in the 10–16.9 kPa and <10 kPa groups (p<0.005). Liver-related mortality increased 10-fold for each group (p<0.005). Cirrhotic complications occurred almost exclusively in the 17–75 kPa group, with an incidence of 10.3/100 person-years, compared to 1.8/100 person-years and 0.2/100 person-years in the 10–16.9 kPa and <10 kPa groups (p<0.005). Median hyaluronic acid in the 17–75 kPa group was approximately 200 ng/mL. Patients with an LSM of 17–75 kPa had significantly higher risks of death, liver-related death, and complications of cirrhosis if their baseline hyaluronic acid measurement was greater than or equal to 200 ng/mL, with hazard ratios of 3.25 (95% CI 1.48–7.25), 7.7 (95% CI 2.32–28), and 3.2 (95% CI 1.35–7.39), respectively.

Conclusions

The combination of LSM and circulating hyaluronic acid measurements significantly improved prognostic ability, relative to LSM alone. Combined static and dynamic markers of liver fibrosis could provide superior risk prediction.

]]>
<![CDATA[Desensitization and treatment with APRIL/BLyS blockade in rodent kidney transplant model]]> https://www.researchpad.co/article/5c67305fd5eed0c484f37a27

Alloantibody represents a significant barrier in kidney transplantation, through the sensitization of patients prior to transplant and through antibody-mediated rejection (ABMR). APRIL and BLyS are critical survival factors for mature B lymphocytes and plasma cells, the primary source of alloantibody. We examined the effect of APRIL/BLyS blockade via TACI-Ig (transmembrane activator and calcium modulator and cyclophilin ligand interactor-immunoglobulin) in a preclinical rodent model as treatment for both desensitization and ABMR. Lewis rats were sensitized with Brown Norway (BN) blood for 21 days. Following sensitization, animals were then sacrificed or randomized into kidney transplant (G4, sensitized transplant control); desensitization with TACI-Ig followed by kidney transplant (G5, sensitized + pre-transplant TACI-Ig); kidney transplant with post-transplant TACI-Ig for 21 days (G6, sensitized + post-transplant TACI-Ig); or desensitization with TACI-Ig followed by kidney transplant and post-transplant TACI-Ig for 21 days (G7, sensitized + pre- and post-transplant TACI-Ig). Animals were sacrificed on day 21 post-transplant and tissues were analyzed using flow cytometry, IHC, ELISPOT, and RT-PCR. Sensitized animals treated with APRIL/BLyS blockade demonstrated a significant decrease in marginal zone and non-switched B lymphocyte populations (p<0.01). Antibody-secreting cells were also significantly reduced in the sensitized APRIL/BLyS blockade-treated group. Post-transplant APRIL/BLyS blockade-treated animals were found to have significantly less C4d deposition and less ABMR as defined by Banff classification when compared to groups receiving APRIL/BLyS blockade before transplant or both before and after transplant (p<0.0001). The finding of worse ABMR in groups receiving APRIL/BLyS blockade before transplant or both before and after transplant may indicate that B lymphocyte depletion in these settings also resulted in regulatory lymphocyte depletion, resulting in worse rejection.
The data presented here demonstrate that targeting APRIL and BLyS can significantly deplete mature B lymphocytes and antibody-secreting cells, and effectively decrease ABMR when given post-transplant in a sensitized animal model.

]]>
<![CDATA[The association of preoperative cardiac stress testing with 30-day death and myocardial infarction among patients undergoing kidney transplantation]]> https://www.researchpad.co/article/5c5df353d5eed0c48458115b

Background

Although periodic cardiac stress testing is commonly used to screen patients on the waiting list for kidney transplantation for ischemic heart disease, there is little evidence to support this practice. We hypothesized that cardiac stress testing in the 18 months prior to kidney transplantation would not reduce postoperative death, total myocardial infarction (MI) or fatal MI.

Methods

Using the United States Renal Data System, we identified ESRD patients ≥40 years old with primary Medicare insurance who received their first kidney transplant between 7/1/2006 and 11/31/2013. Propensity matching created a 1:1 matched sample of patients with and without stress testing in the 18 months prior to kidney transplantation. The outcomes of interest were death, total (fatal and nonfatal) MI or fatal MI within 30 days of kidney transplantation.

Results

In the propensity-matched cohort of 17,304 patients, death within 30 days occurred in 72 of 8,652 (0.83%) patients who underwent stress testing and in 65 of 8,652 (0.75%) patients who did not (OR 1.07; 95% CI: 0.79–1.45; P = 0.66). MI within 30 days occurred in 339 (3.9%) patients who had a stress test and in 333 (3.8%) patients who did not (OR 1.03; 95% CI: 0.89–1.21; P = 0.68). Fatal MI occurred in 17 (0.20%) patients who underwent stress testing and 15 (0.17%) patients who did not (OR 0.97; 95% CI: 0.71–1.32; P = 0.84).

Conclusion

Stress testing in the 18 months prior to kidney transplantation is not associated with a reduction in death, total MI or fatal MI within 30 days of kidney transplantation.

]]>
<![CDATA[Outcomes and challenges of a kidney transplant programme at Groote Schuur Hospital, Cape Town: A South African perspective]]> https://www.researchpad.co/article/5c57e6afd5eed0c484ef3b68

Introduction

Access to dialysis and transplantation in the developing world remains limited. Therefore, optimising renal allograft survival is essential. This study aimed to evaluate clinical outcomes and identify poor prognostic factors in the renal transplant programme at Groote Schuur Hospital [GSH], Cape Town.     

Method

Data were collected on all patients who underwent a kidney transplant at GSH from 1 July 2010 to 30 June 2015. Analyses were performed to assess baseline characteristics, graft and patient survival, and predictors of poor outcome.

Results

198 patients were transplanted. The mean age was 38 ± 10.5 years, 127 (64.1%) were male, and 86 (43.4%) were of African ethnicity. Deceased donor organs were used for 130 (66.7%) patients and living donors for 65 (33.3%). There were >5 HLA mismatches in 58.9% of transplants. Sepsis was the commonest cause of death, and delayed graft function [DGF] occurred in 41 (21.4%) recipients. Patient survival was 90.4% at 1 year and 83.1% at 5 years. Graft survival was 89.4% at 1 year and 80.0% at 5 years. DGF (HR 2.83 (1.12–7.19), p = 0.028) and recipient age >40 years (HR 3.12 (1.26–7.77), p = 0.014) were predictors of death.

Conclusion

Despite the high infectious burden, stratified immunosuppression and limited tissue typing this study reports encouraging results from a resource constrained transplant programme in South Africa. Renal transplantation is critical to improve access to treatment of end stage kidney disease where access to dialysis is limited.

]]>
<![CDATA[A new formula to calculate the resection limit in hepatectomy based on Gd-EOB-DTPA-enhanced magnetic resonance imaging]]> https://www.researchpad.co/article/5c644885d5eed0c484c2e84f

Background and aim

Dynamic magnetic resonance imaging with gadolinium-ethoxybenzyl-diethylenetriamine pentaacetic acid (EOB-MRI) can be used not only to detect liver tumors but also to estimate liver function. The aim of this study was to establish a new EOB-MRI-based formula to determine the resection limit in patients undergoing hepatectomy.

Methods

Twenty-eight patients with a normal liver (NL group) and five with an unresectable cirrhotic liver (UL group) who underwent EOB-MRI were included. Standardized liver function (SLF) was calculated based on the signal intensity (SI), the volume of each subsegment (S1–S8), and body surface area. A formula defining the resection limit was devised based on the difference in the SLF values of patients in the NL and UL groups. The formula was validated in 50 patients who underwent EOB-MRI and hepatectomy.

Results

The average SLF values in the NL and UL groups were 2038 and 962 FV/m2, respectively. The difference (1076 FV/m2) was taken to correspond to a resection volume of 70%. Thus, the resection limit for hepatectomy was calculated as a proportion of 70%: 70×(SLF−962)/1076 (%). The one patient who underwent hepatectomy beyond the resection limit died of liver failure. In the other 49 patients, in whom the resection volume was less than the resection limit, the procedures were performed safely.
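The resection-limit formula stated above can be expressed directly in code; this is a worked illustration, with a function name of our choosing:

```python
def resection_limit_percent(slf):
    """Resection limit (%) for hepatectomy from standardized liver function
    (SLF, in FV/m2), per the formula in the abstract: 70 x (SLF - 962) / 1076."""
    return 70 * (slf - 962) / 1076
```

By construction, the normal-liver average SLF of 2038 FV/m2 yields a limit of 70%, while an SLF equal to the unresectable-cirrhosis average of 962 FV/m2 yields 0%.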

Conclusions

Our formula for the resection limit based on EOB-MRI can improve the safety of hepatectomy.

]]>
<![CDATA[Cytomegalovirus viral load parameters associated with earlier initiation of pre-emptive therapy after solid organ transplantation]]> https://www.researchpad.co/article/5c6448bcd5eed0c484c2ecea

Background

Human cytomegalovirus (HCMV) can be managed by monitoring HCMV DNA in the blood and giving valganciclovir when viral load exceeds a defined value. We hypothesised that such pre-emptive therapy should occur earlier than the standard 3000 genomes/ml (2520 IU/ml) when a seropositive donor transmitted virus to a seronegative recipient (D+R-) following solid organ transplantation (SOT).

Methods

Our local protocol was changed so that D+R- SOT patients commenced valganciclovir once the viral load exceeded 200 genomes/ml; 168 IU/ml (new protocol). The decision point remained at 3000 genomes/ml (old protocol) for the other two patient subgroups (D+R+, D-R+). Virological outcomes were assessed three years later, when 74 D+R- patients treated under the old protocol could be compared with 67 treated afterwards. The primary outcomes were changes in peak viral load, duration of viraemia and duration of treatment in the D+R- group. The secondary outcome was the proportion of D+R- patients who developed subsequent viraemia episodes.
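The paired thresholds quoted above (3000 genomes/ml = 2520 IU/ml; 200 genomes/ml = 168 IU/ml) imply a conversion factor of 0.84 IU per genome. A minimal sketch of that conversion, assuming the factor holds across this protocol's assay (the names here are ours):

```python
# Conversion factor implied by the paired thresholds in the protocol:
# 2520/3000 = 168/200 = 0.84 IU per genome copy.
GENOMES_TO_IU = 0.84

def genomes_per_ml_to_iu(genomes_per_ml):
    """Convert an HCMV viral load from genomes/ml to IU/ml using the
    conversion factor implied by this protocol's stated thresholds."""
    return genomes_per_ml * GENOMES_TO_IU
```

Applied to the two decision points, the old-protocol threshold of 3000 genomes/ml maps to 2520 IU/ml and the new-protocol threshold of 200 genomes/ml to 168 IU/ml, matching the values in the text.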

Findings

In the D+R- patients, the median peak viral load (30,774 to 11,135 genomes/ml, p = 0.0215) was significantly reduced on the new protocol compared to the old, but the duration of viraemia and duration of treatment were not. Early treatment increased subsequent episodes of viraemia from 33/58 (57%) to 36/49 (73%) of patients (p = 0.0743), with a significant increase (p = 0.0072) in those episodes that required treatment (16/58; 27% versus 26/49; 53%). Median peak viral load increased significantly (2,103 to 3,934 genomes/ml, p = 0.0249) in the D+R+ but not in the D-R+ patient subgroup. There was no change in duration of viraemia or duration of treatment for any patient subgroup.

Interpretation

Pre-emptive therapy initiated at the first sign of viraemia post-transplant significantly reduced the peak viral load but increased later episodes of viraemia, consistent with the hypothesis of reduced antigenic stimulation of the immune system.

]]>
<![CDATA[Outcomes of cardiac resynchronization therapy in patients with atrial fibrillation accompanied by slow ventricular response]]> https://www.researchpad.co/article/5c424385d5eed0c4845e0487

It remains unclear whether cardiac resynchronization therapy (CRT) is as effective in patients with atrial fibrillation accompanied by slow ventricular response (AF-SVR, < 60 beats/min) as in those with sinus rhythm (SR). Echocardiographic reverse remodeling was compared between AF-SVR patients (n = 17) and SR patients (n = 88) at six and 12 months after CRT. We also evaluated changes in QRS duration; New York Heart Association (NYHA) functional class; and long-term composite clinical outcomes, including cardiac death, heart transplantation, and heart failure (HF)-related hospitalization. Left ventricular pacing sites and biventricular pacing percentages did not differ significantly between the AF-SVR and SR groups. However, the heart-rate increase after CRT was significantly greater in the AF-SVR group than in the SR group (P < 0.001). At six and 12 months after implantation, both groups showed comparable improvement in NYHA class; QRS narrowing; and echocardiographic variables, including left ventricular end-systolic volume, left ventricular ejection fraction, and left atrial volume index. Over a median follow-up of 1.6 (interquartile range: 0.8–2.2) years, no significant between-group differences were observed in the rates of long-term composite clinical events (35% versus 24%; hazard ratio: 1.71; 95% confidence interval: 0.23–12.48; P = 0.60). By correcting electrical dyssynchrony and increasing the biventricular pacing rate, CRT provided comparable benefits in patients with AF-SVR and in those with SR in terms of QRS narrowing, symptom improvement, ventricular reverse remodeling, and long-term clinical outcomes.

]]>
<![CDATA[Polyomavirus BK, BKV microRNA, and urinary neutrophil gelatinase-associated lipocalin can be used as potential biomarkers of lupus nephritis]]> https://www.researchpad.co/article/5c466538d5eed0c484518127

Objective

Lupus nephritis (LN) frequently progresses to end-stage renal disease. Finding a biomarker for LN and a predictor for the development of chronic kidney disease (CKD) is important for patients with systemic lupus erythematosus (SLE).

Methods

Ninety patients with SLE were divided into a biopsy-proven LN group (n = 54) and a group with no kidney involvement (non-LN, n = 36), and were followed up for 54 months.

Results

Of the 54 patients with LN, 3 (5.6%) had class II disease, 3 (5.6%) had class III, 35 (64.8%) had class IV, 10 (18.5%) had class V, and 3 (5.6%) had class VI (advanced sclerosis). Compared with the non-LN group, patients in the LN group showed higher autoimmunity, evidenced by a higher proportion of low C3 and C4 levels and of positive anti-double-stranded DNA antibody levels, and had lower estimated glomerular filtration rates (eGFR). Urinary neutrophil gelatinase-associated lipocalin (uNGAL) levels were significantly higher in the LN group (670 vs 33 ng/mL). Patients with LN had a higher urinary polyomavirus BK (BKV) load (3.6 vs 3.0 log copies/mL) and a lower urinary BKV miRNA (miR-B1) 5p level (0.29 vs 0.55 log copies/mL, p = 0.025), while there was no significant difference in the miR-B1-3p level. Urinary miR-B1-5p level, but not urinary BKV load, was negatively correlated with uNGAL level (r = -0.22, p = 0.004). At a cutoff value of 80 ng/mL, receiver operating characteristic curve analysis showed that uNGAL level predicted the presence of LN with high sensitivity (98%) and specificity (100%) (area under the curve [AUC], 0.997; p < 0.001). During the 54-month follow-up period, 14 of the 54 patients with LN (26%) and none of the non-LN patients developed CKD. Multivariate Cox regression analysis revealed that baseline uNGAL level was the only predictive factor for CKD development, while baseline serum creatinine level and eGFR were not.

Conclusion

An elevated urinary BKV load together with a decreased miR-B1 level implies the presence of LN. In addition, an increased uNGAL level is a good biomarker not only for predicting the presence of LN but also for predicting the development of CKD in patients with SLE.

]]>
<![CDATA[Alemtuzumab induction combined with reduced maintenance immunosuppression is associated with improved outcomes after lung transplantation: A single centre experience]]> https://www.researchpad.co/article/5c478c9dd5eed0c484bd36c6

Question addressed by the study

The value of induction therapy in lung transplantation is controversial. According to the ISHLT, only about 50% of patients transplanted within the last 10 years received induction therapy. We reviewed our institutional experience to investigate the impact of induction therapy on short- and long-term outcomes.

Materials/Patients and methods

Between 2007 and 2015, 446 patients with complete follow-up were included in this retrospective analysis. The analysis comprised long-term kidney function, infectious complications, incidence of rejection, and overall survival.

Results

A total of 231 patients received alemtuzumab, 50 received antithymocyte globulin (ATG), and 165 received no induction therapy (NI). The alemtuzumab group showed the lowest rate of chronic kidney insufficiency (NI: 52.2%; ATG: 60%; alemtuzumab: 36.6%; p = 0.001). Both the NI group (p < 0.001) and the ATG group (p = 0.010) showed a significant increase in serum creatinine during follow-up compared with alemtuzumab patients. Furthermore, the alemtuzumab group experienced the lowest rate of infection in the first year after transplantation. Finally, improved survival and low rates of acute cellular rejection (ACR), lymphocytic bronchiolitis (LB), and chronic lung allograft dysfunction (CLAD) were found in patients treated with either alemtuzumab or ATG.

Conclusion

Alemtuzumab induction therapy followed by reduced maintenance immunosuppression is associated with better kidney function than either no induction or ATG induction. Survival, as well as freedom from ACR and CLAD, was comparable between the alemtuzumab and ATG groups.

]]>