ResearchPad - liver-transplantation https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks
<![CDATA[A pilot study of ex-vivo MRI-PDFF of donor livers for assessment of steatosis and predicting early graft dysfunction]]> https://www.researchpad.co/article/elastic_article_14544

The utility of ex vivo magnetic resonance imaging proton density fat fraction (MRI-PDFF) in donor liver fat quantification is unknown.

Purpose

To evaluate the diagnostic accuracy of ex vivo MRI-PDFF measurement of fat in deceased donor livers, using histology as the gold standard, and its utility in predicting early allograft dysfunction (EAD).

Methods

We performed ex vivo, 1.5 Tesla MRI-PDFF on 33 human deceased donor livers before implantation, en route to the operating room. After the exclusion of 4 images (technical errors), 29 MRI images were evaluable. Histology was evaluable in 27 of 29 patients. EAD was defined as a peak aminotransferase value >2000 IU/mL during the first week, or an INR ≥1.6 or bilirubin ≥10 mg/dL at day 7.

Results

MRI-PDFF values showed a strong positive correlation (Pearson’s correlation coefficient) with histology when macro-steatosis was included (r = 0.78, 95% confidence interval 0.57–0.89, p<0.0001). The correlation was even stronger when macro- plus micro-steatosis were included (r = 0.87, 95% confidence interval 0.72–0.94, p<0.0001). EAD was noted in 7 (25%) subjects. The area under the curve (AUC) for macro-steatosis (histology) predicted EAD in 73% (95% CI: 48–99), and for macro- plus micro-steatosis in 76% (95% CI: 49–100). The AUC for PDFF values predicted EAD in 67% (95% CI: 35–98).
Comparison of the ROC curves in a multivariate model revealed that adding MRI-PDFF values to macro-steatosis increased the ability of the model to predict EAD (AUC: 79%, 95% CI: 59–99), and the addition of macro- plus micro-steatosis based on histology predicted EAD even better (AUC: 90%, 95% CI: 79–100, P = 0.054).

Conclusion

In this pilot study, MRI-PDFF imaging showed potential utility for quantifying hepatic steatosis during ex vivo donor liver evaluation and for predicting EAD related to severe allograft steatosis in the recipient. ]]>
<![CDATA[Improvement of steatotic rat liver function with a defatting cocktail during <i>ex situ</i> normothermic machine perfusion is not directly related to liver fat content]]> https://www.researchpad.co/article/elastic_article_13803

There is a significant organ shortage in the field of liver transplantation, partly due to a high discard rate of steatotic livers from donors. These organs are known to function poorly if transplanted but make up a significant portion of the available pool of donated livers. This study demonstrates the ability to improve the function of steatotic rat livers using a combination of ex situ machine perfusion and a “defatting” drug cocktail. After 6 hours of perfusion, defatted livers demonstrated lower perfusate lactate levels and improved bile quality, as demonstrated by higher bile bicarbonate and lower bile lactate. Furthermore, defatting was associated with decreased gene expression of pro-inflammatory cytokines and increased expression of enzymes involved in mitochondrial fatty acid oxidation. Rehabilitation of marginal or discarded steatotic livers using machine perfusion and tailored drug therapy can significantly increase the supply of donor livers for transplantation.

]]>
<![CDATA[Comparison of Supraceliac and Infrarenal Aortic Conduits in Liver Transplantation: Is There a Difference in Patency and Postoperative Renal Dysfunction?]]> https://www.researchpad.co/article/Neae4d56e-c2a7-4b49-a3f9-29247d513cf3

Background.

Aorto-hepatic conduits can provide arterial inflow for liver transplants in cases where the native hepatic artery is unsuitable for use.

Methods.

Clinical outcomes of all patients undergoing liver transplantation (LT) with an aorto-hepatic conduit between 2000 and 2016 were included. Recipients were divided into 2 groups: those with a supraceliac (SC) aortic conduit (N = 22) and those with an infrarenal (IR) aortic conduit (N = 82).

Results.

There was no difference in calculated model for end-stage liver disease score between the 2 groups. The SC group received grafts with a higher mean donor risk index (1.69 versus 1.48; P = 0.02). Early allograft dysfunction was 18.2% in the SC group and 29.3% in the IR group (P = 0.30). In the SC group, 10.5% of patients required initiation of postoperative continuous renal replacement therapy compared to 12.1% of patients in the IR group (P = 0.69). No difference in the rate of postoperative acute kidney injury was seen between the 2 groups (P = 0.54). No significant difference in median creatinine at 1 year was seen between the SC (1.2 mg/dL; IQR 1–1.3) and IR (1.2 mg/dL; IQR 0.9–1.5) groups (P = 0.85). At a median follow-up of 5.3 years, thrombosis of the aortic conduit occurred in 0% of patients in the SC group and 6.1% of patients in the IR group (P = 0.24). Graft survival was not significantly different between the 2 groups (P = 0.47).

Conclusions.

No difference in renal dysfunction, as demonstrated by need for post-LT continuous renal replacement therapy, acute kidney injury, or creatinine at 1 year post-LT, was seen between SC and IR aortic conduits. A trend toward a higher conduit thrombosis rate was seen with IR compared with SC aortic conduits; however, this did not reach statistical significance. Both SC and IR aortic conduits represent reasonable options when the native hepatic artery is unsuitable for use.

]]>
<![CDATA[Regeneration of esophagus using a scaffold-free biomimetic structure created with bio-three-dimensional printing]]> https://www.researchpad.co/article/5c8c1978d5eed0c484b4d71e

Various strategies have been attempted to replace esophageal defects with natural or artificial substitutes using tissue engineering. However, these methods have not yet reached clinical application because of the high risks related to their immunogenicity or insufficient biocompatibility. In this study, we developed a scaffold-free structure with a mixture of cell types using bio-three-dimensional (3D) printing technology and assessed its characteristics in vitro and in vivo after transplantation into rats. Normal human dermal fibroblasts, human esophageal smooth muscle cells, human bone marrow-derived mesenchymal stem cells, and human umbilical vein endothelial cells were purchased and used as cell sources. After the preparation of multicellular spheroids, esophageal-like tube structures were prepared by bio-3D printing. The structures were matured in a bioreactor and transplanted into 10-12-week-old F344 male rats as esophageal grafts under general anesthesia. Mechanical and histochemical assessments of the structures were performed. Among the 4 types of structures evaluated, those with a larger proportion of mesenchymal stem cells tended to show greater strength and expansion on mechanical testing and highly expressed α-smooth muscle actin and vascular endothelial growth factor on immunohistochemistry. Therefore, the structure with the largest proportion of mesenchymal stem cells was selected for transplantation. The scaffold-free structures had sufficient strength for transplantation between the esophagus and stomach using silicone stents. The structures were maintained in vivo for 30 days after transplantation. Smooth muscle cells were maintained, and flat epithelium extended and covered the inner surface of the lumen. Food had also passed through the structure. These results suggest that esophagus-like scaffold-free tubular structures created using bio-3D printing could hold promise as a substitute for the repair of esophageal defects.

]]>
<![CDATA[Influence of donor liver telomere and G-tail on clinical outcome after living donor liver transplantation]]> https://www.researchpad.co/article/5c8acccbd5eed0c484990005

It has been reported that donor age affects patient outcomes after liver transplantation and that telomere length is associated with age. However, to our knowledge, the impact of donor age and donor liver telomere length in liver transplantation has not been well investigated. This study aimed to clarify the influence of the lengths of the telomere and G-tail from donor livers on the outcomes of living donors and recipients after living donor liver transplantation. The lengths of the telomere and G-tail derived from blood samples and liver tissues of 55 living donors were measured using the hybridization protection assay. Telomere length from blood samples was inversely correlated with age, whereas G-tail length from blood samples and telomere and G-tail lengths from liver tissues were not. Age, telomere length, and G-tail length from blood did not affect postoperative liver failure or early liver regeneration of donors. On the other hand, the longer the liver telomere, the poorer liver regeneration tended to be, with a significant difference in donors who underwent right hemihepatectomy. We found that the survival rate of recipients who received liver grafts with longer telomeres was inferior to that of recipients who received liver grafts with shorter ones. Older donor age, longer liver telomere length, and a higher Model for End-Stage Liver Disease score were identified as independent risk factors for recipient survival after transplantation. In conclusion, telomere shortening in healthy liver does not correlate with age, whereas longer liver telomeres negatively influence donor liver regeneration and recipient survival after living donor liver transplantation. These results can direct future studies and investigations on telomere shortening in the clinical and experimental transplant setting.

]]>
<![CDATA[Early predictors of outcomes of hospitalization for cirrhosis and assessment of the impact of race and ethnicity at safety-net hospitals]]> https://www.researchpad.co/article/5c897747d5eed0c4847d28e2

Background

Safety-net hospitals provide care for racially/ethnically diverse and disadvantaged urban populations. Their hospitalized patients with cirrhosis are relatively understudied and may be vulnerable to poor outcomes and racial/ethnic disparities.

Aims

To examine the outcomes of patients with cirrhosis hospitalized at regionally diverse safety-net hospitals and the impact of race/ethnicity.

Methods

A study of patients with cirrhosis hospitalized at 4 safety-net hospitals in 2012 was conducted. Demographic and clinical factors and outcomes were compared between centers and racial/ethnic groups. Study endpoints included mortality and 30-day readmission.

Results

In 2012, 733 of 1,212 patients with cirrhosis were hospitalized for liver-related indications (median age 55 years, 65% male). The cohort was racially diverse (43% White, 25% Black, 22% Hispanic, 3% Asian), with cirrhosis related to alcohol and viral hepatitis in 635 (87%) patients. Patients were hospitalized mainly for ascites (35%), hepatic encephalopathy (20%), and gastrointestinal bleeding (GIB) (17%). Fifty-four (7%) patients died during hospitalization, and 145 (21%) survivors were readmitted within 30 days. Mortality rates ranged from 4% to 15% by center (p = .007) and from 3% to 10% by race/ethnicity (p = .03), but 30-day readmission rates were similar. Mortality was associated with Model for End-stage Liver Disease (MELD) score, acute-on-chronic liver failure, hepatocellular carcinoma, sodium, and white blood cell count. Thirty-day readmission was associated with MELD and a Charlson Comorbidity Index >4, with lower risk for GIB. We did not observe geographic or racial/ethnic differences in hospital outcomes in the risk-adjusted analysis.

Conclusions

Hospital mortality and 30-day readmission in patients with cirrhosis at safety-net hospitals are associated with disease severity and comorbidities, with lower readmissions in patients admitted for GIB. Despite geographic and racial/ethnic differences in hospital mortality, these factors were not independently associated with mortality.

]]>
<![CDATA[Significant Hyperfibrinolysis in a Patient With Intracardiac Thrombosis: To Give Antifibrinolytics or Not?]]> https://www.researchpad.co/article/5c973e66d5eed0c48496b6ae

Abstract

The hemostatic system is a delicate balance between the coagulation, anticoagulation, and fibrinolytic systems and is responsible for preventing both hemorrhage and thrombosis. End-stage liver disease is characterized by a rebalanced hemostatic system that is fragile and easily tipped toward either hemorrhage or thrombosis. During an orthotopic liver transplantation, patients are exposed to a wide variety of factors that can shift them from a hypercoagulable state to a hypocoagulable state almost instantaneously. The treatments for these two states contradict each other, and patients in this condition can therefore be extremely difficult to manage. Here, we present a patient who underwent an orthotopic liver transplantation and suffered an intracardiac thrombosis shortly after reperfusion of the donor graft, which resolved with supportive care; the patient then went on to develop severe persistent hyperfibrinolysis and massive hemorrhage that was successfully treated with an antifibrinolytic agent.

]]>
<![CDATA[Early Persistent Progressive Acute Kidney Injury and Graft Failure Post Liver Transplantation]]> https://www.researchpad.co/article/5c973e62d5eed0c48496b642

Background

Acute kidney injury (AKI) in the setting of liver transplantation is a common and multifaceted complication. Studies in the general population have demonstrated worse prognosis with AKI episodes that persist for a longer duration. Our primary objective was to evaluate the impact of early AKI episodes that are persistent or progressive in nature on patient outcomes and graft survival.

Methods

This was a retrospective cohort study including all patients who received a liver transplant between 2011 and 2015 at our center. Moderate to severe AKI episodes (AKIN II or III) were recorded immediately before transplantation and after surgery until hospital discharge. We evaluated the incidence density rate (IDR) of graft failure and the time to graft failure in patients with persistent or progressive AKI (ppAKI) as compared to controls.

Results

Two hundred seventy-nine patients received 301 deceased donor liver allografts. Persistent or progressive AKI was documented in more than half of transplant cases (152/301). The rate of graft loss was 3 times higher in the ppAKI group (25%) than in the controls (8.7%). The IDR of graft failure was 13.79 per 100 case-years in the ppAKI group, compared with 3.79 per 100 case-years in the controls (IDR ratio, 3.64; 95% confidence interval, 1.88–7.50). After adjusting for hepatic artery thrombosis, ischemic cholangiopathy, infectious complications, and Model for End-stage Liver Disease score, ppAKI was associated with a decreased graft survival time.
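The incidence density rate arithmetic above can be sketched in a few lines (a minimal illustration using only the rates reported in this abstract; the function and variable names are ours, not the study's):

```python
def incidence_density_rate(events: int, case_years: float) -> float:
    """Incidence density rate expressed per 100 case-years of follow-up."""
    return 100.0 * events / case_years

# The abstract reports the rates but not the case-year denominators,
# so the ratio is reproduced directly from the published IDRs.
idr_ppaki = 13.79     # graft failures per 100 case-years, ppAKI group
idr_controls = 3.79   # graft failures per 100 case-years, controls

idr_ratio = idr_ppaki / idr_controls
print(round(idr_ratio, 2))  # 3.64, matching the reported IDR ratio
```

The same helper generalizes to any event count and follow-up time, e.g. 5 graft failures over 250 case-years gives 2.0 per 100 case-years.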

Conclusions

Persistent or progressive AKI after liver transplantation is associated with an increased incidence rate of graft failure and is an independent predictor of decreased graft survival time.

]]>
<![CDATA[Liver Transplantation for Recurrent Cholangitis From Von Meyenburg Complexes]]> https://www.researchpad.co/article/5c973e60d5eed0c48496b62e

Abstract

Von Meyenburg complexes, or multiple biliary hamartomas, are often asymptomatic lesions incidentally discovered during abdominal or hepatic imaging. The presentation of clinically significant Von Meyenburg complexes ranges from cholestasis and self-limited episodes of cholangitis to malignant degeneration into cholangiocarcinoma. In cases of persistent or recurrent cholangitis, treatment is a significant challenge. Definitive source control with liver transplantation, as in other cases of cholestatic liver disease, may be necessary.

]]>
<![CDATA[Long-term Outcome of Endoscopic and Percutaneous Transhepatic Approaches for Biliary Complications in Liver Transplant Recipients]]> https://www.researchpad.co/article/5c973e5bd5eed0c48496b58f

Background

Biliary complications occur in 6% to 34% of liver transplant recipients, for which endoscopic retrograde cholangiopancreatography has become widely accepted as the first-line therapy. We evaluated the long-term outcome of biliary complications in patients who underwent liver transplantation between 2004 and 2014 at Karolinska University Hospital, Stockholm.

Methods

Data were retrospectively collected, radiological images were analyzed for type of biliary complication, and graft and patient survivals were calculated.

Results

In 110 (18.5%) of 596 transplantations, there were a total of 153 cases of biliary complications: 68 (44.4%) anastomotic strictures, 43 (28.1%) nonanastomotic strictures, 24 (15.7%) bile leaks, 11 (7.2%) cases of stone- and/or sludge-related problems, and 7 (4.6%) cases of mixed biliary complications. Treatment success rates for these complications were 90%, 73%, 100%, 82%, and 80%, respectively. When the endoscopic approach was unsatisfactory or failed, percutaneous transhepatic cholangiography or a combination of treatments was often successful (in 18 of 24 cases). No procedure-related mortality was observed. Procedure-related complications were reported in 7.7% of endoscopic retrograde cholangiopancreatography and 3.8% of percutaneous transhepatic cholangiography procedures. Patient survival rates at 1, 3, 5, and 10 years posttransplant were 92.7%, 80%, 74.7%, and 54.1%, respectively, in patients with biliary complications, compared with 92%, 86.6%, 83.7%, and 72.8% in patients free from biliary complications (P < 0.01). Similarly, long-term graft survival was lower in the group experiencing biliary complications (P < 0.0001).

Conclusions

Endoscopic and percutaneous approaches for treating biliary complications are safe and efficient and should be considered complementary techniques. Despite a high treatment success rate, the occurrence of biliary complications still has a significant negative impact on long-term patient and graft survival.

]]>
<![CDATA[The Changing Face of Liver Transplantation in the United States: The Effect of HCV Antiviral Eras on Transplantation Trends and Outcomes]]> https://www.researchpad.co/article/5c973e59d5eed0c48496b55d

Background

Hepatitis C virus (HCV) cirrhosis is the leading indication for liver transplantation in the United States, although nonalcoholic steatohepatitis (NASH) is on the rise. Increasingly effective HCV antivirals are available, but their association with diagnosis-specific liver transplantation rates and early graft survival is not known.

Methods

The Scientific Registry of Transplant Recipients database records were retrospectively stratified by HCV antiviral era: interferon (2003-2010), protease inhibitors (2011-2013), and direct-acting antivirals (2014 to present). Kaplan-Meier, χ2, and multivariable Cox proportional hazards regression models evaluated the effects of antiviral era and etiology of liver disease on transplantation rates and graft survival over 3 years.

Results

Liver transplants for HCV decreased (35.3% to 23.6%), whereas those for NASH and alcoholic liver disease increased (5.8% to 16.5% and 15.6% to 24.0%) with each advancing era (all P < 0.05). Early graft survival improved with each advancing era for HCV but not for hepatitis B virus, NASH, or alcoholic liver disease (multivariable model era by diagnosis interaction P < 0.001). Era-specific multivariable models demonstrated that the risk of early graft loss for NASH was 22% lower than for HCV in the interferon era (hazard ratio, 0.78; 95% confidence interval, 0.64-0.96; P = 0.02) but risks associated with these diagnoses did not differ significantly in the protease inhibitor (P = 0.06) or direct-acting antiviral eras (P = 0.08).

Conclusions

Increasing effectiveness of HCV antivirals corresponds with decreased rates of liver transplantation for HCV and improved early graft survival. As the rates of liver transplantation for NASH continue to increase, focus will be needed on prevention and on effective therapies for this disease.

]]>
<![CDATA[Combining liver stiffness with hyaluronic acid provides superior prognostic performance in chronic hepatitis C]]> https://www.researchpad.co/article/5c6b26b4d5eed0c484289ee1

Background

Non-invasive methods are the first choice for liver fibrosis evaluation in chronic liver diseases, but few studies investigate the ability of combined methods to predict outcomes.

Methods

591 chronic hepatitis C patients with baseline liver stiffness measurements (LSM) by FibroScan and hyaluronic acid measurements were identified retrospectively. The patients were grouped by baseline LSM: <10 kPa, 10–16.9 kPa, and 17–75 kPa. Primary outcomes were all-cause mortality and liver-related mortality, analyzed using Cox regression and competing risk regression models, respectively.

Results

Median follow-up was 46.1 months. Prevalence of cirrhosis at baseline was 107/591 (18.1%). Median LSM was 6.8 kPa (IQR 5.3–11.6); 404/591 (68.4%) had an LSM <10 kPa, 100/591 (16.9%) had an LSM of 10–16.9 kPa, and 87/591 (14.7%) had an LSM of 17–75 kPa. There were 69 deaths, 27 from liver-related disease; 26 patients developed cirrhosis and 30 developed complications of cirrhosis. The mortality rate in the 17–75 kPa group was 9.7/100 person-years, compared to 2.2/100 person-years and 1.1/100 person-years in the 10–16.9 kPa and <10 kPa groups (p<0.005). Liver-related mortality increased 10-fold for each group (p<0.005). Cirrhotic complications occurred almost exclusively in the 17–75 kPa group, with an incidence of 10.3/100 person-years, compared to 1.8/100 person-years and 0.2/100 person-years in the 10–16.9 kPa and <10 kPa groups (p<0.005). Median hyaluronic acid in the 17–75 kPa group was approximately 200 ng/mL. Patients with an LSM of 17–75 kPa had significantly higher risks of death, liver-related death, and complications of cirrhosis if their baseline hyaluronic acid measurement was greater than or equal to 200 ng/mL, with hazard ratios of 3.25 (95% CI 1.48–7.25), 7.7 (95% CI 2.32–28), and 3.2 (95% CI 1.35–7.39), respectively.

Conclusions

The combination of LSM and circulating hyaluronic acid measurements significantly improved prognostic ability, relative to LSM alone. Combined static and dynamic markers of liver fibrosis could provide superior risk prediction.

]]>
<![CDATA[A new formula to calculate the resection limit in hepatectomy based on Gd-EOB-DTPA-enhanced magnetic resonance imaging]]> https://www.researchpad.co/article/5c644885d5eed0c484c2e84f

Background and aim

Dynamic magnetic resonance imaging with gadolinium-ethoxybenzyl-diethylenetriamine pentaacetic acid (EOB-MRI) can be used not only to detect liver tumors but also to estimate liver function. The aim of this study was to establish a new EOB-MRI-based formula to determine the resection limit in patients undergoing hepatectomy.

Methods

Twenty-eight patients with a normal liver (NL group) and five with an unresectable cirrhotic liver (UL group) who underwent EOB-MRI were included. Standardized liver function (SLF) was calculated based on the signal intensity (SI), the volume of each subsegment (S1–S8), and body surface area. A formula defining the resection limit was devised based on the difference in the SLF values of patients in the NL and UL groups. The formula was validated in 50 patients who underwent EOB-MRI and hepatectomy.

Results

The average SLF value in the NL and UL groups was 2038 and 962 FV/m2, respectively. The difference (1076 FV/m2) was taken to correspond to a 70% resection volume. Thus, the resection limit for hepatectomy was calculated as a proportion of 70%: 70×(SLF−962)/1076 (%). The one patient who underwent hepatectomy beyond the resection limit died of liver failure. In the other 49 patients, in whom the resection volume was less than the resection limit, the procedures were performed safely.
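The published formula can be expressed as a short function (a sketch only; the function name, parameter names, and defaults are ours, while the constants 70, 962, and 1076 come from the abstract):

```python
def resection_limit_percent(slf: float,
                            ul_mean: float = 962.0,
                            span: float = 1076.0) -> float:
    """Resection limit (%) from standardized liver function (SLF, FV/m2),
    per the study's formula 70 x (SLF - 962) / 1076."""
    return 70.0 * (slf - ul_mean) / span

# A liver with the normal-group average SLF of 2038 FV/m2 may have
# up to 70% of its volume resected under this formula:
print(round(resection_limit_percent(2038), 1))  # 70.0
```

By construction, an SLF equal to the unresectable-cirrhosis average (962 FV/m2) yields a resection limit of 0%.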

Conclusions

Our formula for resection limit based on EOB-MRI can improve the safety of hepatectomy.

]]>
<![CDATA[Cytomegalovirus viral load parameters associated with earlier initiation of pre-emptive therapy after solid organ transplantation]]> https://www.researchpad.co/article/5c6448bcd5eed0c484c2ecea

Background

Human cytomegalovirus (HCMV) can be managed by monitoring HCMV DNA in the blood and giving valganciclovir when viral load exceeds a defined value. We hypothesised that such pre-emptive therapy should occur earlier than the standard 3000 genomes/ml (2520 IU/ml) when a seropositive donor transmitted virus to a seronegative recipient (D+R-) following solid organ transplantation (SOT).

Methods

Our local protocol was changed so that D+R- SOT patients commenced valganciclovir once the viral load exceeded 200 genomes/ml; 168 IU/ml (new protocol). The decision point remained at 3000 genomes/ml (old protocol) for the other two patient subgroups (D+R+, D-R+). Virological outcomes were assessed three years later, when 74 D+R- patients treated under the old protocol could be compared with 67 treated afterwards. The primary outcomes were changes in peak viral load, duration of viraemia and duration of treatment in the D+R- group. The secondary outcome was the proportion of D+R- patients who developed subsequent viraemia episodes.

Findings

In the D+R- patients, the median peak viral load was significantly reduced on the new protocol compared with the old (30,774 to 11,135 genomes/ml, p = 0.0215), but the duration of viraemia and duration of treatment were not. Early treatment increased subsequent episodes of viraemia from 33/58 (57%) to 36/49 (73%) of patients (p = 0.0743), with a significant increase (p = 0.0072) in those episodes that required treatment (16/58; 27% versus 26/49; 53%). Median peak viral load increased significantly (2,103 to 3,934 genomes/ml, p = 0.0249) in the D+R+ but not in the D-R+ patient subgroup. There was no change in duration of viraemia or duration of treatment for any patient subgroup.

Interpretation

Pre-emptive therapy initiated at the first sign of viraemia post-transplant significantly reduced the peak viral load but increased later episodes of viraemia, consistent with the hypothesis of reduced antigenic stimulation of the immune system.

]]>
<![CDATA[Devising focused strategies to improve organ donor registrations: A cross-sectional study among professional drivers in coastal South India]]> https://www.researchpad.co/article/5c26973fd5eed0c48470f06b

Background

In India, annually, 500,000 people die due to non-availability of organs. Given the large proportion of brain death amongst road accident victims, any improvement in organ donation practices amongst this cohort could potentially address this deficit. In this study, we identify the potential areas for intervention to improve organ donation amongst professional drivers, a population more likely to suffer from road accidents.

Methods

300 participants were surveyed using a structured, orally-administered questionnaire to assess knowledge, attitudes and practices regarding organ donation. Multivariate analysis was performed to identify key variables affecting intent to practice.

Results

Nearly half of our participants had unsatisfactory knowledge and attitude scores. Knowledge and attitude were positively correlated, rs(298) = .247, p < .001, with better scores associated with a higher likelihood of intent to practice organ donation [AOR: 2.23 (1.26–3.94), p = .006; AOR: 12.164 (6.85–21.59), p < .001, respectively]. Lack of family support and fear of donated organs going into medical research were the key barriers [AOR: 0.43 (0.19–0.97), p = .04; AOR: 0.27 (0.09–0.85), p = .02, respectively].

Conclusion

Targeted health-education, behaviour change communication, and legal interventions, in conjunction, are key to improving organ donor registrations.

]]>
<![CDATA[High seroprevalence of Strongyloides stercoralis among individuals from endemic areas considered for solid organ transplant donation: A retrospective serum-bank based study]]> https://www.researchpad.co/article/5c09940cd5eed0c4842ae1af

Background

Strongyloides stercoralis is a worldwide disseminated parasitic disease that can be transmitted from solid organ transplant (SOT) donors to recipients. We determined the serological prevalence of S. stercoralis among deceased individuals from endemic areas considered for SOT donation, using our institution’s serum bank.

Methodology

Retrospective study including all deceased potential donors from endemic areas of strongyloidiasis considered for SOT between January 2004 and December 2014 in a tertiary care hospital. The commercial serological test IVD-Elisa was used to determine the serological prevalence of S. stercoralis.

Principal findings

Among 1025 deceased individuals during the study period, 90 were from endemic areas of strongyloidiasis. Serum samples were available for 65 patients, and 6 of them tested positive for S. stercoralis (9.23%). Only one of the deceased candidates ultimately became a donor, and did not transmit the infection.

Conclusions

Among deceased individuals from endemic areas considered for SOT donation, the seroprevalence of strongyloidiasis was high. This highlights the importance of adhering to current recommendations on screening for S. stercoralis among potential SOT donors at high risk of infection, together with the need to develop a rapid diagnostic test to fully implement these screening strategies.

]]>
<![CDATA[Value of Bone Scans in Work-up of Patients With Hepatocellular Carcinoma for Liver Transplant]]> https://www.researchpad.co/article/5c2a7768d5eed0c4842262fc

Background

The purpose of this study was to review the value of bone scans (BS) in the assessment of bone metastases from early-stage hepatocellular carcinoma (HCC) in patients assessed or waiting for liver transplant (LTx).

Methods

We reviewed BS studies performed at our center for patients with early-stage HCC either being assessed for LTx, or on the waiting list for LTx, from January 2010 to May 2017. The BS findings were classified as positive, equivocal, or negative. Correlation with final outcome based on clinical and radiological follow-up was performed.

Results

There were 360 BS performed in 186 patients during the study period; the mean patient age was 58.7 years (range, 34.9-70.4 years), and most patients were male (161/186 [86.6%]). None of the BS resulted in delisting of patients from the LTx waiting list. Three BS were reported as positive for metastases; all 3 were proven to be false positives on follow-up. Fourteen studies reported equivocal findings, none of which were confirmed to be metastases on follow-up. There was 1 false-negative BS: a bone metastasis was detected incidentally on magnetic resonance imaging and proven on biopsy.

Conclusions

We have demonstrated that the diagnostic yield of BS in early HCC patients who are candidates for LTx is minimal, challenging the current inclusion of BS in guidelines for staging these HCC patients.

]]>
<![CDATA[Plasma donor-derived cell-free DNA kinetics after kidney transplantation using a single tube multiplex PCR assay]]> https://www.researchpad.co/article/5c12cf5ad5eed0c484914451

Background

After transplantation, cell-free DNA derived from the donor organ (ddcfDNA) can be detected in the recipient’s circulation. We aimed to quantify ddcfDNA levels in plasma of kidney transplant recipients thereby investigating the kinetics of this biomarker after transplantation and determining biological variables that influence ddcfDNA kinetics in stable and non-stable patients.

Materials and methods

From 107 kidney transplant recipients, plasma samples were collected longitudinally after transplantation (day 1–3 months) within a multicenter set-up. Cell-free DNA from the donor was quantified in plasma as a fraction of the total cell-free DNA by next generation sequencing using a targeted, multiplex PCR-based method for the analysis of single nucleotide polymorphisms. A subgroup of stable renal transplant recipients was identified to determine a ddcfDNA threshold value.

Results

In stable transplant recipients, plasma ddcfDNA% decreased to a mean (SD) of 0.46% (± 0.21%), which was reached 9.85 (± 5.6) days after transplantation. A ddcfDNA threshold value of 0.88% (mean + 2 SD) was determined in kidney transplant recipients. Recipients who did not reach this threshold within 10 days after transplantation showed a higher ddcfDNA% on the first day after transplantation and a higher individual baseline ddcfDNA%.
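The threshold derivation is a direct computation from the subgroup statistics reported above; a minimal sketch:

```python
# Summary statistics reported for the stable-recipient subgroup
mean_ddcfdna = 0.46   # mean ddcfDNA fraction, %
sd_ddcfdna = 0.21     # standard deviation, %

# Threshold defined in the study as mean + 2 SD
threshold = mean_ddcfdna + 2 * sd_ddcfdna
print(f"ddcfDNA threshold = {threshold:.2f}%")  # → ddcfDNA threshold = 0.88%
```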

Conclusion

Plasma ddcfDNA fractions decreased exponentially within 10 days after transplantation to a threshold value of 0.88% or less. To investigate the role of ddcfDNA in rejection monitoring of the graft, future research is needed to determine the causes of ddcfDNA% increases above this threshold value.

]]>
<![CDATA[Development of a Predictive Model for Hyperglycemia in Nondiabetic Recipients After Liver Transplantation]]> https://www.researchpad.co/article/5c15fb2fd5eed0c4842f295e

Background

Posttransplant hyperglycemia has been associated with increased risks of transplant rejection and infection, longer length of stay, and higher mortality.

Methods

To establish a predictive model identifying nondiabetic recipients at risk of developing post–liver transplant (LT) hyperglycemia, we performed this secondary, retrospective analysis of data from a single-center, prospective, randomized, controlled trial of inpatient glycemic control among 107 adult LT recipients. Hyperglycemia was defined as a posttransplant glucose level greater than 200 mg/dL after initial discharge, up to 1 month following surgery. Candidate variables with P less than 0.10 in univariate analyses were used to build a multivariable logistic regression model using forward stepwise selection. The final model was chosen based on statistical significance and each variable's additive contribution to the model according to the Bayesian information criterion (BIC).
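The BIC comparison underlying this kind of forward stepwise selection can be sketched as follows; the log-likelihood values below are hypothetical illustrations, not fitted to the study data (only the sample size, n = 107, comes from the abstract):

```python
import math

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    """Bayesian information criterion: k*ln(n) - 2*ln(L); lower is better."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# Hypothetical log-likelihoods for nested candidate models (n = 107 recipients)
candidates = {
    "intercept + length of stay":  (-68.0, 2),
    "... + glucose-lowering meds": (-63.5, 3),
    "... + donor female sex":      (-60.1, 4),
}

scores = {name: bic(ll, k, 107) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
print(best)
```

At each step of forward selection, a variable is retained only if it lowers the BIC enough to offset the `k*ln(n)` penalty for the extra parameter.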

Results

Forty-three patients (40.2%) had at least 1 episode of hyperglycemia after resolution of the initial postoperative hyperglycemia. Variables selected for inclusion in the model (using model optimization strategies) were length of hospital stay (odds ratio [OR], 0.83; P < 0.001), use of glucose-lowering medications at discharge (OR, 3.76; P = 0.03), donor female sex (OR, 3.18; P = 0.02), and donor white race (OR, 3.62; P = 0.01). The model had good calibration (Hosmer-Lemeshow goodness-of-fit statistic = 9.74, P = 0.28) and discrimination (C-statistic = 0.78; 95% confidence interval, 0.65-0.81; bias-corrected C-statistic = 0.78).

Conclusions

Shorter hospital stay, use of glucose-lowering medications at discharge, donor female sex, and donor white race are important predictors of hyperglycemia in nondiabetic recipients from hospital discharge up to 1 month after liver transplantation.

]]>
<![CDATA[Bile Acids and Dysbiosis in Non-Alcoholic Fatty Liver Disease]]> https://www.researchpad.co/article/5989d9f6ab0ee8fa60b701f0

Background & Aims

Non-alcoholic fatty liver disease (NAFLD) is characterized by dysbiosis. The bidirectional interactions between the intestinal microbiota (IM) and bile acids (BA) suggest that dysbiosis may be accompanied by altered BA homeostasis, which in turn can contribute to the metabolic dysregulation seen in NAFLD. This study sought to examine BA homeostasis in patients with NAFLD and to relate it to IM data.

Methods

This was a prospective, cross-sectional study of adults with biopsy-confirmed NAFLD (non-alcoholic fatty liver [NAFL] or non-alcoholic steatohepatitis [NASH]) and healthy controls (HC). Clinical and laboratory data, stool samples, and 7-day food records were collected. Fecal BA profiles, serum markers of BA synthesis (7-alpha-hydroxy-4-cholesten-3-one; C4) and of intestinal BA signaling, as well as IM composition, were assessed.

Results

Fifty-three subjects were included: 25 HC, 12 NAFL, and 16 NASH. Levels of total fecal BA, cholic acid (CA), and chenodeoxycholic acid (CDCA), as well as BA synthesis, were higher in patients with NASH than in HC (p<0.05 for all comparisons). The ratio of primary to secondary BA was higher in NASH than in HC (p = 0.004), but the ratio of conjugated to unconjugated BA did not differ between the groups. Bacteroidetes and Clostridium leptum counts were decreased in a subset of 16 patients with NASH compared with 25 HC, after adjusting for body mass index and weight-adjusted calorie intake (p = 0.028 and p = 0.030, respectively). C. leptum correlated positively with fecal unconjugated lithocholic acid (LCA) (r = 0.526, p = 0.003) and inversely with unconjugated CA (r = -0.669, p<0.0001) and unconjugated CDCA (r = -0.630, p<0.0001). FGF19 levels did not differ between the groups (p = 0.114).

Conclusions

In adults with NAFLD, dysbiosis is associated with altered BA homeostasis, which may place these patients at increased risk of hepatic injury.

]]>