ResearchPad - transfusion-medicine https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Prevalence of anti-hepatitis E virus IgG antibodies in sera from hemodialysis patients in Tripoli, Lebanon]]> https://www.researchpad.co/article/elastic_article_15713 Hepatitis E virus (HEV) is an important global public health concern. Several studies have reported a higher HEV prevalence in patients undergoing regular hemodialysis (HD). In Lebanon, the epidemiology of HEV among HD patients has never previously been investigated. In this study, we examined the seroprevalence of HEV infection among 171 HD patients recruited from three hospital dialysis units in Tripoli, North Lebanon. The prevalence of anti-HEV IgG antibodies was evaluated in participants' sera using a commercial enzyme-linked immunosorbent assay (ELISA). The association of socio-demographic and clinical parameters with HEV infection was also evaluated. Overall, 96 women and 75 men were enrolled in this study. Anti-HEV IgG antibodies were detected in 37/171 HD patients, a seropositivity rate of 21.63%. Among all examined variables, only patient age was significantly associated with seropositivity (P = 0.001). This first epidemiological study reveals a high seroprevalence of HEV infection among Lebanese HD patients. However, further evaluations that enroll larger samples and include control groups are required to identify the factors underlying this high seropositivity rate.
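As an illustrative aside (not part of the original study), the reported seroprevalence can be recomputed from the raw counts together with an approximate 95% confidence interval; the Wilson score interval below is one standard choice for a binomial proportion:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 37 anti-HEV IgG positive sera out of 171 hemodialysis patients
k, n = 37, 171
prevalence = k / n
lo, hi = wilson_ci(k, n)
print(f"prevalence = {prevalence:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
# → prevalence = 21.6%, 95% CI 16.1%-28.4%
```

The interval (roughly 16% to 28%) gives a sense of the uncertainty around the 21.63% point estimate in a sample of this size.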

]]>
<![CDATA[Single-center retrospective study of the effectiveness and toxicity of the oral iron chelating drugs deferiprone and deferasirox]]> https://www.researchpad.co/article/5c9900fdd5eed0c484b95e7f

Background

Iron overload, resulting from blood transfusions in patients with chronic anemias, has historically been controlled with regular deferoxamine, but its parenteral requirement encouraged studies of orally-active agents, including deferasirox and deferiprone. Deferasirox, licensed by the US Food and Drug Administration in 2005 based upon the results of randomized controlled trials, is now first-line therapy worldwide. In contrast, early investigator-initiated trials of deferiprone were prematurely terminated after investigators raised safety concerns. The FDA declined market approval of deferiprone; years later, it licensed the drug as “last resort” therapy, to be prescribed only if first-line drugs had failed. We undertook to evaluate the long-term effectiveness and toxicities of deferiprone and deferasirox in one transfusion clinic.

Methods and findings

Under an IRB-approved study, we retrospectively inspected the electronic medical records of consented iron-loaded patients managed between 2009 and 2015 at The University Health Network (UHN), Toronto. We compared changes in liver and heart iron, adverse effects and other outcomes, in patients treated with deferiprone or deferasirox.

Results

Although deferiprone was unlicensed in Canada, one-third (n = 41) of locally-transfused patients had been switched from first-line, licensed therapies (deferoxamine or deferasirox) to regimens of unlicensed deferiprone. The primary endpoint of monitoring in iron overload, hepatic iron concentration (HIC), increased (worsened) during deferiprone monotherapy (mean 10±2 to 18±2 mg/g; p < 0.0003), exceeding the threshold for life-threatening complications (15 mg iron/g liver) in 50% of patients. During deferasirox monotherapy, mean HIC decreased (improved) (11±1 to 6±1 mg/g; p < 0.0001). Follow-up HICs were significantly different following deferiprone and deferasirox monotherapies (p < 0.0000002). Addition of low-dose deferoxamine (<40 mg/kg/day) to deferiprone did not result in reduction of HIC to <15 mg/g (baseline 20±4 mg/g; follow-up 18±4 mg/g; p < 0.2) or in reduction of the proportion of patients with HIC exceeding 15 mg/g (p < 0.2). During deferiprone exposure, new diabetes mellitus, a recognized consequence of inadequate iron control, was diagnosed in 17% of patients, most of whom had sustained HICs exceeding 15 mg/g for years; one woman died after 13 months of a regimen of deferiprone and low-dose deferasirox. During deferiprone exposure, serum ALT increased over baseline in 65% of patients. Mean serum ALT increased 6.6-fold (p < 0.001), often persisting for years. During deferasirox exposure, mean ALT was unchanged (p < 0.84). No significant differences between treatment groups were observed in the proportions of patients estimated to have elevated cardiac iron.

Conclusions

Deferiprone was ineffective and showed significant toxicity in most patients. Combination with low doses of first-line therapies did not improve its effectiveness. Exposure to deferiprone over six years while the drug was unlicensed, in the face of ineffectiveness and serious toxicities, demands review of the standards of local medical practice. The limited scope of regulatory approval of deferiprone worldwide should restrict its exposure to the few patients genuinely unable to tolerate the two effective, first-line therapies.

]]>
<![CDATA[Cis-AB, the Blood Group of Many Faces, Is a Conundrum to the Novice Eye]]> https://www.researchpad.co/article/5c8ef0d0d5eed0c484f040fb

Cis-AB, a rare ABO variant, is caused by a gene mutation that results in a single glycosyltransferase enzyme with dual A and B glycosyltransferase activities. It is the most frequent ABO subgroup in Korea, and it occurs more frequently in the East Asian region than in the rest of the world. The typical phenotype of cis-AB is A2B3, but it can express various phenotypes when paired with an A or B allele, which can lead to misclassification in the ABO grouping and consequently to adverse hemolytic transfusion reactions. While cis-AB was first discovered as having an unusual inheritance pattern, it was later found that both A and B antigens are expressed from the same allele inherited from a single parent; hence, the name cis-AB. Earlier studies relied on serological and familial investigation of cis-AB subjects, but its detection has become much easier with the introduction of molecular methods. This review will summarize the serological variety, genetic basis and inheritance pattern, laboratory methods of investigation, clinical significance, and the blood type of choice for transfusion for the cis-AB blood group.

]]>
<![CDATA[Red Blood Cell Alloimmunization in Korean Patients With Myelodysplastic Syndrome and Liver Cirrhosis]]> https://www.researchpad.co/article/5c8ef0edd5eed0c484f043fc

Red blood cell (RBC) alloimmunization varies across human populations and ethnic groups. We evaluated the characteristics of RBC alloimmunization and compared the risk of alloimmunization in Korean patients with myelodysplastic syndrome (MDS) and liver cirrhosis (LC), two representative diseases in which chronic transfusion is required. In total, 115 MDS patients and 202 LC patients transfused with RBCs between 2013 and 2015 were retrospectively included. Twenty patients (6.3%) were newly alloimmunized (five MDS patients, 4.3%; 15 LC patients, 7.4%). The median number of RBC units transfused in alloimmunized patients was nine (interquartile range, 4–15 units). As the number of transfused RBC units increased, the cumulative risk of alloimmunization was higher in LC than in MDS patients (P=0.001). The most common alloantibody detected in patients was anti-E (45%), followed by anti-c (17%), anti-e (10%), anti-C (7%), anti-Fyb (7%), and anti-Jka (7%). The present data indicate the need for matching of extended RBC antigens (Rh, Duffy, and Kidd systems) for chronically transfused patients with MDS and LC in Korea.

]]>
<![CDATA[Platelet count abnormalities and peri-operative outcomes in adults undergoing elective, non-cardiac surgery]]> https://www.researchpad.co/article/5c6b2682d5eed0c484289bce

Background

Anemia and transfusion of blood in the peri-operative period have been shown to be associated with increased morbidity and mortality across a wide variety of non-cardiac surgeries. While tests of coagulation, including the platelet count, have frequently been used to identify patients with an increased risk of peri-operative bleeding, results have been equivocal. The aim of this study was to assess the effect of platelet level on outcomes in patients undergoing elective surgery.

Materials and methods

A retrospective cohort analysis of prospectively collected clinical data from the American College of Surgeons National Surgical Quality Improvement Program (NSQIP), 2006–2016.

Results

We identified 3,884,400 adult patients who underwent elective, non-cardiac surgery from 2006 to 2016 at hospitals participating in NSQIP, a prospectively collected, national clinical database with established reproducibility and validity. After controlling for all peri- and intraoperative factors by matching on propensity scores, patients with any level of thrombocytopenia or thrombocytosis had higher odds of perioperative transfusion. All levels of thrombocytopenia were associated with higher mortality, but there was no association with complications or other morbidity after matching. In contrast, thrombocytosis was not associated with mortality, but the odds of postoperative complications and 30-day return to the operating room remained slightly increased after matching.

Conclusions

These findings may guide surgeons in the appropriate use and interpretation of pre-operative platelet count screening before elective, non-cardiac surgery.

]]>
<![CDATA[Percutaneous nephrolithotomy versus open surgery for surgical treatment of patients with staghorn stones: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c5ca31ad5eed0c48441f16c

Objectives

To compare the efficacy and safety of percutaneous nephrolithotomy (PCNL) and open surgery (OS) for the surgical treatment of patients with staghorn stones, based on the published literature.

Materials and methods

A comprehensive literature search of PubMed, Embase, CNKI, and the Cochrane Library was conducted to identify studies comparing outcomes of PCNL and OS for treating patients with staghorn stones up to January 2018.

Results

There was no significant difference in the final stone-free rate (SFR) between PCNL and OS (odds ratio [OR]: 1.17; 95% confidence interval [CI]: 0.64, 2.15; p = 0.61), while PCNL provided a significantly lower immediate SFR than OS (OR: 0.29; 95% CI: 0.16, 0.51; p < 0.0001). PCNL provided a significantly lower overall complication rate, shorter operative time, shorter hospitalization, less blood loss, and less blood transfusion compared with OS (OR: 0.59; 95% CI: 0.41, 0.84; p = 0.004), (weighted mean difference [WMD]: -59.01 min; 95% CI: -81.09, -36.93; p < 0.00001), (WMD: -5.77 days; 95% CI: -7.80, -3.74; p < 0.00001), (WMD: -138.29 mL; 95% CI: -244.98, -31.6; p = 0.01), and (OR: 0.44; 95% CI: 0.29, 0.68; p = 0.00002), respectively. No significant differences were found in minor complications (Clavien I–II) (OR: 0.72; 95% CI: 0.47, 1.09; p = 0.12) or major complications (Clavien III–V) (OR: 0.5; 95% CI: 0.23, 1.08; p = 0.08). In subgroup analysis, there were no significant differences in overall complications or operative time between mini-PCNL and OS. In sensitivity analysis, there was no significant difference in overall complications between PCNL and OS.
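As an illustrative aside (not from the paper itself), a reported odds ratio and its 95% confidence interval imply a two-sided p-value under the standard meta-analytic assumption of normality on the log scale; the sketch below back-calculates it for the overall complication estimate as a consistency check:

```python
import math

def p_from_or_ci(or_, lo, hi):
    """Two-sided p-value implied by an odds ratio and its 95% CI,
    assuming the log odds ratio is normally distributed."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE on the log scale
    z = math.log(or_) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail

# Overall complication rate, PCNL vs OS: OR 0.59 (95% CI 0.41-0.84)
p = p_from_or_ci(0.59, 0.41, 0.84)
print(f"implied p = {p:.4f}")  # close to the reported p = 0.004
```

The same check can be applied to any of the pooled estimates above; small discrepancies are expected from rounding of the reported OR and CI.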

Conclusion

Our analysis suggests that standard PCNL is a safe and feasible alternative to OS or mini-PCNL for patients with staghorn stones. Because of the inherent limitations of the included studies, further large-sample, prospective, multicenter, randomized controlled trials should be undertaken to confirm our findings.

]]>
<![CDATA[Decreased total iron binding capacity upon intensive care unit admission predicts red blood cell transfusion in critically ill patients]]> https://www.researchpad.co/article/5c521820d5eed0c484797475

Introduction

Red blood cell (RBC) transfusion is associated with poor clinical outcomes in critically ill patients. We investigated the predictive value of biomarkers measured upon intensive care unit (ICU) admission for RBC transfusion within 28 days.

Methods

Critically ill patients (n = 175) admitted to our ICU with organ dysfunction and an expected stay of ≥ 48 hours, without hemorrhage, were prospectively studied (derivation cohort, n = 121; validation cohort, n = 54). Serum levels of 12 biomarkers (hemoglobin, creatinine, albumin, interleukin-6 [IL-6], erythropoietin, Fe, total iron binding capacity [TIBC], transferrin, ferritin, transferrin saturation, folate, and vitamin B12) were measured upon ICU admission and on days 7, 14, 21, and 28.

Results

Among the 12 biomarkers measured upon ICU admission, levels of hemoglobin, albumin, IL-6, TIBC, transferrin, and ferritin were statistically different between the transfusion and non-transfusion groups. Of these 6 biomarkers, TIBC upon ICU admission had the highest area under the curve (0.835; 95% confidence interval, 0.765–0.906) for predicting RBC transfusion (cut-off value = 234.5 μg/dL; sensitivity = 0.906; specificity = 0.632). This result was confirmed in the validation cohort, in which sensitivity and specificity were 0.888 and 0.694, respectively. Measurement of these biomarkers every seven days revealed that albumin, TIBC, and transferrin remained statistically different between groups throughout hospitalization up to 28 days. In the validation cohort, patients in the transfusion group had significantly higher serum hepcidin levels than those in the non-transfusion group (P = 0.004). In addition, joint analysis across the derivation and validation cohorts revealed that serum IL-6 levels were higher in the transfusion group (P = 0.0014).

Conclusion

Decreased TIBC upon ICU admission has high predictive value for RBC transfusion unrelated to hemorrhage within 28 days.

]]>
<![CDATA[Adherence to clinical guidelines is associated with reduced inpatient mortality among children with severe anemia in Ugandan hospitals]]> https://www.researchpad.co/article/5c644881d5eed0c484c2e808

Background

In resource-limited settings, there is variability in the level of adherence to clinical guidelines in the inpatient management of children with common conditions like severe anemia. However, there are limited data on the effect of adherence to clinical guidelines on inpatient mortality in children managed for severe anemia.

Methods

We analyzed data from an uncontrolled before-and-after in-service training intervention to improve quality of care in Lira and Jinja regional referral hospitals in Uganda. Inpatient records of children aged 0 to 5 years managed as cases of ‘severe anemia (SA)’ were reviewed to ascertain adherence to clinical guidelines and to compare inpatient deaths in SA children managed according to clinical guidelines versus those who were not. Logistic regression analysis was conducted to evaluate the relationship between clinical care factors and inpatient deaths among patients managed for SA.

Results

A total of 1,131 children were assigned a clinical diagnosis of ‘severe anemia’ in the two hospitals. The level of care improved after the in-service training intervention, with more children being managed according to clinical guidelines than in the period before: 218/510 (42.7%) vs 158/621 (25.4%) (p < 0.001). Overall, children managed according to clinical guidelines had a reduced risk of inpatient mortality compared to those not managed according to clinical guidelines [OR 0.28 (95% CI 0.14, 0.55), p = 0.001]. Clinical care factors associated with a decreased risk of inpatient death included having pre-transfusion hemoglobin measured to confirm the diagnosis [OR 0.5; 95% CI 0.29, 0.87], a co-morbid diagnosis of severe malaria [OR 0.4; 95% CI 0.25, 0.76], and being reviewed after admission by a clinician [OR 0.3; 95% CI 0.18, 0.59], while a co-morbid diagnosis of severe acute malnutrition was associated with an increased risk of inpatient death [OR 4.2; 95% CI 2.15, 8.22].

Conclusion

Children with suspected SA who are managed according to clinical guidelines have lower in-hospital mortality than those not managed according to the guidelines. Efforts to reduce inpatient mortality in SA children in resource-limited settings should focus on training and supporting health workers to adhere to clinical guidelines.

]]>
<![CDATA[Trauma induced acute kidney injury]]> https://www.researchpad.co/article/5c57e693d5eed0c484ef37fb

Background

Injured patients are at risk of developing acute kidney injury (AKI), which is associated with increased morbidity and mortality. The aim of this study is to describe the incidence, timing, and severity of AKI in a large trauma population, identify risk factors for AKI, and report mortality outcomes.

Methods

A prospective observational study of injured adults who met local criteria for trauma team activation and were admitted to a UK Major Trauma Centre. AKI was defined by the Kidney Disease Improving Global Outcomes (KDIGO) criteria. Multivariable logistic regression and Cox proportional hazards modelling were used to analyse parameters associated with AKI and mortality.

Results

Of the 1410 patients enrolled in the study, 178 (12.6%) developed AKI. Age; injury severity score (ISS); admission systolic blood pressure, lactate, and serum creatinine; units of packed red blood cells transfused in the first 24 hours; and administration of nephrotoxic therapy were identified as independent risk factors for the development of AKI. Patients who developed AKI had significantly higher mortality than those with normal renal function (47/178 [26.4%] versus 128/1232 [10.4%]; OR 3.09 [2.12 to 4.53]; p < 0.0001). After adjusting for other clinical prognostic factors, AKI was an independent risk factor for mortality.

Conclusions

AKI is a frequent complication following trauma and is associated with prolonged hospital length of stay and increased mortality. Future research is needed to improve our ability to rapidly identify those at risk of AKI, and develop resuscitation strategies that preserve renal function in trauma patients.

]]>
<![CDATA[Community perceptions of paediatric severe anaemia in Uganda]]> https://www.researchpad.co/article/5c37b796d5eed0c48449058b

Background

Severe anaemia remains a major cause of morbidity and mortality among children in sub-Saharan Africa. There is limited research on the beliefs and knowledge for paediatric severe anaemia in the region. The effect of these local beliefs and knowledge on the healthcare seeking of paediatric severe anaemia remains unknown.

Objective

To describe community perceptions of paediatric severe anaemia in Uganda.

Methods

Sixteen in-depth interviews of caregivers of children treated for severe anaemia and six focus group discussions of community members were conducted in three regions of Uganda between October and November 2017.

Results

There was no common local name used to describe paediatric severe anaemia, but the disease was understood in context as ‘having no blood’. Severe anaemia was identified as a serious disease, and the majority felt blood transfusion was the ideal treatment, but concomitant use of traditional and home remedies was also widespread. Participants articulated signs of severe paediatric anaemia, such as palmar, conjunctival, and tongue pallor. Other signs described included jaundice, splenomegaly, difficulty in breathing, and poor appetite. Poor feeding, malaria, splenomegaly, and evil spirits were perceived to be the common causes of severe anaemia. Other causes included human immunodeficiency virus (HIV), haemoglobinuria, fever, witchcraft, mosquito bites, and sickle cell disease. Splenomegaly and jaundice were perceived to be both signs and causes of severe anaemia. Severe anaemia was interpreted as being caused by evil spirits if it was recurrent, led to sudden death, or manifested with cold extremities.

Conclusion

The community in Uganda perceived paediatric severe anaemia as a serious disease. Their understanding of the signs and perceived causes of severe anaemia to a large extent aligned with known clinical signs and biological causes. Belief in evil spirits persists and may be one obstacle to seeking timely medical care for paediatric severe anaemia.

]]>
<![CDATA[Red blood cell phenotype fidelity following glycerol cryopreservation optimized for research purposes]]> https://www.researchpad.co/article/5c26977bd5eed0c48470fa8f

Intact red blood cells (RBCs) are required for phenotypic analyses. To allow separation in time and location between subject encounter and sample analysis, we developed a research-specific RBC cryopreservation protocol and assessed its impact on data fidelity for key biochemical and physiological assays. RBCs drawn from healthy volunteers were aliquoted for immediate analysis or following glycerol-based cryopreservation, thawing, and deglycerolization. RBC phenotype was assessed by (1) scanning electron microscopy (SEM) imaging and standard morphometric RBC indices, (2) osmotic fragility, (3) deformability, (4) endothelial adhesion, (5) oxygen (O2) affinity, (6) ability to regulate hypoxic vasodilation, (7) nitric oxide (NO) content, (8) metabolomic phenotyping (at steady state, tracing with [1,2,3-13C3]glucose ± oxidative challenge with superoxide thermal source; SOTS-1), as well as in vivo quantification (following human to mouse RBC xenotransfusion) of (9) blood oxygenation content mapping and flow dynamics (velocity and adhesion). Our revised glycerolization protocol (40% v/v final) resulted in >98.5% RBC recovery following freezing (-80°C) and thawing (37°C), with no difference compared to the standard reported method (40% w/v final). Full deglycerolization (>99.9% glycerol removal) of 40% v/v final samples resulted in total cumulative lysis of ~8%, compared to ~12–15% with the standard method. The post-cryopreservation/deglycerolization RBC phenotype was indistinguishable from that of fresh RBCs with regard to physical RBC parameters (morphology, volume, and density), osmotic fragility, deformability, endothelial adhesivity, O2 affinity, vasoregulation, metabolomics, and flow dynamics. These results indicate that RBC cryopreservation/deglycerolization in 40% v/v glycerol final does not significantly impact RBC phenotype (compared to fresh cells).
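The distinction between the 40% v/v and 40% w/v formulations above is easy to miss: because glycerol is denser than water (≈1.26 g/mL at room temperature), 40% v/v contains substantially more glycerol per unit volume than 40% w/v. A small illustrative conversion (not from the paper; the density value is a general physical constant, not a study parameter):

```python
GLYCEROL_DENSITY = 1.26  # g/mL at room temperature (approximate)

def vv_to_wv(percent_vv, density=GLYCEROL_DENSITY):
    """Convert % v/v (mL solute per 100 mL solution) to % w/v (g per 100 mL)."""
    return percent_vv * density

def wv_to_vv(percent_wv, density=GLYCEROL_DENSITY):
    """Convert % w/v (g solute per 100 mL solution) to % v/v (mL per 100 mL)."""
    return percent_wv / density

print(f"40% v/v ≈ {vv_to_wv(40):.1f}% w/v")  # ≈ 50.4% w/v
print(f"40% w/v ≈ {wv_to_vv(40):.1f}% v/v")  # ≈ 31.7% v/v
```

So the revised 40% v/v protocol delivers roughly a quarter more glycerol per unit volume than the standard 40% w/v method, which is relevant when comparing lysis rates between the two.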

]]>
<![CDATA[Prevalence of Hepatitis B Virus (HBV) Among Blood Donors in Eastern Saudi Arabia: Results From a Five-Year Retrospective Study of HBV Seromarkers]]> https://www.researchpad.co/article/5c354b1ed5eed0c484dc38ad

Background

Transfusion-transmissible hepatitis B virus (HBV) infection is a major problem worldwide. Recently, confirmatory nucleic acid tests (NATs) for HBV DNA have been employed in several countries. We assessed the prevalence and yearly trends of HBV infection in blood donors in the Eastern Province of Saudi Arabia, screening for HBV surface antigen (HBsAg), antibody against HBV core antigen (anti-HBc), and HBV DNA.

Methods

Between 2011 and 2015, a total of 22,842 donors were screened for HBsAg, anti-HBc, and HBV DNA using the HBsAg Qualitative II kit (Abbott, Ireland Diagnostics Division, Sligo, Ireland), ARCHITECT Anti-hepatitis B core antigen antibody (HBc) II Assay kit (Abbott GmbH & Co. KG, Wiesbaden, Germany), and NAT Procleix Ultrio Elite Assay kit (Grifols Diagnostic Solutions Inc., Los Angeles, CA, USA), respectively.

Results

A total of 739 (3.24%) donors were HBsAg(+), anti-HBc(+), or HBV DNA(+); 63 (0.28%) were HBsAg(+), anti-HBc(+), and HBV DNA(+). Twelve (0.05%) were anti-HBc(+) and HBV DNA(+) but HBsAg(−); they were considered to have occult infection. Further, 664 (2.91%) were HBsAg(−) but anti-HBc(+), indicating chronic or resolving infection. HBV prevalence increased significantly from 2011 to 2012, increased marginally until 2013, and showed a decreasing trend from 2013 onward (P>0.05).

Conclusions

The five-year prevalence of HBV infection among blood donors in the Eastern Province of Saudi Arabia (3.24%) is lower than that reported for other regions in the country. The occult HBV infection rate of 0.05% emphasizes the importance of NATs in identifying potentially infectious blood units.

]]>
<![CDATA[Planned Transfusion of D-Positive Blood Components in an Asia Type DEL Patient: Proposed Modification of the Korean National Guidelines for Blood Transfusion]]> https://www.researchpad.co/article/5c354b02d5eed0c484dc2d33

]]>
<![CDATA[Predicting peripartum blood transfusion in women undergoing cesarean delivery: A risk prediction model]]> https://www.researchpad.co/article/5c1d5b51d5eed0c4846eb57d

Objective

There has been an appreciable rise in postpartum hemorrhage requiring blood transfusions in the United States. Our objective is to better define patients at greatest risk for peripartum transfusion at the time of cesarean in order to identify cases for early intervention and monitoring.

Methods

Our study is a secondary analysis of a retrospective cohort study. Cases of intraoperative and immediate postpartum blood transfusion among women undergoing cesarean delivery were identified. Multivariable logistic regression models were used to identify antepartum and intrapartum risk factors that were independently associated with blood transfusion. A risk calculator was then developed to predict the need for transfusion.

Results

Of 56,967 women, 1488 (2.6%) required any blood transfusion. The strongest risk factors for peripartum blood transfusion included anemia (odds ratio [OR] 3.7, 95% CI 3.3–4.3), abruption on presentation (OR 3.3, CI 2.6–4.1), general anesthesia (OR 5.2, CI 4.4–6.1), and abnormal placentation (OR 92.0, CI 57.4–147.6). An antepartum model (model 1) and a combined antepartum plus intrapartum risk model (model 2) were developed (model 1 AUC = 0.77; model 2 AUC = 0.83) and internally validated.
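To illustrate how adjusted odds ratios like those above feed into a logistic risk calculator, here is a hypothetical sketch. The actual model coefficients and intercept are not given in the abstract; the coefficients below are simply ln(OR) of the reported adjusted odds ratios, and the intercept is invented for demonstration only:

```python
import math

# Hypothetical illustration: coefficients are ln(OR) of the reported adjusted
# odds ratios; the intercept is invented (the real model is not in the abstract).
COEFS = {
    "anemia": math.log(3.7),
    "abruption": math.log(3.3),
    "general_anesthesia": math.log(5.2),
    "abnormal_placentation": math.log(92.0),
}
INTERCEPT = -4.5  # invented baseline log-odds

def transfusion_risk(**risk_factors):
    """Predicted probability of peripartum transfusion from a logistic model."""
    logit = INTERCEPT + sum(COEFS[f] for f, present in risk_factors.items() if present)
    return 1 / (1 + math.exp(-logit))

low = transfusion_risk(anemia=False)
high = transfusion_risk(anemia=True, general_anesthesia=True)
print(f"no risk factors: {low:.3f}; anemia + general anesthesia: {high:.3f}")
```

The point of the sketch is the structure, not the numbers: each present risk factor adds its log-odds weight, and the logistic function converts the total into a probability, which is how a bedside risk calculator of this kind typically works.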

Conclusions

Among women who required cesarean delivery, we identified risk factors that predispose women to peripartum blood transfusion and developed a prediction model with good discrimination.

]]>
<![CDATA[The efficacy and safety of pharmacologic thromboprophylaxis following caesarean section: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c181390d5eed0c4847753ff

Objective

Our purpose is to evaluate the efficacy and safety of pharmacologic thromboprophylaxis following caesarean section (CS).

Methods

We searched PubMed, Embase, and the Cochrane Library. Then the systematic review was performed by analysing studies that met the eligibility criteria.

Results

Seven studies with 1,243 participants were included: 6 RCTs and 1 prospective cohort study. Results from the meta-analysis showed that low molecular weight heparin (LMWH) was associated with no obvious decrease in the risk of thrombosis compared with unfractionated heparin (UFH) and negative controls. However, LMWH was observed to be associated with a definite increase in the risk of bleeding or haematomas in comparison to negative controls (RR: 8.47, 95% CI: 1.52–47.11).

Conclusion

According to the current evidence, the efficacy of pharmacologic thromboprophylaxis, which increases the risk of bleeding or haematomas, remains controversial.

]]>
<![CDATA[Iron chelating properties of Eltrombopag: Investigating its role in thalassemia-induced osteoporosis]]> https://www.researchpad.co/article/5c0ed779d5eed0c484f141f9

Chronic blood transfusions cause iron overload, which leads to end-organ complications and osteoporosis. Iron chelation is needed to remove excess iron and to contain bone-mass loss. Deferasirox is the most recent oral iron chelator that prevents transfusion-related iron overload complications. Recently, the iron-chelating properties of Eltrombopag (ELT) have been emerging. ELT is a thrombopoietin receptor agonist used in the treatment of thrombocytopenia. We tested ELT and Deferasirox in iron-overloaded osteoclasts from thalassemic patients and donors, measuring intracellular iron, TRAP expression, and osteoclast activity. We confirmed the iron chelation capacity of ELT in bone tissue as well, and a synergistic effect when used with Deferasirox. Moreover, having demonstrated its effects on osteoclast activity, we suggest for the first time that ELT could improve bone health by reducing bone-mass loss.

]]>
<![CDATA[Improved clinical management but not patient outcome in women with postpartum haemorrhage—An observational study of practical obstetric team training]]> https://www.researchpad.co/article/5bb530e440307c24312bb0b8

Objective

Postpartum haemorrhage (PPH) is the most common obstetric emergency. A well-established postpartum haemorrhage protocol in the labour ward is crucial for effective treatment. The aim of the study was to investigate if practical obstetric team training improves the patient outcome and clinical management of PPH.

Setting

The practical obstetric team training (PROBE) at Linköping University Hospital, Sweden, with approximately 3,000 deliveries annually, was studied between 2004 and 2011. Each team consisted of one or two midwives, one obstetrician or one junior doctor, and one nurse assistant. Emergency obstetric cases were trained in a simulation setting. PROBE was scheduled during work hours at intervals of 1.5 years.

Population

Pre-PROBE women (N = 419) were defined as all women with vaginal birth between 2004 and 2007 with an estimated blood loss of ≥1000 ml within the first 24 hours of delivery. Post-PROBE women (N = 483) were defined as all women with vaginal birth between 2008 and 2011 with an estimated blood loss of ≥1000 ml within the first 24 hours of delivery. The two groups were compared with regard to blood loss parameters and management variables using retrospective data from medical records.

Results

No difference was observed in estimated blood loss, haemoglobin level, blood transfusions or the incidence of postpartum haemorrhage between the two groups. Post-PROBE women had more often secured venous access (p<0.001), monitoring of vital signs (p<0.001) and received fluid resuscitation (p<0.001) compared to pre-PROBE women. The use of uterine massage was also more common among the post-PROBE women compared with the pre-PROBE women (p<0.001).

Conclusion

PROBE improved clinical management but not patient outcome in women with postpartum haemorrhage in the labour ward. These new findings may have clinical implications, since they confirm that training was effective with regard to the management of postpartum haemorrhage. However, there is still no clear evidence that simulation training improves patient outcome in women with PPH.

]]>
<![CDATA[Twenty-seven year surveillance of blood transfusion recipients infected with HIV-1 in Hebei Province, China]]> https://www.researchpad.co/article/5b8acdfb40307c144d0de060

We conducted an investigation of blood management in which blood transfusion recipients underwent molecular biological analysis to trace the possible source of HIV infection. Epidemiological investigation was carried out among HIV-infected individuals. Blood transfusion recipients infected with HIV were tracked for the date of transfusion, reason for transfusion, hospital where the transfusion was received, source of blood, components transfused, number of transfusions, and transfusion volume. A total of 285 blood transfusion recipients infected with HIV-1 were detected in Hebei over the study period, with 42.81% (122/285) detected through clinical diagnostic testing. These cases showed a concentrated distribution in southern Hebei, with the characteristics of a local outbreak. A census of the population in Shahe County, which had a high concentration of cases, revealed that recipients of blood transfusions had an HIV infection rate of 15.54% (92/592). Post-transfusion infection frequently occurred among blood transfusion recipients at township medical institutions, with a peak in 1995. Owing to late detection of HIV infection among blood transfusion recipients, the rates of spousal transmission and mother-to-child transmission reached 20.87% and 28.05%, respectively. Around 1995, community medical institutions did not screen paid blood donors for HIV antibodies, which was an important cause of the outbreak of HIV-1 infection among blood transfusion recipients. Our findings indicate that cases of transfusion-related infection decreased rapidly with gradual improvement of the HIV screening system for blood donors beginning in 1995, particularly after full implementation of HIV nucleic acid testing of volunteer blood donors in 2015.

]]>
<![CDATA[Structural Requirements for the Procoagulant Activity of Nucleic Acids]]> https://www.researchpad.co/article/5989db21ab0ee8fa60bcf40a

Nucleic acids, especially extracellular RNA, are exposed following tissue or vessel damage and have previously been shown to activate the intrinsic blood coagulation pathway in vitro and in vivo. Yet no information on the structural requirements for the procoagulant activity of nucleic acids is available. A comparison of linear and hairpin-forming RNA and DNA oligomers revealed that all tested oligomers forming a stable hairpin structure were protected from degradation in human plasma. In contrast to linear nucleic acids, hairpin-forming compounds demonstrated the highest procoagulant activities, based on the analysis of clotting time in human plasma and on a prekallikrein activation assay. Moreover, the procoagulant activities of the DNA oligomers correlated well with their binding affinity to high molecular weight kininogen, whereas the binding affinity of all tested oligomers to prekallikrein was low. Furthermore, four DNA aptamers directed against thrombin, activated protein C, vascular endothelial growth factor and nucleolin, as well as the naturally occurring small nucleolar RNA U6snRNA, were identified as effective cofactors for prekallikrein auto-activation. Together, we conclude that hairpin-forming nucleic acids are most effective in promoting procoagulant activities, largely mediated by their specific binding to kininogen. Thus, in vivo application of therapeutic nucleic acids such as aptamers might have undesired prothrombotic or proinflammatory side effects.

]]>
<![CDATA[Still Searching for a Suitable Molecular Test to Detect Hidden Plasmodium Infection: A Proposal for Blood Donor Screening in Brazil]]> https://www.researchpad.co/article/5989dac5ab0ee8fa60bb24e0

Background

Efforts have been made to establish sensitive diagnostic tools for malaria screening in blood banks in order to detect asymptomatic malaria carriers. Microscopy, the malaria reference test in Brazil, is time consuming, and its sensitivity depends on the microscopist's experience. Although molecular tools are available, several aspects need to be considered for large-scale screening: accuracy and robustness in detecting low parasitemia, affordability for application to large numbers of samples, and flexibility to perform on individual or pooled samples.

Methodology

In this retrospective study, we evaluated four molecular assays for the detection of malaria parasites in a set of 56 samples previously evaluated by expert microscopy. In addition, we evaluated the effect of pooling samples on the sensitivity and specificity of the molecular assays. A well-characterized cultured sample with 1 parasite/μL was included in all tests evaluated. DNA was extracted with the QIAamp DNA Blood Mini Kit and eluted in 50 μL to concentrate the DNA. Pools were assembled with 10 samples each. Molecular protocols targeting 18S rRNA included one genus-specific qPCR (Lima-genus), one duplex genus/Pf qPCR (PET-genus, PET-Pf) and one duplex species-specific qPCR (Rougemont: Roug-Pf/Pv and Roug-Pm/Po). Additionally, a species-specific nested PCR protocol was used (Snou-Pf, Snou-Pv, Snou-Pm and Snou-Po).
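The pooling strategy described above can be sketched as a two-stage (Dorfman-style) screen: test each pool once, then individually retest the members of positive pools. The pool size of 10 comes from the text; the detection threshold (an assumed limit of detection of 0.35 parasites/μL), the `screen_pooled` helper and the sample values are illustrative assumptions, not the authors' protocol or data.

```python
# Hedged sketch (not the authors' code): two-stage pooled screening
# with pools of 10 samples. Pool size is from the text; detect()
# threshold and sample values are illustrative assumptions.

def screen_pooled(samples, pool_size, detect):
    """Return indices of positive samples via pool-then-retest."""
    positives = []
    for start in range(0, len(samples), pool_size):
        pool = samples[start:start + pool_size]
        # Stage 1: one test on the mixed pool; the analyte is diluted
        # by the pool size, which is why low parasitemia may be missed.
        if detect(sum(pool) / len(pool)):
            # Stage 2: individually retest each member of a positive pool.
            positives += [start + i for i, v in enumerate(pool) if detect(v)]
    return positives

detect = lambda conc: conc >= 0.35         # assumed LoD, parasites/uL
samples = [0.0] * 56
samples[7] = 5.0                           # detected: pool mean 0.5 >= LoD
samples[30] = 1.0                          # missed: pool mean 0.1 < LoD
print(screen_pooled(samples, 10, detect))  # [7]
```

The missed sample at index 30 illustrates the dilution problem the study design probes: pooling lowers the effective concentration, so a low-parasitemia carrier can fall below the assay's limit of detection.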

Results

The limit of detection was 3.5 p/μL for the PET-genus assay and 0.35 p/μL for the Lima-genus assay. Considering the microscopy-positive (n = 13) and microscopy-negative (n = 39) unpooled individual samples, the sensitivity of the two genus qPCR assays was 76.9% (Lima-genus) and 72.7% (PET-genus). Both the Lima-genus and PET-genus assays showed a sensitivity of 86.7% in the pooled samples. The genus protocols yielded concordant results (Kappa value of 1.000) in both individual and pooled samples.
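The sensitivity figures above reduce to simple proportions against the microscopy reference. A minimal sketch follows; the example counts are assumptions chosen only to illustrate the arithmetic, not figures taken from the study:

```python
# Minimal sketch: sensitivity/specificity from 2x2 counts against a
# reference test (expert microscopy here). The example counts are
# assumptions for illustration, not data from the study.

def sensitivity(tp, fn):
    """Fraction of reference-positive samples the assay detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of reference-negative samples the assay clears."""
    return tn / (tn + fp)

# e.g. an assay detecting 10 of 13 reference-positive samples:
print(round(100 * sensitivity(10, 3), 1))  # 76.9
```

Against a gold standard whose own sensitivity is imperfect, as the Background notes for microscopy, such figures bound rather than measure true assay performance.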

Conclusions

Efforts should be made to improve the performance of molecular tests so that they can detect low-density parasitemia if these tests are to be used for blood transfusion screening.

]]>