ResearchPad - clinical-laboratory-sciences https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Prevalence of anti-hepatitis E virus IgG antibodies in sera from hemodialysis patients in Tripoli, Lebanon]]> https://www.researchpad.co/article/elastic_article_15713 Hepatitis E virus (HEV) is an important global public health concern. Several studies have reported a higher HEV prevalence in patients undergoing regular hemodialysis (HD). In Lebanon, the epidemiology of HEV among HD patients had never been investigated previously. In this study, we examined the seroprevalence of HEV infection among 171 HD patients recruited from three hospital dialysis units in Tripoli, North Lebanon. The prevalence of anti-HEV IgG antibodies was evaluated in participants’ sera using a commercial enzyme-linked immunosorbent assay (ELISA). The association of socio-demographic and clinical parameters with HEV infection was also evaluated. Overall, 96 women and 75 men were enrolled in this study. Anti-HEV IgG antibodies were detected in 37/171 HD patients, a positivity rate of 21.63%. Among all examined variables, only patient age was significantly associated with seropositivity (P = 0.001). This first epidemiological study reveals a high seroprevalence of HEV infection among Lebanese HD patients. However, further studies enrolling larger samples and including control groups are required to identify the factors underlying this high seropositivity rate in this population.

]]>
<![CDATA[Fingerprint evidence for the division of labour and learning pottery-making at Early Bronze Age Tell eṣ-Ṣâfi/Gath, Israel]]> https://www.researchpad.co/article/N5152a5b5-1b3f-41e8-b706-9ccd50f6a496

The organization of craft production has long been a marker for broader social, economic and political changes that accompanied urbanism. The identity of producers who comprised production groups, communities, or workshops is out of reach using conventional archaeological data. There has been some success using epidermal prints on artefacts to identify the age and sex of producers. However, forensic research indicates that a combination of ridge breadth and ridge density would best identify the age and sex of individuals. To this end, we combine mean ridge breadth (MRB) and mean ridge density (MRD) to distinguish the age and sex of 112 fingerprints on Early Bronze Age (EB) III pottery from the early urban neighbourhood at Tell eṣ-Ṣâfi/Gath, Israel, dating to a time span of roughly 100 years. Our analysis accounts for the shrinkage of calcareous fabrics used to make six types of vessels, applies a modified version of the Kamp et al. regression equation to the MRB of each individual print, and infers sex by correlating MRD data to appropriate modern reference populations. When the results are combined, our analyses indicate that most fingerprints were made by adult and young males and the remainder by adult and young females. Children’s prints are in evidence but only occur on handles. Multiple prints of different age and sex on the same vessels suggest they were impressed during the training of young potters. Production appears dominated by adult and young males working alone, together, and in cooperation with adult and/or young females. Vessels with prints exclusively by females of any age are rare. This male-dominant cooperative labour pattern contrasts with recent studies showing that adult women primarily made Neolithic figurines in Anatolia, and that more females than males were making pottery prior to the rise of city-states in northern Mesopotamia.
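The shrinkage correction, regression, and ridge-density steps described above can be sketched as follows. This is purely illustrative: the shrinkage factor, regression coefficients (standing in for the modified Kamp et al. equation), and the ridge-density threshold are hypothetical placeholders, not the values used in the study.

```python
# Illustrative sketch of the fingerprint workflow described above.
# The shrinkage factor, regression coefficients and ridge-density
# threshold are hypothetical assumptions, not the study's values.

def corrected_mrb(measured_mrb_mm, shrinkage=0.07):
    """Undo firing shrinkage of the calcareous fabric (assumed 7%)."""
    return measured_mrb_mm / (1.0 - shrinkage)

def estimate_age_years(mrb_mm, slope=30.0, intercept=-2.0):
    """Linear age estimate from mean ridge breadth; the coefficients
    here are placeholders for the modified Kamp et al. equation."""
    return slope * mrb_mm + intercept

def infer_sex(mrd_ridges_per_5mm, threshold=13.0):
    """Higher ridge density tends to indicate female prints in modern
    reference populations; the threshold value is an assumption."""
    return "female" if mrd_ridges_per_5mm >= threshold else "male"

mrb = corrected_mrb(0.45)  # ridge breadth measured on the sherd (mm)
print(f"est. age={estimate_age_years(mrb):.1f} y, sex={infer_sex(14.2)}")
```

In practice each print would be corrected with a fabric-specific shrinkage rate and compared against a reference population matched to the region and period.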

]]>
<![CDATA[Urea-mediated dissociation alleviate the false-positive Treponema pallidum-specific antibodies detected by ELISA]]> https://www.researchpad.co/article/5c8823e2d5eed0c484639234

The serological detection of antibodies to Treponema pallidum is essential to the diagnosis of syphilis. However, because of cross-reactivity, specific antibody tests [e.g., the enzyme-linked immunosorbent assay (ELISA)] can yield false-positive results. In this study, we developed and validated a urea dissociation step in an attempt to reduce false-positive T. pallidum antibody results in ELISA. Six serum samples with false-positive T. pallidum antibody results by ELISA, and 16 control serum samples (8 sera positive for both specific IgG and IgM, and 8 IgG-positive, IgM-negative sera), were collected to select an appropriate urea concentration and dissociation time. Our goal was to establish an improved ELISA method based on the original ELISA detection system. The sensitivity of the improved ELISA was evaluated using 275 serum samples with IgM-class antibodies to T. pallidum. With dissociation in 6 mol/L urea for 10 minutes, all 6 false-positive samples converted to negative, in contrast to samples with true-positive antibodies to T. pallidum. The sensitivity of the improved ELISA was 100% for detecting IgM-class antibodies to T. pallidum in sera of patients with syphilis. Given the importance of this testing in the diagnosis of syphilis, antibodies to T. pallidum in serum samples should be retested with the improved ELISA method to avoid false-positive results.

]]>
<![CDATA[Single-center retrospective study of the effectiveness and toxicity of the oral iron chelating drugs deferiprone and deferasirox]]> https://www.researchpad.co/article/5c9900fdd5eed0c484b95e7f

Background

Iron overload, resulting from blood transfusions in patients with chronic anemias, has historically been controlled with regular deferoxamine, but its parenteral requirement encouraged studies of orally-active agents, including deferasirox and deferiprone. Deferasirox, licensed by the US Food and Drug Administration in 2005 based upon the results of randomized controlled trials, is now first-line therapy worldwide. In contrast, early investigator-initiated trials of deferiprone were prematurely terminated after investigators raised safety concerns. The FDA declined market approval of deferiprone; years later, it licensed the drug as “last resort” therapy, to be prescribed only if first-line drugs had failed. We undertook to evaluate the long-term effectiveness and toxicities of deferiprone and deferasirox in one transfusion clinic.

Methods and findings

Under an IRB-approved study, we retrospectively inspected the electronic medical records of consented iron-loaded patients managed between 2009 and 2015 at The University Health Network (UHN), Toronto. We compared changes in liver and heart iron, adverse effects and other outcomes, in patients treated with deferiprone or deferasirox.

Results

Although deferiprone was unlicensed in Canada, one-third (n = 41) of locally-transfused patients had been switched from first-line, licensed therapies (deferoxamine or deferasirox) to regimens of unlicensed deferiprone. The primary endpoint of monitoring in iron overload, hepatic iron concentration (HIC), increased (worsened) during deferiprone monotherapy (mean, from 10±2 to 18±2 mg/g; p < 0.0003), exceeding the threshold for life-threatening complications (15 mg iron/g liver) in 50% of patients. During deferasirox monotherapy, mean HIC decreased (improved) from 11±1 to 6±1 mg/g (p < 0.0001). Follow-up HICs were significantly different following deferiprone and deferasirox monotherapies (p < 0.0000002). Addition of low-dose deferoxamine (<40 mg/kg/day) to deferiprone did not reduce HIC to <15 mg/g (baseline, 20±4 mg/g; follow-up, 18±4 mg/g; p < 0.2) or reduce the proportion of patients with HIC exceeding 15 mg/g (p < 0.2). During deferiprone exposure, new diabetes mellitus, a recognized consequence of inadequate iron control, was diagnosed in 17% of patients, most of whom had sustained HICs exceeding 15 mg/g for years; one woman died after 13 months of a regimen of deferiprone and low-dose deferasirox. During deferiprone exposure, serum ALT increased over baseline in 65% of patients; mean serum ALT increased 6.6-fold (p < 0.001), often remaining elevated for years. During deferasirox exposure, mean ALT was unchanged (p < 0.84). No significant differences between treatment groups were observed in the proportions of patients estimated to have elevated cardiac iron.

Conclusions

Deferiprone was ineffective and showed significant toxicity in most patients. Combination with low doses of first-line therapies did not improve its effectiveness. Exposure to deferiprone over six years while the drug was unlicensed, in the face of ineffectiveness and serious toxicities, demands review of the standards of local medical practice. The limited scope of regulatory approval of deferiprone worldwide should restrict its exposure to the few patients genuinely unable to tolerate the two effective, first-line therapies.

]]>
<![CDATA[Expected total thyroxine (TT4) concentrations and outlier values in 531,765 cats in the United States (2014–2015)]]> https://www.researchpad.co/article/5c89778cd5eed0c4847d2f8c

Background

Levels exceeding the standard reference interval (RI) for total thyroxine (TT4) concentration are diagnostic for hyperthyroidism; however, some hyperthyroid cats have TT4 values within the RI. Determining outlier TT4 concentrations should aid practitioners in the identification of hyperthyroidism. The objective of this study was to determine the expected distribution of TT4 concentration using a large population of cats (531,765) of unknown health status, to identify unexpected (outlier) TT4 concentrations, and to determine whether this distribution changes with age.

Methodology/Principal findings

This is a population-based, retrospective study evaluating an electronic database of laboratory results to identify unique TT4 measurements between January 2014 and July 2015. An expected distribution of TT4 concentrations was determined using a large population of cats (531,765) of unknown health status, and this in turn was used to identify unexpected (outlier) TT4 concentrations and to determine whether this distribution changes with age. All cats between the ages of 1 and 9 years (n = 141,294) had the same expected distribution of TT4 concentration (0.5–3.5 μg/dL), and cats with a TT4 value >3.5 μg/dL were determined to be unexpected outliers. There was a steep and progressive rise in both the total number and percentage of statistical outliers in the feline population as a function of age. The greatest acceleration in the percentage of outliers occurred between the ages of 7 and 14 years, reaching up to 4.6 times the rate seen between the ages of 3 and 7 years.
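A distribution-based outlier rule of the kind described above can be sketched as follows. The central-95% percentile choice and the synthetic reference values are illustrative assumptions, not the study's data or method details.

```python
# Sketch of flagging TT4 values against an expected population
# distribution, as described above. The 2.5th/97.5th percentile
# bounds and the synthetic values are illustrative assumptions.
import statistics

def expected_interval(values, n=1000):
    """Return the central 95% bounds (2.5th and 97.5th percentiles)."""
    qs = statistics.quantiles(values, n=n, method="inclusive")
    return qs[int(0.025 * n) - 1], qs[int(0.975 * n) - 1]

def is_outlier(tt4, lo, hi):
    """A measurement outside the expected interval is an outlier."""
    return tt4 < lo or tt4 > hi

# Synthetic reference population spread uniformly over 0.5-3.5 ug/dL
population = [0.5 + 3.0 * i / 999 for i in range(1000)]
lo, hi = expected_interval(population)
flagged = [v for v in (2.0, 3.2, 4.2) if is_outlier(v, lo, hi)]
```

With a real reference population the interval would be estimated per age band, since the abstract reports that the proportion of outliers rises steeply with age.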

Conclusions

TT4 concentrations >3.5 μg/dL represent outliers from the expected distribution of TT4 concentration. Furthermore, age has a strong influence on the proportion of cats with outlier values. These findings suggest that patients with TT4 concentrations >3.5 μg/dL should be more closely evaluated for hyperthyroidism, particularly between the ages of 7 and 14 years, and may aid clinicians in earlier identification of hyperthyroidism in at-risk patients.

]]>
<![CDATA[Comparison of the new fully automated extraction platform eMAG to the MagNA PURE 96 and the well-established easyMAG for detection of common human respiratory viruses]]> https://www.researchpad.co/article/5c75ac8ad5eed0c484d089f7

Respiratory viral infections constitute the majority of samples tested in the clinical virology laboratory during the winter season, and are mainly diagnosed using molecular assays, namely real-time PCR (qPCR). Therefore, a high-quality extraction process is critical for successful, reliable and sensitive qPCR results. Here we aimed to evaluate the performance of the newly launched eMAG compared to the fully automated MagNA PURE 96 (Roche, Germany) and the semi-automated easyMAG (bioMerieux, France) extraction platforms. For this analysis, we assessed and compared the analytical and clinical performance of the three platforms, using 262 archived respiratory samples positive or negative for the common viruses regularly examined in our laboratory (influenza A, B and H1N1pdm, respiratory syncytial virus (RSV), human metapneumovirus (hMPV), parainfluenza-3 and adenovirus). In addition, quantitated virus controls were used to determine the limit of detection of each extraction method.

In all categories tested, eMAG results were comparable to those of the easyMAG and MagNA PURE 96, with high sensitivity for all viruses and over 98% clinical specificity and sensitivity for all viruses tested. Together with its high level of automation, the bioMerieux eMAG is a high-quality extraction platform enabling effective molecular analysis, and is well suited to medium-sized laboratories.

]]>
<![CDATA[Do professional facial image comparison training courses work?]]> https://www.researchpad.co/article/5c6dca04d5eed0c48452a6c3

Facial image comparison practitioners compare images of unfamiliar faces and decide whether or not they show the same person. Given the importance of these decisions for national security and criminal investigations, practitioners attend training courses to improve their face identification ability. However, these courses have not been empirically validated so it is unknown if they improve accuracy. Here, we review the content of eleven professional training courses offered to staff at national security, police, intelligence, passport issuance, immigration and border control agencies around the world. All reviewed courses include basic training in facial anatomy and prescribe facial feature (or ‘morphological’) comparison. Next, we evaluate the effectiveness of four representative courses by comparing face identification accuracy before and after training in novices (n = 152) and practitioners (n = 236). We find very strong evidence that short (1-hour and half-day) professional training courses do not improve identification accuracy, despite 93% of trainees believing their performance had improved. We find some evidence of improvement in a 3-day training course designed to introduce trainees to the unique feature-by-feature comparison strategy used by facial examiners in forensic settings. However, observed improvements are small, inconsistent across tests, and training did not produce the qualitative changes associated with examiners’ expertise. Future research should test the benefits of longer examination-focussed training courses and incorporate longitudinal approaches to track improvements caused by mentoring and deliberate practice. In the absence of evidence that training is effective, we advise agencies to explore alternative evidence-based strategies for improving the accuracy of face identification decisions.

]]>
<![CDATA[Platelet count abnormalities and peri-operative outcomes in adults undergoing elective, non-cardiac surgery]]> https://www.researchpad.co/article/5c6b2682d5eed0c484289bce

Background

Anemia and transfusion of blood in the peri-operative period have been shown to be associated with increased morbidity and mortality across a wide variety of non-cardiac surgeries. While tests of coagulation, including the platelet count, have frequently been used to identify patients with an increased risk of peri-operative bleeding, results have been equivocal. The aim of this study was to assess the effect of platelet level on outcomes in patients undergoing elective surgery.

Materials and methods

Retrospective cohort analysis of prospectively-collected clinical data from the American College of Surgeons National Surgical Quality Improvement Program (NSQIP), 2006–2016.

Results

We identified 3,884,400 adult patients who underwent elective, non-cardiac surgery from 2006–2016 at hospitals participating in NSQIP, a prospectively-collected, national clinical database with established reproducibility and validity. After controlling for all peri- and intraoperative factors by matching on propensity scores, patients with all levels of thrombocytopenia or thrombocytosis had higher odds for perioperative transfusion. All levels of thrombocytopenia were associated with higher mortality, but there was no association with complications or other morbidity after matching. On the other hand, thrombocytosis was not associated with mortality; but odds for postoperative complications and 30-day return to the operating room remained slightly increased after matching.

Conclusions

These findings may guide surgeons in the appropriate use and interpretation of pre-operative platelet count screening prior to elective, non-cardiac surgery.

]]>
<![CDATA[Utility of rabies neutralizing antibody detection in cerebrospinal fluid and serum for ante-mortem diagnosis of human rabies]]> https://www.researchpad.co/article/5c59fef7d5eed0c48413585e

Background

Early ante-mortem laboratory confirmation of human rabies is essential to aid patient management and institute public health measures. Few studies have highlighted the diagnostic value of antibody detection in CSF/serum in rabies, and its utility is usually undermined owing to the late seroconversion and short survival in infected patients. This study was undertaken to examine the ante-mortem diagnostic utility and prognostic value of antibody detection by rapid fluorescent focus inhibition test (RFFIT) in cerebrospinal fluid (CSF)/serum samples received from clinically suspected human rabies cases from January 2015 to December 2017.

Methodology/Principal findings

Samples collected ante-mortem (130 patients) and post-mortem (6 patients) from clinically suspected rabies cases were received in the laboratory during the study period. Ante-mortem laboratory confirmation was achieved in 55/130 (42.3%) cases. Real-time PCR for detection of viral nucleic acid, performed on saliva, nuchal skin, brain tissue and CSF samples, confirmed the diagnosis in 15 (27.2%) of the 55 laboratory-confirmed cases. Ante-mortem diagnosis was achieved by RFFIT (in CSF and/or serum) in 45 (34.6%) of the 130 clinically suspected cases, accounting for 81.8% of the 55 laboratory-confirmed cases. The sensitivity of CSF RFFIT increased with the day of sample collection (post-onset of symptoms) and was found to be 100% after 12 days of illness. Patients who had received prior vaccination had an increased probability of a positive RFFIT and negative PCR result. Patients who were positive by RFFIT alone at initial diagnosis had longer survival (albeit with neurological sequelae) than patients who were positive by PCR alone or by both RFFIT and PCR.

Conclusions/Significance

Detection of antibodies in the CSF/serum is a valuable ante-mortem diagnostic tool in human rabies, especially in patients who survive beyond a week. It was also found to have a limited role as a prognostic marker to predict outcomes in patients.

]]>
<![CDATA[Percutaneous nephrolithotomy versus open surgery for surgical treatment of patients with staghorn stones: A systematic review and meta-analysis]]> https://www.researchpad.co/article/5c5ca31ad5eed0c48441f16c

Objectives

To compare the efficacy and safety of percutaneous nephrolithotomy (PCNL) and open surgery (OS) for the surgical treatment of patients with staghorn stones, based on the published literature.

Materials and methods

A comprehensive literature search of PubMed, Embase, CNKI and the Cochrane Library was conducted to identify studies comparing outcomes of PCNL and OS for treating patients with staghorn stones, up to January 2018.

Results

There was no significant difference in final stone-free rate (SFR) between PCNL and OS (odds ratio [OR]: 1.17; 95% confidence interval [CI]: 0.64, 2.15; p = 0.61), while PCNL provided a significantly lower immediate SFR than OS (OR: 0.29; 95% CI: 0.16, 0.51; p < 0.0001). Compared with OS, PCNL provided a significantly lower overall complication rate (OR: 0.59; 95% CI: 0.41, 0.84; p = 0.004), shorter operative times (weighted mean difference [WMD]: -59.01 min; 95% CI: -81.09, -36.93; p < 0.00001), shorter hospitalization times (WMD: -5.77 days; 95% CI: -7.80, -3.74; p < 0.00001), less blood loss (WMD: -138.29 mL; 95% CI: -244.98, -31.6; p = 0.01) and fewer blood transfusions (OR: 0.44; 95% CI: 0.29, 0.68; p = 0.00002). No significant differences were found in minor complications (Clavien I-II) (OR: 0.72; 95% CI: 0.47, 1.09; p = 0.12) or major complications (Clavien III-V) (OR: 0.5; 95% CI: 0.23, 1.08; p = 0.08). In subgroup analysis, there were no significant differences in overall complications or operative times between mini-PCNL and OS. In sensitivity analysis, there was no significant difference in overall complications between PCNL and OS.
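The odds ratios and confidence intervals reported above are derived from 2×2 event counts pooled across studies. A minimal sketch of the single-table computation, using the standard log-OR standard error and made-up counts (not the meta-analysis data):

```python
# Odds ratio with a Wald 95% CI from a single 2x2 table, the building
# block of the pooled estimates above. Counts below are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = events in PCNL arm, b = non-events in PCNL arm,
    c = events in OS arm,   d = non-events in OS arm."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical complication counts: 20/100 after PCNL vs 30/100 after OS
or_, lo, hi = odds_ratio_ci(20, 80, 30, 70)
```

A meta-analysis would then combine the per-study log-ORs with inverse-variance (fixed- or random-effects) weights; this sketch shows only the per-study step.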

Conclusion

Our analysis suggests that standard PCNL is a safe and feasible alternative to OS and mini-PCNL for patients with staghorn stones. Because of the inherent limitations of the included studies, further large-sample, prospective, multi-centric randomized controlled trials should be undertaken to confirm our findings.

]]>
<![CDATA[Automatic classification of human facial features based on their appearance]]> https://www.researchpad.co/article/5c59ff05d5eed0c484135990

Classification or typology systems used to categorize different human body parts have existed for many years. Nevertheless, there are very few taxonomies of facial features. Ergonomics, forensic anthropology, crime prevention or new human-machine interaction systems and online activities, like e-commerce, e-learning, games, dating or social networks, are fields in which classifications of facial features are useful, for example, to create digital interlocutors that optimize the interactions between human and machines. However, classifying isolated facial features is difficult for human observers. Previous works reported low inter-observer and intra-observer agreement in the evaluation of facial features. This work presents a computer-based procedure to automatically classify facial features based on their global appearance. This procedure deals with the difficulties associated with classifying features using judgements from human observers, and facilitates the development of taxonomies of facial features. Taxonomies obtained through this procedure are presented for eyes, mouths and noses.

]]>
<![CDATA[A method for automatic forensic facial reconstruction based on dense statistics of soft tissue thickness]]> https://www.researchpad.co/article/5c521825d5eed0c484797560

In this paper, we present a method for automated estimation of a human face from skull remains. Our proposed method is based on three statistical models: a volumetric (tetrahedral) skull model encoding the variations of different skulls, a surface head model encoding head variations, and a dense statistic of facial soft tissue thickness (FSTT). All data are automatically derived from computed tomography (CT) head scans and optical face scans. In order to obtain a proper dense FSTT statistic, we register a skull model to each skull extracted from a CT scan and determine the FSTT value for each vertex of the skull model towards the associated extracted skin surface. The FSTT values at predefined landmarks from our statistic agree well with data from the literature. To recover a face from skull remains, we first fit our skull model to the given skull. Next, we generate spheres whose radii are the respective FSTT values obtained from our statistic at each vertex of the registered skull. Finally, we fit a head model to the union of all spheres. The proposed automated method enables a probabilistic face estimation that facilitates forensic recovery even from incomplete skull remains. The FSTT statistic allows the generation of plausible head variants, which can be adjusted intuitively using principal component analysis. We validate our face-recovery process using an anonymized head CT scan. The estimation generated from the given skull compares well visually with the skin surface extracted from the CT scan itself.
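The per-vertex step described above (a sphere of FSTT radius centred at each registered skull vertex) can be approximated by pushing each vertex outward along its surface normal by the soft-tissue thickness. The toy sketch below uses invented vertices, normals, and thickness values; the actual method fits a statistical head model to the union of the spheres rather than offsetting vertices directly.

```python
# Toy sketch of the FSTT step described above: each skull vertex is
# pushed outward along its (normalized) surface normal by the
# statistical soft-tissue thickness, approximating the outer envelope
# of the union of spheres. All input values here are invented.
def offset_vertices(vertices, normals, fstt):
    """vertices/normals: lists of (x, y, z); fstt: per-vertex thickness."""
    out = []
    for (vx, vy, vz), (nx, ny, nz), t in zip(vertices, normals, fstt):
        mag = (nx * nx + ny * ny + nz * nz) ** 0.5
        out.append((vx + t * nx / mag, vy + t * ny / mag, vz + t * nz / mag))
    return out

# One vertex at the origin, outward normal +z, 4.5 mm of soft tissue
skin = offset_vertices([(0, 0, 0)], [(0, 0, 2)], [4.5])
```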

]]>
<![CDATA[Decreased total iron binding capacity upon intensive care unit admission predicts red blood cell transfusion in critically ill patients]]> https://www.researchpad.co/article/5c521820d5eed0c484797475

Introduction

Red blood cell (RBC) transfusion is associated with poor clinical outcomes in critically ill patients. We investigated the predictive value of biomarkers measured on intensive care unit (ICU) admission for RBC transfusion within 28 days.

Methods

Critically ill patients (n = 175) admitted to our ICU with organ dysfunction and an expected stay of ≥ 48 hours, without hemorrhage, were prospectively studied (derivation cohort, n = 121; validation cohort, n = 54). Serum levels of 12 biomarkers (hemoglobin, creatinine, albumin, interleukin-6 [IL-6], erythropoietin, Fe, total iron binding capacity [TIBC], transferrin, ferritin, transferrin saturation, folate, and vitamin B12) were measured upon ICU admission and on days 7, 14, 21 and 28.

Results

Among the 12 biomarkers measured upon ICU admission, levels of hemoglobin, albumin, IL-6, TIBC, transferrin and ferritin were statistically different between the transfusion and non-transfusion groups. Of these 6 biomarkers, TIBC upon ICU admission had the highest area under the curve value (0.835; 95% confidence interval: 0.765–0.906) for predicting RBC transfusion (cut-off value = 234.5 μg/dL; sensitivity = 0.906, specificity = 0.632). This result was confirmed in the validation cohort, in which sensitivity and specificity were 0.888 and 0.694, respectively. Measurement of these biomarkers every seven days revealed that albumin, TIBC and transferrin remained statistically different between groups throughout hospitalization up to 28 days. In the validation cohort, patients in the transfusion group had significantly higher serum hepcidin levels than those in the non-transfusion group (P = 0.004). In addition, joint analysis across the derivation and validation cohorts revealed that serum IL-6 levels were higher in the transfusion group (P = 0.0014).
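The cutoff performance reported above comes from a standard contingency computation; note that in this study lower TIBC predicts transfusion, so values below the cutoff count as test-positive. A sketch with invented TIBC values (the cutoff is the study's 234.5 μg/dL; everything else is made up):

```python
# Sensitivity/specificity of "TIBC below cutoff" for predicting RBC
# transfusion, as reported above. The sample TIBC values are invented.
def cutoff_performance(transfused, not_transfused, cutoff=234.5):
    """Input lists are TIBC values (ug/dL) for each outcome group."""
    tp = sum(v < cutoff for v in transfused)       # true positives
    fn = len(transfused) - tp                      # false negatives
    tn = sum(v >= cutoff for v in not_transfused)  # true negatives
    fp = len(not_transfused) - tn                  # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical TIBC values (ug/dL) for transfused vs non-transfused
sens, spec = cutoff_performance([150, 200, 220, 280], [250, 300, 210, 320])
```

Sweeping the cutoff over all observed values and plotting sensitivity against (1 - specificity) yields the ROC curve whose area is the 0.835 reported above.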

Conclusion

Decreased TIBC upon ICU admission has high predictive value for RBC transfusion unrelated to hemorrhage within 28 days.

]]>
<![CDATA[Adherence to clinical guidelines is associated with reduced inpatient mortality among children with severe anemia in Ugandan hospitals]]> https://www.researchpad.co/article/5c644881d5eed0c484c2e808

Background

In resource limited settings, there is variability in the level of adherence to clinical guidelines in the inpatient management of children with common conditions like severe anemia. However, there is limited data on the effect of adherence to clinical guidelines on inpatient mortality in children managed for severe anemia.

Methods

We analyzed data from an uncontrolled before-and-after study of an in-service training intervention to improve quality of care in Lira and Jinja regional referral hospitals in Uganda. Inpatient records of children aged 0 to 5 years managed as cases of ‘severe anemia (SA)’ were reviewed to ascertain adherence to clinical guidelines and to compare inpatient deaths between SA children managed and not managed according to clinical guidelines. Logistic regression analysis was conducted to evaluate the relationship between clinical care factors and inpatient deaths among patients managed for SA.

Results

A total of 1,131 children were assigned a clinical diagnosis of ‘severe anemia’ in the two hospitals. There was improvement in the level of care after the in-service training intervention, with more children being managed according to clinical guidelines compared to the period before: 218/510 (42.7%) vs 158/621 (25.4%) (p < 0.001). Overall, children managed according to clinical guidelines had a reduced risk of inpatient mortality compared to those not managed according to clinical guidelines [OR 0.28 (95% CI 0.14–0.55), p = 0.001]. Clinical care factors associated with decreased risk of inpatient death included having pre-transfusion hemoglobin done to confirm the diagnosis [OR 0.5; 95% CI 0.29–0.87], a co-morbid diagnosis of severe malaria [OR 0.4; 95% CI 0.25–0.76], and being reviewed after admission by a clinician [OR 0.3; 95% CI 0.18–0.59], while a co-morbid diagnosis of severe acute malnutrition was associated with increased risk of inpatient death [OR 4.2; 95% CI 2.15–8.22].

Conclusion

Children with suspected SA who are managed according to clinical guidelines have lower in-hospital mortality than those not managed according to the guidelines. Efforts to reduce inpatient mortality in SA children in resource-limited settings should focus on training and supporting health workers to adhere to clinical guidelines.

]]>
<![CDATA[Trauma induced acute kidney injury]]> https://www.researchpad.co/article/5c57e693d5eed0c484ef37fb

Background

Injured patients are at risk of developing acute kidney injury (AKI), which is associated with increased morbidity and mortality. The aim of this study is to describe the incidence, timing, and severity of AKI in a large trauma population, identify risk factors for AKI, and report mortality outcomes.

Methods

A prospective observational study of injured adults who met local criteria for trauma team activation and were admitted to a UK Major Trauma Centre. AKI was defined by the Kidney Disease Improving Global Outcomes (KDIGO) criteria. Multivariable logistic regression and Cox proportional hazard modelling were used to analyse parameters associated with AKI and mortality.

Results

Of the 1410 patients enrolled in the study, 178 (12.6%) developed AKI. Age; injury severity score (ISS); admission systolic blood pressure, lactate and serum creatinine; units of packed red blood cells transfused in the first 24 hours; and administration of nephrotoxic therapy were identified as independent risk factors for the development of AKI. Patients who developed AKI had significantly higher mortality than those with normal renal function (47/178 [26.4%] versus 128/1232 [10.4%]; OR 3.09 [2.12 to 4.53]; p < 0.0001). After adjusting for other clinical prognostic factors, AKI remained an independent risk factor for mortality.

Conclusions

AKI is a frequent complication following trauma and is associated with prolonged hospital length of stay and increased mortality. Future research is needed to improve our ability to rapidly identify those at risk of AKI, and develop resuscitation strategies that preserve renal function in trauma patients.

]]>
<![CDATA[CD4-T cell enumeration in human immunodeficiency virus (HIV)-infected patients: A laboratory performance evaluation of Muse Auto CD4/CD4% system by World Health Organization prequalification of in vitro diagnostics]]> https://www.researchpad.co/article/5c5217f3d5eed0c484795bb9

Background

CD4 T-cell counts are still widely used to assess treatment eligibility and follow-up of HIV-infected patients. The World Health Organization (WHO) prequalification of in vitro diagnostics requested a manufacturer-independent laboratory evaluation, at the Institute of Tropical Medicine (ITM), Antwerp, Belgium, of the analytical performance of the Muse Auto CD4/CD4% system (Millipore), a new small capillary-flow cytometer dedicated to counting absolute CD4 T-cell numbers and percentages in venous blood samples from HIV-infected patients.

Methods

Two hundred and fifty (250) patients were recruited from the HIV outpatient clinic at ITM. Accuracy and precision of CD4 T-cell counting on fresh EDTA-anticoagulated venous blood samples were assessed in the laboratory on a Muse Auto CD4/CD4% system. Extensive precision analyses were performed both on fresh blood and on normal and low stabilized whole-blood controls. Accuracy (bias) was assessed by comparing results from the Muse CD4/CD4% to the reference (single-platform FACSCalibur). Clinical misclassification was measured at the 500, 350, 200 and 100 cells/μL thresholds.

Results

Intra-assay precision was < 5%, and inter-assay precision was < 9%. CD4 T-cell counts measured on the Muse Auto CD4/CD4% system and on the reference instrument yielded regression slopes of 0.97 for absolute counts and 1.03 for CD4 T-cell percentages, with a correlation coefficient of 0.99 for both. The average absolute bias compared to the reference was negligible (4 cells/μL, or 0.5%). The absolute average bias on CD4 T-cell percentages was < 1%. Clinical misclassification at the different CD4 T-cell thresholds was small, resulting in sensitivities and specificities of 90% or higher at all thresholds except 100 cells/μL (sensitivity = 87%). All samples could be analyzed, as no repetitive rejection errors were recorded.
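The bias and misclassification figures above are computed from paired counts on the evaluated instrument and the reference. A minimal sketch with invented paired CD4 counts (the 200 cells/μL threshold is one of the study's clinical thresholds; the counts are made up):

```python
# Sketch of the bias and threshold-misclassification computations
# described above. The paired CD4 counts below are hypothetical.
def mean_bias(test_counts, ref_counts):
    """Mean difference (cells/uL): evaluated instrument minus reference."""
    diffs = [t - r for t, r in zip(test_counts, ref_counts)]
    return sum(diffs) / len(diffs)

def misclassified(test_counts, ref_counts, threshold):
    """Pairs falling on opposite sides of a clinical CD4 threshold."""
    return sum((t < threshold) != (r < threshold)
               for t, r in zip(test_counts, ref_counts))

# Hypothetical paired CD4 counts (cells/uL): Muse vs FACSCalibur
muse = [480, 510, 190, 360, 95]
ref = [470, 520, 210, 350, 105]
bias = mean_bias(muse, ref)
errors = misclassified(muse, ref, 200)
```

Misclassified pairs at each threshold are what drive the sensitivity/specificity figures reported above; a pair counts as an error only when the two instruments disagree about which side of the threshold the patient falls on.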

Conclusions

The Muse Auto CD4/CD4% System performed very well on fresh venous blood samples and met all WHO acceptance criteria for analytical performance of CD4 technologies.

]]>
<![CDATA[Community perceptions of paediatric severe anaemia in Uganda]]> https://www.researchpad.co/article/5c37b796d5eed0c48449058b

Background

Severe anaemia remains a major cause of morbidity and mortality among children in sub-Saharan Africa. There is limited research on beliefs and knowledge about paediatric severe anaemia in the region, and the effect of these local beliefs and knowledge on healthcare-seeking for paediatric severe anaemia remains unknown.

Objective

To describe community perceptions of paediatric severe anaemia in Uganda.

Methods

Sixteen in-depth interviews with caregivers of children treated for severe anaemia and six focus group discussions with community members were conducted in three regions of Uganda between October and November 2017.

Results

There was no common local name for paediatric severe anaemia, but the disease was understood in context as ‘having no blood’. Severe anaemia was recognized as a serious disease, and the majority felt blood transfusion was the ideal treatment, although concomitant use of traditional and home remedies was also widespread. Participants articulated signs of paediatric severe anaemia such as palmar, conjunctival, and tongue pallor; other signs described included jaundice, splenomegaly, difficulty in breathing and poor appetite. Poor feeding, malaria, splenomegaly and evil spirits were perceived to be the common causes of severe anaemia; other perceived causes included human immunodeficiency virus (HIV), haemoglobinuria, fever, witchcraft, mosquito bites, and sickle cell disease. Splenomegaly and jaundice were perceived to be both signs and causes of severe anaemia. Severe anaemia was attributed to evil spirits if it was recurrent, led to sudden death, or manifested with cold extremities.

Conclusion

The community in Uganda perceived paediatric severe anaemia as a serious disease. Their understanding of its signs and causes aligned to a large extent with known clinical signs and biological causes. Belief in evil spirits persists, however, and may be one obstacle to seeking timely medical care for paediatric severe anaemia.

]]>
<![CDATA[Red blood cell phenotype fidelity following glycerol cryopreservation optimized for research purposes]]> https://www.researchpad.co/article/5c26977bd5eed0c48470fa8f

Intact red blood cells (RBCs) are required for phenotypic analyses. To allow separation in time and location between subject encounter and sample analysis, we developed a research-specific RBC cryopreservation protocol and assessed its impact on data fidelity for key biochemical and physiological assays. RBCs drawn from healthy volunteers were aliquoted for immediate analysis or for analysis following glycerol-based cryopreservation, thawing, and deglycerolization. RBC phenotype was assessed by (1) scanning electron microscopy (SEM) imaging and standard morphometric RBC indices, (2) osmotic fragility, (3) deformability, (4) endothelial adhesion, (5) oxygen (O2) affinity, (6) ability to regulate hypoxic vasodilation, (7) nitric oxide (NO) content, (8) metabolomic phenotyping (at steady state and tracing with [1,2,3-13C3]glucose ± oxidative challenge with superoxide thermal source SOTS-1), as well as in vivo quantification (following human-to-mouse RBC xenotransfusion) of (9) blood oxygenation content mapping and flow dynamics (velocity and adhesion). Our revised glycerolization protocol (40% v/v final) resulted in >98.5% RBC recovery following freezing (-80°C) and thawing (37°C), with no difference compared to the standard reported method (40% w/v final). Full deglycerolization (>99.9% glycerol removal) of 40% v/v final samples resulted in total cumulative lysis of ~8%, compared with ~12–15% for the standard method. The post-cryopreservation/deglycerolization RBC phenotype was indistinguishable from that of fresh RBCs with regard to physical RBC parameters (morphology, volume, and density), osmotic fragility, deformability, endothelial adhesivity, O2 affinity, vasoregulation, metabolomics, and flow dynamics. These results indicate that RBC cryopreservation/deglycerolization in 40% v/v glycerol final does not significantly impact RBC phenotype compared to fresh cells.

]]>
<![CDATA[Doubling down on forensic twin studies]]> https://www.researchpad.co/article/5c25454bd5eed0c48442c452 ]]>
<![CDATA[Distinguishing genetically between the germlines of male monozygotic twins]]> https://www.researchpad.co/article/5c25454dd5eed0c48442c476

Identification of the potential donor(s) of human germline-derived cells is an issue in many criminal investigations and in paternity testing. The experimental and statistical methodology needed to work up such cases is well established, but cases may be more challenging if monozygotic (MZ) twins are involved. Elaborate genome-wide searches are then required to detect early somatic mutations that distinguish the cell sample and its donor from the other twin, usually relying on reference material other than semen (e.g. saliva). The first such cases, involving either criminal sexual offenses or paternity disputes, have been processed successfully by Eurofins Genomics and Forensics Campus. However, when presenting the experimental results in court, common forensic genetic practice requires that the residual uncertainty about donorship be quantified as a likelihood ratio (LR). We therefore developed a general mathematical framework for LR calculation, presented herein, which quantifies the evidence in favour of the true donor in the respective cases based upon observed DNA sequencing read counts.
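The paper's actual framework is more general, but the intuition behind a read-count-based LR can be sketched with a simple binomial model at a single twin-differentiating site. Both parameters below (the expected variant allele fraction under donorship and the sequencing error rate) are illustrative assumptions, not values from the study:

```python
from math import comb, log

def binom_pmf(k, n, p):
    """Binomial probability of k successes in n trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def log_lr_single_site(k_mut, n_reads, vaf_if_donor=0.5, error_rate=0.001):
    """Log likelihood ratio at one site carrying an early somatic
    mutation present in twin A but absent in twin B.
    H1: the trace donor is twin A (mutant reads drawn at the
        expected variant allele fraction, e.g. 0.5 if heterozygous);
    H2: the trace donor is twin B (mutant reads arise only from
        sequencing error)."""
    p_h1 = binom_pmf(k_mut, n_reads, vaf_if_donor)
    p_h2 = binom_pmf(k_mut, n_reads, error_rate)
    return log(p_h1) - log(p_h2)
```

Summing such log-LRs over many independent discriminating sites yields an overall weight of evidence for one twin over the other; observing many mutant reads pushes the LR strongly toward twin A, observing none pushes it toward twin B.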

]]>