ResearchPad - method-article Default RSS Feed en-us © 2020 Newgen KnowledgeWorks

<![CDATA[Prediction of cell position using single-cell transcriptomic data: an iterative procedure]]>

Single-cell sequencing reveals cellular heterogeneity but not cell localization. However, by combining single-cell transcriptomic data with a reference atlas of a small set of genes, it is possible to predict the position of individual cells and reconstruct the spatial expression profiles of the thousands of genes reported in the single-cell study. To encourage the development of new algorithms, the Dialogue for Reverse Engineering Assessments and Methods (DREAM) consortium organized a crowd-sourced competition known as the DREAM Single Cell Transcriptomics Challenge (SCTC). Within this context, we describe here our proposed procedure for selecting adequate reference genes, and an iterative procedure for predicting the spatial expression profiles of other genes.
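
The mapping idea described above can be illustrated with a minimal, generic sketch: score each cell against each atlas position by correlation over the reference genes, assign the cell to its best-scoring position, then average over assigned cells to reconstruct spatial profiles. All array names, shapes and the random data below are hypothetical, and this is a single assignment pass, not the authors' full iterative procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: shapes and values are invented for illustration only.
n_positions, n_ref_genes, n_cells, n_other_genes = 30, 8, 100, 50
atlas = rng.random((n_positions, n_ref_genes))      # reference atlas, small gene set
cells_ref = rng.random((n_cells, n_ref_genes))      # single-cell expression, reference genes
cells_other = rng.random((n_cells, n_other_genes))  # genes whose spatial profile we want

# Score every cell against every atlas position by Pearson correlation
# over the reference genes, and assign each cell to its best position.
a = (atlas - atlas.mean(1, keepdims=True)) / atlas.std(1, keepdims=True)
c = (cells_ref - cells_ref.mean(1, keepdims=True)) / cells_ref.std(1, keepdims=True)
corr = c @ a.T / n_ref_genes                        # (n_cells, n_positions)
best_pos = corr.argmax(axis=1)

# Reconstruct spatial profiles: average the expression of the cells assigned
# to each position (positions with no assigned cells stay NaN).
spatial = np.full((n_positions, n_other_genes), np.nan)
for p in range(n_positions):
    mask = best_pos == p
    if mask.any():
        spatial[p] = cells_other[mask].mean(axis=0)
```

An iterative refinement, as in the article, would alternate between re-estimating the spatial profiles and re-assigning cells using the updated profiles.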

<![CDATA[First derivative ATR-FTIR spectroscopic method as a green tool for the quantitative determination of diclofenac sodium tablets]]>

Background: Attenuated total reflection-Fourier transform infrared (ATR-FTIR) spectroscopy is a rapid quantitative method that has been applied to pharmaceutical analysis. This work describes the utility of first derivative ATR-FTIR spectroscopy for the quantitative determination of diclofenac sodium tablets.

Methods: This quantitative technique is based on measuring the first-derivative area of the infrared band corresponding to the C=O stretching region of 1550–1605 cm⁻¹. To validate the method, the specificity, linearity, detection limits, precision and accuracy of the calibration curve, infrared analysis and data manipulation were determined. The statistical results were compared with those of other methods for the quantification of diclofenac sodium.

Results: The excipients in the commercial tablet preparation did not interfere with the assay. Excellent linearity was found for drug concentrations in the range 0.2–1.5 w/w% (r² = 0.9994). Precision was assessed by repeated analysis of diclofenac sodium tablets; the small standard deviation and relative standard deviation values obtained indicate that the method is precise. The high percentage recoveries of diclofenac sodium tablets (99.81, 101.54 and 99.41%) comply with the pharmacopeial percent recovery. The small limit of detection and limit of quantification values (0.0528 and 0.1599 w/w%, respectively) indicate the high sensitivity of the method.
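
As an illustration of the underlying calculation, the following sketch builds a synthetic spectrum, takes its first derivative, integrates the absolute derivative over the 1550–1605 cm⁻¹ window and fits a linear calibration. The band shape, baseline and concentration values are invented for illustration and are not the paper's data:

```python
import numpy as np

# Synthetic "C=O" band whose height scales with drug concentration,
# plus an additive sloping baseline (all parameters hypothetical).
wavenumber = np.arange(1500.0, 1660.0, 1.0)            # cm^-1, 1 cm^-1 spacing
def spectrum(conc):
    band = conc * np.exp(-((wavenumber - 1577.0) / 12.0) ** 2)
    baseline = 0.02 + 1e-4 * wavenumber
    return band + baseline

concs = np.array([0.2, 0.5, 0.8, 1.1, 1.5])            # w/w %
areas = []
for conc in concs:
    deriv = np.gradient(spectrum(conc), wavenumber)    # first-derivative spectrum
    sel = (wavenumber >= 1550) & (wavenumber <= 1605)
    # Unit spacing (1 cm^-1), so the sum approximates the band area integral.
    areas.append(np.abs(deriv[sel]).sum())
areas = np.array(areas)

# Linear calibration: derivative measurement removes the constant baseline
# offset, so area vs concentration should be essentially linear.
slope, intercept = np.polyfit(concs, areas, 1)
r2 = np.corrcoef(concs, areas)[0, 1] ** 2
```

This is why derivative spectroscopy is attractive here: a constant baseline offset vanishes on differentiation, leaving the band area approximately proportional to analyte concentration.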

Conclusions: First derivative ATR-FTIR spectroscopy showed high accuracy and precision; it is nondestructive, green, low cost and rapid, and can be applied easily to the pharmaceutical quantitative determination of diclofenac sodium tablet formulations.

<![CDATA[Organizing and running bioinformatics hackathons within Africa: The H3ABioNet cloud computing experience]]>

The need for portable and reproducible genomics analysis pipelines is growing globally as well as in Africa, especially with the growth of collaborative projects like the Human Health and Heredity in Africa Consortium (H3Africa). The Pan-African H3Africa Bioinformatics Network (H3ABioNet) recognized the need for portable, reproducible pipelines adapted to heterogeneous computing environments, and for the nurturing of technical expertise in workflow languages and containerization technologies. Building on the network’s Standard Operating Procedures (SOPs) for common genomic analyses, H3ABioNet arranged its first Cloud Computing and Reproducible Workflows Hackathon in 2016, with the purpose of translating those SOPs into analysis pipelines able to run on heterogeneous computing environments and to meet the needs of H3Africa research projects. This paper describes the preparations for this hackathon and reflects upon the lessons learned about its impact on building the technical and scientific expertise of African researchers. The workflows developed were made publicly available in GitHub repositories and deposited as container images in a public container registry.

<![CDATA[Data linkage in medical science using the resource description framework: the AVERT model]]>

There is an ongoing challenge as to how best to manage and understand ‘big data’ in precision medicine settings. This paper describes the potential of a Linked Data approach, using a Resource Description Framework (RDF) model, to combine multiple datasets with temporal and spatial elements of varying dimensionality. This “AVERT model” provides a framework for converting multiple standalone files of various formats, from both clinical and environmental settings, into a single data source. This data source can thereafter be queried effectively, shared with outside parties and more easily understood by multiple stakeholders through standardized vocabularies, while incorporating provenance metadata and supporting temporo-spatial reasoning. The approach has further advantages in terms of data sharing, security and subsequent analysis. We use a case study relating to anti-Glomerular Basement Membrane (GBM) disease, a rare autoimmune condition, to illustrate a technical proof of concept for the AVERT model.

<![CDATA[A deeper understanding of intestinal organoid metabolism revealed by combining fluorescence lifetime imaging microscopy (FLIM) and extracellular flux analyses]]>

Stem cells and the niche in which they reside feature a complex microenvironment with tightly regulated homeostasis, cell-cell interactions and dynamic regulation of metabolism. A significant number of organoid models have been described over the last decade, yet few methodologies enable analysis of the metabolic demands of the stem cell niche at single-cell resolution, in real time and without perturbing organoid integrity. Here, we studied the redox metabolism of Lgr5-GFP intestinal organoids by two emerging microscopy approaches based on luminescence lifetime measurement: fluorescence-based FLIM for NAD(P)H, and phosphorescence-based PLIM for real-time oxygenation. We found that exposure of stem (Lgr5-GFP) and differentiated (no GFP) cells to high and low glucose concentrations resulted in measurable shifts in oxygenation and redox status. NAD(P)H-FLIM and O2-PLIM both indicated that at high ‘basal’ glucose conditions, Lgr5-GFP cells had lower oxidative phosphorylation activity than cells lacking Lgr5. However, when exposed to low (0.5 mM) glucose, stem cells utilized oxidative metabolism more dynamically than non-stem cells. The high heterogeneity of the complex 3D architecture and energy production pathways of Lgr5-GFP organoids was also confirmed by extracellular flux (XF) analysis. Our data reveal that combined analysis of NAD(P)H-FLIM and organoid oxygenation by PLIM represents a promising approach for studying stem cell niche metabolism in a live readout.

<![CDATA[False positives in trans-eQTL and co-expression analyses arising from RNA-sequencing alignment errors]]>

Sequence similarity among distinct genomic regions can lead to errors in alignment of short reads from next-generation sequencing. While this is well known, the downstream consequences of misalignment have not been fully characterized. We assessed the potential for incorrect alignment of RNA-sequencing reads to cause false positives in both gene expression quantitative trait locus (eQTL) and co-expression analyses. Trans-eQTLs identified from human RNA-sequencing studies appeared to be particularly affected by this phenomenon, even when only uniquely aligned reads are considered. Over 75% of trans-eQTLs identified using a standard pipeline occurred between regions of sequence similarity and could therefore be due to alignment errors. Further, associations due to mapping errors are likely to misleadingly replicate between studies. To help address this problem, we quantified the potential for "cross-mapping" to occur between every pair of annotated genes in the human genome. Such cross-mapping data can be used to filter or flag potential false positives in both trans-eQTL and co-expression analyses. Such filtering substantially alters the detection of significant associations and can have an impact on the assessment of false discovery rate, functional enrichment, and replication for RNA-sequencing association studies.
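
The filtering described above can be sketched in a few lines: given a precomputed cross-mappability table, drop or flag trans-eQTL hits whose two genes cross-map. The gene names, field names and table contents below are hypothetical placeholders, not data from the study:

```python
# Hypothetical cross-mappability table: unordered gene pairs that share
# enough sequence similarity for reads to cross-map between them.
cross_map = {
    frozenset({"GENE_A", "GENE_B"}),
    frozenset({"GENE_C", "GENE_D"}),
}

# Hypothetical trans-eQTL results (SNP, gene local to the SNP, distant gene).
trans_eqtls = [
    {"snp": "rs1", "cis_gene": "GENE_A", "trans_gene": "GENE_B", "p": 1e-12},
    {"snp": "rs2", "cis_gene": "GENE_E", "trans_gene": "GENE_F", "p": 1e-9},
]

def cross_maps(g1, g2):
    """True if the two genes appear as a pair in the cross-mappability table."""
    return frozenset({g1, g2}) in cross_map

# Keep hits with no cross-mapping; flag the rest as potential false positives.
kept = [hit for hit in trans_eqtls
        if not cross_maps(hit["cis_gene"], hit["trans_gene"])]
flagged = [hit for hit in trans_eqtls
           if cross_maps(hit["cis_gene"], hit["trans_gene"])]
```

In practice one would also apply such a filter before, rather than after, multiple-testing correction, since removing cross-mapping pairs changes the effective number of tests.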

<![CDATA[A convenient protocol for establishing a human cell culture model of the outer retina.]]>

The retinal pigment epithelium (RPE) plays a key role in the pathogenesis of several blinding retinopathies. Alterations to RPE structure and function are reported in Age-related Macular Degeneration, Stargardt and Best disease as well as pattern dystrophies. However, the precise role of RPE cells in disease aetiology remains incompletely understood. Many studies into RPE pathobiology have utilised animal models, which only recapitulate limited disease features. Some studies are also difficult to carry out in animals as the ocular space remains largely inaccessible to powerful microscopes. In contrast, in-vitro models provide an attractive alternative for investigating pathogenic RPE changes associated with age and disease. In this article we describe the step-by-step approach required to establish an experimentally versatile in-vitro culture model of the outer retina incorporating the RPE monolayer and supportive Bruch’s membrane (BrM). We show that confluent monolayers of the spontaneously arising human ARPE-19 cell line cultured under optimal conditions reproduce key features of native RPE. These models can be used to study dynamic, intracellular and extracellular pathogenic changes using the latest developments in microscopy and imaging technology. We also discuss how RPE cells from human foetal and stem-cell derived sources can be incorporated alongside sophisticated BrM substitutes to replicate the aged/diseased outer retina in a dish. The work presented here will enable users to rapidly establish a realistic in-vitro model of the outer retina that is amenable to a high degree of experimental manipulation and serves as an attractive alternative to using animals, thereby helping to achieve the 3Rs objective of reducing and replacing the use of animals in research. As well as recapitulating salient structural and physiological features of native RPE, other advantages of this model include its simplicity, rapid set-up time and unlimited scope for detailed single-cell resolution and matrix studies.

<![CDATA[Establishment of a method for Lutzomyia longipalpis sand fly egg microinjection: The first step towards potential novel control strategies for leishmaniasis]]>

The leishmaniases are a group of vector-borne parasitic diseases transmitted by sand flies, affecting 1.3 million people across 98 countries, with limited control strategies due to the lack of an available vaccine and the emergence of insecticide resistance. Novel control strategies being explored for mosquito-borne diseases, such as Wolbachia-mediated inhibition of pathogens and genetically modified insects (e.g. using CRISPR-Cas9 editing), rely on the ability to consistently inject eggs of the target species. Here we present a novel method to obtain and inject preblastoderm eggs of the sand fly Lutzomyia (Lu.) longipalpis, the principal vector of zoonotic visceral leishmaniasis in South America. The procedures required to obtain sufficiently young Lu. longipalpis colony eggs are described alongside a microinjection technique that permits rapid injection and minimal handling of small sand fly eggs post-injection. Using a strain of Wolbachia as a ‘marker’ for successful injection, our protocol produced early-generation Wolbachia-transinfected Lu. longipalpis lines, demonstrating its potential as the first step towards novel applied strategies for sand fly control.

<![CDATA[The Quality Sequencing Minimum (QSM): providing comprehensive, consistent, transparent next generation sequencing  data quality assurance]]>

Next generation sequencing (NGS) is routinely used in clinical genetic testing. Quality management of NGS testing is essential to ensure performance is consistently and rigorously evaluated.

Three primary metrics are used in NGS quality evaluation: depth of coverage, base quality and mapping quality. To provide consistency and transparency in the utilisation of these metrics we present the Quality Sequencing Minimum (QSM).

The QSM defines the minimum quality requirement a laboratory has selected for depth of coverage (C), base quality (B) and mapping quality (M), and can be applied per base, exon, gene or other genomic region, as appropriate. The QSM format is CX_BY(PY)_MZ(PZ), where X is the parameter threshold for C, Y the parameter threshold for B, PY the percentage of reads that must reach Y, Z the parameter threshold for M, and PZ the percentage of reads that must reach Z. The data underlying the QSM are in the BAM file, so a QSM can be easily and automatically calculated in any NGS pipeline.
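
To illustrate how a QSM string can be parsed and applied, here is a minimal sketch. The helper names and the per-base/per-read inputs are hypothetical; a real pipeline would derive coverage, base quality and mapping quality from the BAM file (e.g. via pysam), and pooling read qualities across a whole region, as done here, is a simplification:

```python
import re

def parse_qsm(qsm):
    """Parse a CX_BY(PY)_MZ(PZ) string into its five integer parameters."""
    m = re.fullmatch(r"C(\d+)_B(\d+)\((\d+)\)_M(\d+)\((\d+)\)", qsm)
    x, y, p_y, z, p_z = map(int, m.groups())
    return x, y, p_y, z, p_z

def region_passes(qsm, coverages, base_quals, map_quals):
    """coverages: depth at each base of the region;
    base_quals/map_quals: per-read quality values for reads over the region."""
    x, y, p_y, z, p_z = parse_qsm(qsm)
    cov_ok = min(coverages) >= x                                  # every base at depth >= X
    bq_ok = 100 * sum(q >= y for q in base_quals) / len(base_quals) >= p_y
    mq_ok = 100 * sum(q >= z for q in map_quals) / len(map_quals) >= p_z
    return cov_ok and bq_ok and mq_ok

# The QSM used in the paper for TruSight Cancer Panel optimisation:
qsm = "C50_B10(85)_M20(95)"
```

Regions returning False would be flagged for review, mirroring the automatic flagging described below.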

We used the QSM to optimise cancer predisposition gene testing using the TruSight Cancer Panel (TSCP). We set the QSM as C50_B10(85)_M20(95). Test regions falling below the QSM were automatically flagged for review, with 100/1471 test regions QSM-flagged in multiple individuals. Supplementing these regions with 132 additional probes improved performance in 85/100. We also used the QSM to optimise testing of genes with pseudogenes such as PTEN and PMS2. In TSCP data from 960 individuals the median number of regions that passed QSM per sample was 1429 (97%).  Importantly, the QSM can be used at an individual report level to provide succinct, comprehensive quality assurance information about individual test performance.

We believe many laboratories would find the QSM useful. Furthermore, widespread adoption of the QSM would facilitate consistent, transparent reporting of genetic test performance by different laboratories.

<![CDATA[An open and transparent process to select ELIXIR Node Services as implemented by ELIXIR-UK]]>

ELIXIR is the European infrastructure established specifically for the sharing and sustainability of life science data. To provide up-to-date resources and services, ELIXIR needs to undergo a continuous process of refreshing the services provided by its national Nodes. Here we present the approach taken by ELIXIR-UK to address the advice of the ELIXIR Scientific Advisory Board that Nodes need to develop “mechanisms to ensure that each Node continues to be representative of the Bioinformatics efforts within the country”. ELIXIR-UK put in place an open and transparent process to identify potential ELIXIR resources within the UK during late 2015 and early to mid-2016. Areas of strategic strength were identified and Expressions of Interest in these priority areas were requested from the UK community. A set of criteria was established, in discussion with the ELIXIR Hub, and prospective ELIXIR-UK resources were assessed by an independent committee set up by the Node for this purpose. Of 19 resources considered, 14 were judged to be immediately ready to be included in the UK ELIXIR Node’s portfolio. A further five were placed on the Node’s roadmap for future consideration for inclusion. ELIXIR-UK expects to repeat this process regularly to ensure its portfolio continues to reflect its community’s strengths.

<![CDATA[Using theories of change to design monitoring and evaluation of community engagement in research: experiences from a research institute in Malawi]]>

Background: Evaluation of community and public engagement in research is important to deepen understanding of how engagement works and to enhance its effectiveness. Theories of change have been recommended for evaluating community engagement, for their ability to make explicit intended outcomes and understandings of how engagement activities contribute to these outcomes. However, there are few documented examples of using theories of change for evaluation of engagement. This article reports experience of using theories of change to develop a framework for evaluating community engagement in research at a clinical research organisation in Malawi. We describe the steps used to develop theories of change, and the way theories of change were used to design data collection plans. Based on our experience, we reflect on the advantages and challenges of the theory of change approach.

Methods: The theories of change and evaluation framework were developed through a series of workshops and meetings between engagement practitioners, monitoring and evaluation staff, and researchers. We first identified goals for engagement, then used ‘so that’ chains to clarify pathways and intermediate outcomes between engagement activities and goals. Further meetings were held to refine initial theories of change, identify priority information needs, and define feasible evaluation methods.

Results: The theory of change approach had several benefits. In particular, it helped to construct an evaluation framework focused on relevant outcomes rather than activities alone. The process of reflecting on intended goals and pathways also helped staff to review the design of engagement activities. Challenges included practical constraints on practitioners’ time to consider evaluation plans (a challenge for evaluation generally, regardless of method), and more fundamental difficulties in identifying feasible and agreed outcomes.

Conclusions: These experiences from Malawi provide lessons for other research organisations considering use of theories of change to support evaluation of community engagement.

<![CDATA[Liquid chromatography–tandem mass spectrometry for the simultaneous quantitation of ceftriaxone, metronidazole and hydroxymetronidazole in plasma from seriously ill, severely malnourished children]]>

We have developed and validated a novel, sensitive, selective and reproducible reversed-phase high-performance liquid chromatography method coupled with electrospray ionization mass spectrometry (HPLC–ESI-MS/MS) for the simultaneous quantitation of ceftriaxone (CEF), metronidazole (MET) and hydroxymetronidazole (MET-OH) from only 50 µL of human plasma, and unbound CEF from 25 µL plasma ultra-filtrate to evaluate the effect of protein binding. Cefuroxime axetil (CEFU) was used as an internal standard (IS). The analytes were extracted by a protein precipitation procedure with acetonitrile and separated on a reversed-phase Polaris 5 C18-Analytical column using a mobile phase composed of acetonitrile containing 0.1% (v/v) formic acid and 10 mM aqueous ammonium formate pH 2.5, delivered at a flow-rate of 300 µL/min. Multiple reaction monitoring was performed in the positive ion mode using the transitions m/z 555.1 → m/z 396.0 (CEF), m/z 172.2 → m/z 128.2 (MET), m/z 188.0 → m/z 125.9 (MET-OH) and m/z 528.1 → m/z 364.0 (CEFU) to quantify the drugs. Calibration curves in spiked plasma and ultra-filtrate were linear (r² ≥ 0.9948) from 0.4–300 µg/mL for CEF, 0.05–50 µg/mL for MET and 0.02–30 µg/mL for MET-OH. The intra- and inter-assay precisions were less than 9% and the mean extraction recoveries were 94.0% (CEF), 98.2% (MET), 99.6% (MET-OH) and 104.6% (CEF in ultra-filtrate); the recoveries for the IS were 93.8% (in plasma) and 97.6% (in ultra-filtrate). The validated method was successfully applied to a pharmacokinetic study of CEF, MET and MET-OH in hospitalized children with complicated severe acute malnutrition following an oral administration of MET and intravenous administration of CEF over the course of 72 hours.

<![CDATA[Studying Neonates’ Language and Memory Capacities with Functional Near-Infrared Spectroscopy]]>

The measurement of newborns’ brain hemodynamic activity has improved our understanding of early cognitive processes, in particular of language acquisition. In this paper, we describe two experimental protocols adapted to study neonates’ speech-processing capacities using functional near-infrared spectroscopy (fNIRS): the block design and the familiarization-recognition design. We review some of their benefits and disadvantages, and refer to research issues that can be explored by means of these protocols. We also illustrate the use of the two experimental designs through representative fNIRS studies that reveal specific patterns of activation of the newborn brain during speech perception, learning of repetition structures, and word recognition.

<![CDATA[How to put plant root uptake into a soil water flow model]]>

The need for improved crop water use efficiency calls for flexible modeling platforms to implement new ideas in plant root uptake and its regulation mechanisms. This paper documents the details of modifying a soil infiltration and redistribution model to include (a) dynamic root growth, (b) non-uniform root distribution and water uptake, (c) the effect of water stress on plant water uptake, and (d) soil evaporation. The paper also demonstrates strategies for using the modified model to simulate soil water dynamics and plant transpiration, considering different sensitivities of plants to soil dryness and different mechanisms of root water uptake. In particular, the flexibility of simulating various degrees of compensated uptake (whereby plants tend to maintain potential transpiration under mild water stress) is emphasized. The paper also describes how to estimate unknown root distribution and rooting depth parameters using a simulation-based search method. The full documentation of the computer code will allow further applications and new development.
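
As a sketch of a stress-response function and the compensated-uptake behaviour described above, the following uses a piecewise-linear stress factor and a simple compensation scheme. These are common modelling choices and all parameter values are illustrative; they are not necessarily the formulations implemented in the paper:

```python
import numpy as np

def stress_alpha(theta, theta_wilt=0.10, theta_crit=0.25):
    """Stress-response factor: 1 when the layer is wet enough,
    0 at the wilting point, linear in between (illustrative parameters)."""
    return np.clip((theta - theta_wilt) / (theta_crit - theta_wilt), 0.0, 1.0)

def root_uptake(theta, root_frac, t_pot, omega_c=0.5):
    """Per-layer uptake (same units as t_pot).
    theta: volumetric water content per layer; root_frac: root distribution."""
    alpha = stress_alpha(theta)
    uptake = alpha * root_frac * t_pot
    omega = (alpha * root_frac).sum()        # weighted stress index in [0, 1]
    if 0.0 < omega < 1.0:
        # Compensation: scale uptake up (bounded by the critical value omega_c)
        # so plants under mild stress draw more from wetter layers and keep
        # total transpiration closer to the potential rate t_pot.
        uptake = uptake / max(omega, omega_c)
    return uptake

theta = np.array([0.30, 0.20, 0.12])         # wet top layer, drier lower layers
root_frac = np.array([0.5, 0.3, 0.2])        # root distribution (sums to 1)
uptake = root_uptake(theta, root_frac, t_pot=5.0)
# Under this mild stress, compensation restores total uptake to t_pot.
```

Without the compensation branch, the same inputs would give a total uptake below the potential transpiration, which is the "various degrees of compensated uptake" behaviour the paper emphasizes.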