ResearchPad - tropical-diseases https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[New estimates of the Zika virus epidemic attack rate in Northeastern Brazil from 2015 to 2016: A modelling analysis based on Guillain-Barré Syndrome (GBS) surveillance data]]> https://www.researchpad.co/article/elastic_article_7754 The mandatory reporting of Zika virus (ZIKV) disease began region-wide in February 2016, and it is believed that ZIKV cases could have been highly under-reported before that. Given that Guillain-Barré syndrome (GBS) is relatively well reported, GBS surveillance data have the potential to act as a reasonably reliable proxy for inferring the true ZIKV epidemic. We developed a mathematical model incorporating weather effects to study the ZIKV-GBS epidemics and estimated the key epidemiological parameters. It was found that the attack rate of ZIKV was likely to be lower than 33% over the two epidemic waves. The risk of a symptomatic ZIKV case developing GBS was estimated to be approximately 0.0061%. The analysis suggests that it would be difficult for another ZIKV outbreak to appear in Northeastern Brazil in the near future.
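The reported quantities combine multiplicatively, so a brief worked illustration may help. The sketch below (Python) uses the attack-rate and GBS-risk point estimates quoted above, but the population size and symptomatic fraction are hypothetical placeholders, not values from the study.

```python
# Back-of-envelope illustration of how the reported estimates combine.
# Assumptions (not from the article): population size and symptomatic fraction.
population = 10_000_000        # hypothetical at-risk population
attack_rate = 0.33             # upper-bound attack rate over both waves (from the abstract)
symptomatic_fraction = 0.2     # hypothetical fraction of infections that are symptomatic
gbs_risk = 0.0061 / 100        # risk of GBS per symptomatic ZIKV case (from the abstract)

symptomatic_cases = population * attack_rate * symptomatic_fraction
expected_gbs = symptomatic_cases * gbs_risk
print(f"Expected GBS cases under these assumptions: {expected_gbs:.0f}")
```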

]]>
<![CDATA[Gene Conversion Transfers the GAF-A Domain of Phosphodiesterase TbrPDEB1 to One Allele of TbrPDEB2 of Trypanosoma brucei]]> https://www.researchpad.co/article/5989da89ab0ee8fa60b9d706

Background

Chromosome 9 of Trypanosoma brucei contains two closely spaced, very similar open reading frames for cyclic nucleotide specific phosphodiesterases TbrPDEB1 and TbrPDEB2. They are separated by 2379 bp, and both code for phosphodiesterases with two GAF domains in their N-terminal moieties and a catalytic domain at the C-terminus.

Methods and Findings

The current study reveals that in the Lister427 strain of T. brucei, these two genes have undergone gene conversion, replacing the coding region for the GAF-A domain of TbrPDEB2 by the corresponding region of the upstream gene TbrPDEB1. As a consequence, these strains express two slightly different versions of TbrPDEB2. TbrPDEB2a represents the wild-type phosphodiesterase, while TbrPDEB2b represents the product of the converted gene. Earlier work on the subcellular localization of TbrPDEB1 and TbrPDEB2 had demonstrated that TbrPDEB1 is predominantly located in the flagellum, whereas TbrPDEB2 partially locates to the flagellum but largely remains in the cell body. The current findings raised the question of whether this dual localization of TbrPDEB2 may reflect the two alleles. To resolve this, TbrPDEB2 of strain STIB247 that is homozygous for TbrPDEB2a was tagged in situ, and its intracellular localization was analyzed.

Conclusions

The results obtained were very similar to those found earlier with Lister427, indicating that the dual localization of TbrPDEB2 reflects its true function and is not simply due to the presence of the two different alleles. Notably, the gene conversion event is unique to the Lister427 strain and all its derivatives. Based on this finding, a convenient PCR test has been developed that allows stringent discrimination between Lister-derived strains, which are common in many laboratories, and other isolates. The technique is likely to be very useful for resolving questions about potential mix-ups of precious field isolates with the ubiquitous Lister strain.

]]>
<![CDATA[Gender norms and mass deworming program access in Comé, Benin: A qualitative assessment of gender-associated opportunities and challenges to achieving high mass drug administration coverage]]> https://www.researchpad.co/article/N0cbc3c85-9c5e-43fe-983c-4afc7d1b8db3

The World Health Organization’s Neglected Tropical Disease Roadmap has accelerated progress towards eliminating select neglected tropical diseases (NTDs). This momentum has catalyzed research to determine the feasibility of interrupting transmission of soil-transmitted helminths (STH) using community-wide mass drug administration (MDA). This study aims to identify potential gender-specific facilitators and barriers to accessing and participating in community-wide STH MDA, with the goal of ensuring programs are equitable and maximize the probability of interrupting STH transmission. This research was conducted prior to the launch of community-wide MDA for STH in Comé, Benin. A total of 10 focus group discussions (FGDs) were conducted separately among 40 men, 38 women, and 15 community drug distributors (CDDs). Salient themes included: both men and women believe that community-wide MDA would reduce the financial burden associated with self-treatment, particularly for low income adults. Community members believe MDA should be packaged alongside water, sanitation, and other health services. Women feel past community-wide programs have been disorganized and are concerned these distributions will be similar. Women also expressed interest in increased engagement in the implementation of future community-based public health programs. Men often did not perceive themselves to be at great risk for STH infection and did not express a high demand for treatment. Finally, the barriers discussed by CDDs generally did not align with gender-specific concerns, but rather represented concerns shared by both genders. A door-to-door distribution strategy for STH MDA is preferred by women in this study, as this platform empowers women to participate as health decision makers for their family. In addition, involving women in planning and implementation of community-wide programs may help to increase treatment coverage and compliance.

]]>
<![CDATA[Clustered micronodules as predominant manifestation on CT: A sign of active but indolently evolving pulmonary tuberculosis]]> https://www.researchpad.co/article/N48b20e2f-c3ed-4c3c-a251-c583ed3c8c8a

Objective

To investigate the prevalence, patient characteristics, and natural history of clustered micronodules (CMs) in active pulmonary tuberculosis.

Materials and methods

From January 2013 through July 2018, 833 consecutive patients with bacteriologically or polymerase chain reaction–proven active pulmonary tuberculosis were retrospectively evaluated. CMs were defined as a localized aggregation of multiple dense discrete micronodules, which were primarily distributed around small airways distal to the level of the segmental bronchus: small airways surrounded by CMs maintained luminal patency, and the CMs might coalesce into a larger nodule. The patients were dichotomized according to whether the predominant computed tomography (CT) abnormalities were CMs. We analyzed radiologic and pathologic findings in patients whose predominant diagnostic CT abnormalities were CMs, along with those of incidental pre-diagnostic CT scans, if available. Chi-square, McNemar, Student's t, and Wilcoxon signed-rank tests were performed.

Results

CMs were the predominant CT abnormality in 2.6% of the patients (22/833; 95% CI, 1.8–4.0%), who showed lower sputum smear-positivity (4.8% vs 31.0%; p = .010) and a similar proportion of immunocompromised status (40.9% vs 46.0%; p = .637) compared with patients in whom CMs were not the predominant CT abnormality. The time interval for minimal radiologic progression was 6.4 months. The extent of CMs increased with disease progression, frequently accompanied by consolidation and small airway wall thickening. Pathologically, smaller CMs were non-caseating granulomas confined to the peribronchiolar interstitium, whereas larger CMs were caseating granulomas involving lung parenchyma. Two of the five patients with a pre-diagnostic CT scan obtained more than 50 months before diagnosis showed an incipient stage of CMs, appearing as small peribronchiolar nodules.

Conclusion

Active pulmonary tuberculosis manifested predominantly as CMs in 2.6% of patients, with infrequent acid-fast bacilli smear-positivity and no association with impaired host immunity. CMs progressed indolently, accompanied by consolidation and small airway wall thickening, and originated from small nodules.

]]>
<![CDATA[Identification of cholera hotspots in Zambia: A spatiotemporal analysis of cholera data from 2008 to 2017]]> https://www.researchpad.co/article/Nb4ea4681-5c5d-42bd-a1ce-642b56a34f03

The global burden of cholera is increasing, with the majority (60%) of the cases occurring in sub-Saharan Africa. In Zambia, widespread cholera outbreaks have occurred since 1977, predominantly in the capital city of Lusaka. During both the 2016 and 2018 outbreaks, the Ministry of Health implemented cholera vaccination in addition to other preventative and control measures, to stop the spread and control the outbreak. Given the limitations in vaccine availability and the logistical support required for vaccination, oral cholera vaccine (OCV) is now recommended for use in high-risk areas ("hotspots") for cholera. Hence, the aim of this study was to identify areas with an increased risk of cholera in Zambia. Retrospective cholera case data from 2008 to 2017 were obtained from the Ministry of Health, Department of Public Health and Disease Surveillance. The Zambian Central Statistical Office provided district-level population data and socioeconomic and water, sanitation, and hygiene (WaSH) indicators. To identify districts at high risk, we performed a discrete Poisson-based space-time scan statistic to account for variations in cholera risk across both space and time over the 10-year study period. A zero-inflated negative binomial regression model was employed to identify district-level risk factors for cholera. The risk map was generated by classifying the relative risk of cholera in each district, as obtained from the space-time scan statistic. In total, 34,950 cases of cholera were reported in Zambia between 2008 and 2017. Cholera cases varied spatially by year. During the study period, Lusaka District had the highest burden of cholera, with 29,080 reported cases. The space-time scan statistic identified 16 districts at a significantly higher risk of cholera, with relative risks ranging from 1.25 to 78.87 times that of the rest of the country. Proximity to waterbodies was the only factor associated with increased risk for cholera (P<0.05). This study provides a basis for the cholera elimination program in Zambia. Outside Lusaka, the majority of high-risk districts identified were near the borders with the DRC, Tanzania, Mozambique, and Zimbabwe. This suggests that cholera in Zambia may be linked to movement of people from neighboring areas of cholera endemicity. A collaborative intervention program implemented in concert with neighboring countries could be an effective strategy for elimination of cholera in Zambia, while also reducing rates at a regional level.
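As a rough illustration of the district-level risk-factor analysis described above, the sketch below fits a zero-inflated negative binomial model with statsmodels. The data frame, variable names, and population offset are assumptions for illustration, not the study's actual dataset or specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

# Hypothetical district-level data; column names are illustrative only.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "cases":        rng.poisson(5, 100),          # cholera case counts
    "near_water":   rng.integers(0, 2, 100),      # proximity to a waterbody
    "improved_san": rng.uniform(0, 1, 100),       # WaSH indicator
    "log_pop":      np.log(rng.uniform(1e4, 1e6, 100)),
})

X = sm.add_constant(df[["near_water", "improved_san"]])
model = ZeroInflatedNegativeBinomialP(
    df["cases"], X,
    exog_infl=sm.add_constant(df[["near_water"]]),  # zero-inflation component
    offset=df["log_pop"],                           # population offset (assumption)
)
result = model.fit(method="bfgs", maxiter=500, disp=False)
print(result.summary())
```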

]]>
<![CDATA[Feasibility of establishing an HIV vaccine preparedness cohort in a population of the Uganda Police Force: Lessons learnt from a prospective study]]> https://www.researchpad.co/article/Ne890bb8a-5661-4c39-82f7-6f40a2e69675

Background

Members of uniformed armed forces are considered to be at high risk for HIV infection and have been proposed as suitable candidates for participation in HIV intervention studies. We report on the feasibility of recruitment and follow up of individuals from the community of the Uganda Police Force (UPF) for an HIV vaccine preparedness study.

Methods

HIV-negative volunteers aged 18–49 years were identified from UPF facilities situated in Kampala and Wakiso districts through community HIV counselling and testing. Potential volunteers were referred to the study clinic for screening, enrolment and quarterly visits for one year. HIV incidence and retention rates were estimated and expressed per 100 person-years of observation (PYO). Rate ratios from Poisson regression models were used to determine factors associated with retention.

Results

We screened 560 individuals to enroll 500 volunteers between November 2015 and May 2016. One HIV seroconversion occurred over 431 PYO, for an incidence rate of 0.23/100 PYO (95% confidence interval [CI]: 0.03–1.64). Overall, the retention rate was 87% at one year, and retention was independently associated with residence duration (compared with <1 year: 1 to 5 years, adjusted rate ratio (aRR) = 1.19, 95% CI: 1.00–1.44; >5 years, aRR = 1.34, 95% CI: 0.95–1.37); absence of genital discharge in the last 3 months (aRR = 1.97, 95% CI: 1.38–2.83); absence of genital ulcers (aRR = 1.90, 95% CI: 1.26–2.87); reporting a new sexual partner in the last month (aRR = 0.57, 95% CI: 0.45–0.71); being away from home for more than two nights (aRR = 1.27, 95% CI: 1.04–1.56, compared with those who had not travelled); and absence of knowledge on HIV prevention (aRR = 2.67, 95% CI: 1.62–4.39).
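To illustrate how rate ratios of this kind are typically obtained, the following sketch fits a Poisson model with a log person-time offset; the simulated data and variable names are placeholders, not the cohort's records.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical volunteer-level data; variable names are illustrative only.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "visits_attended": rng.poisson(3, 500),         # completed follow-up visits
    "pyo":             rng.uniform(0.5, 1.0, 500),  # person-years observed
    "residence_5yr":   rng.integers(0, 2, 500),     # >5 years residence
    "new_partner":     rng.integers(0, 2, 500),
})

# Poisson GLM with a log person-time offset; exponentiated coefficients are
# rate ratios analogous to the aRRs reported above.
model = smf.glm(
    "visits_attended ~ residence_5yr + new_partner",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["pyo"]),
).fit()
print(np.exp(model.params))      # rate ratios
print(np.exp(model.conf_int()))  # 95% CIs
```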

Conclusions

While our study demonstrates the feasibility of recruiting and retaining individuals from the UPF for HIV research, we observed a lower than anticipated HIV incidence, perhaps because individuals at lower risk of HIV infection were the first to come forward to participate, or because participants followed HIV risk-reduction measures. Our findings offer lessons for recruitment of populations at high risk of HIV infection.

]]>
<![CDATA[Antibiotic use for Australian Aboriginal children in three remote Northern Territory communities]]> https://www.researchpad.co/article/N999fa4e6-a15c-456a-862e-2e1ce88316a9

Objective

To describe antibiotic prescription rates for Australian Aboriginal children aged <2 years living in three remote Northern Territory communities.

Design

A retrospective cohort study using electronic health records.

Setting

Three primary health care centres located in the Katherine East region.

Participants

Consent was obtained from 149 mothers to extract data from 196 child records. There were 124 children born between January 2010 and July 2014 who resided in one of the three chosen communities and had electronic health records for their first two years of life.

Main outcome measures

Antibiotic prescription rates, factors associated with antibiotic prescription and factors associated with appropriate antibiotic prescription.

Results

There were 5,675 Primary Health Care (PHC) encounters for 124 children (median 41, IQR 25.5, 64). Of the 5,675 PHC encounters, 1,542 (27%) recorded at least one infection (total 1,777) and 1,330 (23%) had at least one antibiotic prescription recorded (total 1,468). Children had a median of five (IQR 2, 9) prescriptions in both their first and second years of life, with a prescription rate of 5.99 per person-year (95% CI 5.35, 6.63). Acute otitis media was the most common infection (683 records, 38%) and amoxycillin was the most commonly prescribed antibiotic (797 prescriptions, 54%). Of the 1,468 recorded prescriptions, 398 (27%) had no infection recorded and 116 (8%) with an infection recorded were not aligned with local treatment guidelines.

Conclusion

Prescription rates for Australian Aboriginal children in these communities are significantly higher than those reported nationally for non-Aboriginal Australians. Prescriptions predominantly aligned with treatment guidelines in this setting, where there is a high burden of infectious disease.

]]>
<![CDATA[Will COVID-19 become the next neglected tropical disease?]]> https://www.researchpad.co/article/N3ea3a36c-f707-4121-8cab-1c306bbb1993 ]]> <![CDATA[Ivermectin as an adjuvant to anti-epileptic treatment in persons with onchocerciasis-associated epilepsy: A randomized proof-of-concept clinical trial]]> https://www.researchpad.co/article/N2a703e18-6320-408f-bd4d-1f677396d877

Introduction

Recent findings from onchocerciasis-endemic foci indicate that increasing ivermectin coverage reduces epilepsy incidence, and anecdotal evidence suggests a reduction in seizure frequency in persons with onchocerciasis-associated epilepsy treated with ivermectin. We conducted a randomized clinical trial to assess whether ivermectin treatment decreases seizure frequency.

Methods

A proof-of-concept randomized clinical trial was conducted in the Logo health zone in the Ituri province, Democratic Republic of Congo, to compare seizure frequencies in onchocerciasis-infected persons with epilepsy (PWE) randomized to one of two treatment arms: the anti-epileptic drug phenobarbital supplemented with ivermectin, versus phenobarbital alone. The primary endpoint was defined as the probability of being seizure-free at month 4. A secondary endpoint was defined as >50% reduction in seizure frequency at month 4, compared to baseline. Both endpoints were analyzed using multiple logistic regression. In longitudinal analysis, the probability of seizure freedom during the follow-up period was assessed for both treatment arms by fitting a logistic regression model using generalized estimating equations (GEE).
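A minimal sketch of a logistic GEE of the kind described above follows; the simulated long-format data, column names, and exchangeable working correlation are illustrative assumptions rather than the trial's actual analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical long-format trial data: one row per participant per visit.
rng = np.random.default_rng(0)
n, visits = 90, 4
df = pd.DataFrame({
    "pid":        np.repeat(np.arange(n), visits),
    "month":      np.tile(np.arange(1, visits + 1), n),
    "ivermectin": np.repeat(rng.integers(0, 2, n), visits),
})
df["seizure_free"] = rng.binomial(1, 0.4 + 0.05 * df["ivermectin"])

# Logistic GEE with an exchangeable working correlation to account for
# repeated measurements within the same participant.
model = sm.GEE.from_formula(
    "seizure_free ~ ivermectin + month",
    groups="pid",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(model.params))  # odds ratios per covariate
```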

Results

Ninety PWE enrolled between October and November 2017 were eligible for analysis. A multiple logistic regression analysis showed a borderline association between ivermectin treatment and being seizure-free at month 4 (OR: 1.652, 95% CI 0.975–2.799; p = 0.062). There was no significant difference in the probability of experiencing >50% reduction of the seizure frequency at month 4 between the two treatment arms. Also, treatment with ivermectin did not significantly increase the odds of being seizure-free during the individual follow-up visits.

Conclusion

Whether ivermectin has added value in reducing the frequency of seizures in PWE treated with anti-epileptic drugs (AEDs) remains to be determined. A larger study in persons with onchocerciasis-associated epilepsy (OAE) on a stable AED regimen, and in persons with recent epilepsy onset, should be considered to further investigate the potential beneficial effect of ivermectin treatment in persons with OAE.

Trial registration

Registration: www.clinicaltrials.gov; NCT03052998.

]]>
<![CDATA[Crystal structures of Triosephosphate Isomerases from Taenia solium and Schistosoma mansoni provide insights for vaccine rationale and drug design against helminth parasites]]> https://www.researchpad.co/article/N340e3046-cb91-4c84-8d1b-fb2a65cf4cdb

Triosephosphate isomerases (TPIs) from Taenia solium (TsTPI) and Schistosoma mansoni (SmTPI) are potential vaccine and drug targets against cysticercosis and schistosomiasis, respectively. This is due to the dependence of parasitic helminths on glycolysis and because those proteins elicit an immune response, presumably due to their surface localization. Here we report the crystal structures of TsTPI and SmTPI in complex with 2-phosphoglyceric acid (2-PGA). Both TPIs fold into a dimeric (β-α)₈ barrel in which the dimer interface consists of α-helices 2, 3, and 4, and swapping of loop 3. TPIs from parasitic helminths harbor a region of three amino acids known as the SXD/E insert (S155 to E157 and S157 to D159 in TsTPI and SmTPI, respectively). This insert is located between α5 and β6 and is proposed to be the main TPI epitope. This region is part of a solvent-exposed 3₁₀-helix that folds into a hook-like structure. The crystal structures of TsTPI and SmTPI predicted conformational epitopes that could be used for vaccine design. Surprisingly, the epitopes corresponding to the SXD/E inserts are not the ones with the greatest immunological potential. SmTPI, but not TsTPI, harbors a single solvent-exposed cysteine (SmTPI-S230), and alterations in this residue decrease catalysis. The latter suggests that thiol-conjugating agents could be used to target SmTPI. In sum, the crystal structures of SmTPI and TsTPI are a blueprint for targeted schistosomiasis and cysticercosis drug and vaccine development.

]]>
<![CDATA[A mathematical model for assessing the effectiveness of controlling relapse in Plasmodium vivax malaria endemic in the Republic of Korea]]> https://www.researchpad.co/article/Nf3d8dda1-10e2-4286-9776-07d534017a03

Malaria has remained endemic near the Demilitarized Zone in the Republic of Korea since the re-emergence of Plasmodium vivax malaria in 1993. The number of patients affected by malaria has increased recently despite many control tools, one reason being the relapse of malaria via liver hypnozoites. Tafenoquine, a new drug approved by the United States Food and Drug Administration in 2018, is expected to reduce the rate of relapse from hypnozoites and thereby decrease the prevalence of malaria in the population. In this work, we developed a new transmission model for Plasmodium vivax that takes into account a more realistic intrinsic distribution from the existing literature to quantify current values of the relapse parameters and to evaluate the effectiveness of anti-relapse therapy. The model is especially suitable for estimating parameters near the Demilitarized Zone in Korea, where the disease follows a distinguishable seasonality. Results showed that radical cure could significantly reduce the prevalence of malaria. However, eradication would still take a long time (over 10 years) even if the high-level treatment were to persist. In addition, considering that the vector's behavior is manipulated by the malaria parasite, relapse repression with vector control kept at its current level may have a negative effect on containing the disease. We conclude that the use of effective drugs should be considered together with an increased level of vector control to reduce malaria prevalence.
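As a rough sketch of how a dormant-liver-stage (hypnozoite) compartment can be added to a transmission model, the code below integrates a simple SIR-type system with relapse. This is an illustrative toy model with made-up parameters, not the authors' model.

```python
import numpy as np
from scipy.integrate import odeint

# Toy S-I-L-R model: L holds dormant hypnozoite carriers who can relapse to I.
# All parameter values are illustrative placeholders.
def vivax_relapse(y, t, beta, gamma, rho, delta, tau):
    S, I, L, R = y
    N = S + I + L + R
    dS = -beta * S * I / N
    dI = beta * S * I / N + rho * L - gamma * I       # new infections plus relapses
    dL = delta * gamma * I - rho * L - tau * L        # fraction delta becomes dormant
    dR = (1 - delta) * gamma * I + tau * L            # recovery or clearance of hypnozoites
    return [dS, dI, dL, dR]

t = np.linspace(0, 365 * 3, 1000)                     # three years, in days
y0 = [49_990, 10, 0, 0]
params = (0.08, 1 / 14, 1 / 180, 0.3, 1 / 400)        # beta, gamma, rho, delta, tau
S, I, L, R = odeint(vivax_relapse, y0, t, args=params).T
print(f"Peak prevalence in the toy model: {I.max():.0f} cases")
```

Increasing tau (faster hypnozoite clearance, e.g. through radical cure) in this toy system shrinks the relapse reservoir, which is the qualitative effect the study evaluates.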

]]>
<![CDATA[Prediction model for dengue fever based on interactive effects between multiple meteorological factors in Guangdong, China (2008–2016)]]> https://www.researchpad.co/article/Nfe4e2064-ca0a-4d6d-a8b7-4f75eb296e9a

Introduction

In order to improve the prediction accuracy of dengue fever incidence, we constructed a prediction model with interactive effects between meteorological factors, based on weekly dengue fever cases in Guangdong, China from 2008 to 2016.

Methods

Dengue fever data were derived from statistical data from the China National Notifiable Infectious Disease Reporting Information System. Daily meteorological data were obtained from the China Integrated Meteorological Information Sharing System. The minimum temperature for transmission was identified using data fitting and the Ross-Macdonald model. Correlations and interactive effects were examined using Spearman’s rank correlation and multivariate analysis of variance. A probit regression model to describe the incidence of dengue fever from 2008 to 2016 and forecast the 2017 incidence was constructed, based on key meteorological factors, interactive effects, mosquito-vector factors, and other important factors.

Results

We found that the minimum temperature suitable for dengue transmission was 18°C; as 97.91% of cases occurred when the minimum temperature was above 18°C, these data were used for model training and construction. Epidemics of dengue are related to mean temperature, maximum/minimum and mean atmospheric pressure, and mean relative humidity. Moreover, interactions occur between mean temperature, minimum atmospheric pressure, and mean relative humidity. The goodness of fit of our weekly probit regression prediction model was 0.72, and prediction of dengue cases for the first 41 weeks of 2017 exhibited a goodness of fit of 0.60.
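For readers unfamiliar with probit models that include meteorological interaction terms, a minimal sketch follows. The simulated data, variable names, and binary weekly outcome are assumptions for illustration only, not the study's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly data; the binary outcome here is simply "any dengue
# reported that week", which is an assumption for the sketch.
rng = np.random.default_rng(1)
weeks = 470
df = pd.DataFrame({
    "mean_temp":    rng.uniform(10, 32, weeks),
    "min_pressure": rng.uniform(995, 1020, weeks),
    "mean_rh":      rng.uniform(40, 95, weeks),
})
df["dengue_week"] = (
    rng.random(weeks) < 1 / (1 + np.exp(-0.2 * (df["mean_temp"] - 25)))
).astype(int)

# The '*' operator expands to both main effects plus their interaction term.
model = smf.probit(
    "dengue_week ~ mean_temp * mean_rh + min_pressure", data=df
).fit(disp=False)
print(model.summary())
```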

Conclusion

Our model was accurate and timely, with consideration of interactive effects between meteorological factors.

]]>
<![CDATA[The validity, reliability and minimal clinically important difference of the patient specific functional scale in snake envenomation]]> https://www.researchpad.co/article/5c8823c4d5eed0c484638faf

Objective

Valid, reliable, and clinically relevant outcome measures are necessary in clinical studies of snake envenomation. The aim of this study was to evaluate the psychometric (validity and reliability) and clinimetric (minimal clinically important difference [MCID]) properties of the Patient-Specific Functional Scale (PSFS) in snakebite envenomation.

Methods

We performed a secondary analysis of two existing snakebite trials that measured clinical outcomes using the PSFS as well as other quality of life and functional assessments. Data were collected at 3, 7, 10, and 17 days. Reliability was determined using Cronbach’s alpha for internal consistency and the intraclass correlation coefficient (ICC) for temporal stability at 10 and 17 days. Validity was assessed as concurrent validity, correlating the PSFS with the other assessments. The MCID was evaluated using the following criteria: (1) the distribution of stable patients according to both standard error of measurement (SEM) and responsiveness techniques, and (2) anchor-based methods to compare between individuals and to detect the discriminant ability of a positive change with a receiver operating characteristic (ROC) curve and optimal cutoff point.
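A compact sketch of the anchor-based ROC step is shown below. The simulated anchor and score-change data are placeholders, and Youden's J is used as one common way to pick the optimal cutoff; the study's exact cutoff criterion may differ.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical data: 'improved' is an external anchor (1 = clinically improved),
# 'change' is the change in PSFS score between visits.
rng = np.random.default_rng(2)
improved = rng.integers(0, 2, 86)
change = rng.normal(loc=0.5 + 1.5 * improved, scale=1.5)

fpr, tpr, thresholds = roc_curve(improved, change)
youden_j = tpr - fpr
mcid = thresholds[np.argmax(youden_j)]   # cutoff maximizing sensitivity + specificity - 1
print(f"Anchor-based MCID estimate: {mcid:.2f} PSFS points")
```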

Results

A total of 86 patients were evaluated in this study. The average PSFS scores were 5.37 (SD 3.23), 7.95 (SD 2.22), and 9.12 (SD 1.37) at 3, 7, and 10 days, respectively. Negligible floor effect was observed (maximum of 8% at 3 days); however, a ceiling effect was observed at 17 days (25%). The PSFS showed good reliability with an internal consistency of 0.91 (Cronbach’s alpha) (95% CI 0.88, 0.95) and a temporal stability of 0.83 (ICC) (95% CI 0.72, 0.89). The PSFS showed a strong positive correlation with quality of life and functional assessments. The MCID was approximately 1.0 for all methods.

Conclusions

With an MCID of approximately 1 point, the PSFS is a valid and reliable tool to assess quality of life and functionality in patients with snake envenomation.

]]>
<![CDATA[Cutaneous leishmaniasis and co-morbid major depressive disorder: A systematic review with burden estimates]]> https://www.researchpad.co/article/5c7d95d9d5eed0c484734dd0

Background

Major depressive disorder (MDD) associated with chronic neglected tropical diseases (NTDs) has been identified as a significant and overlooked contributor to overall disease burden. Cutaneous leishmaniasis (CL) is one of the most prevalent and stigmatising NTDs, with an incidence of around 1 million new cases of active CL infection annually. However, the characteristic residual scarring (inactive CL) following almost all cases of active CL has only recently been recognised as part of the CL disease spectrum due to its lasting psychosocial impact.

Methods and findings

We performed a multi-language systematic review of the psychosocial impact of active and inactive CL. We estimated inactive CL (iCL) prevalence for the first time using reported WHO active CL (aCL) incidence data that were adjusted for life expectancy and underreporting. We then quantified the disability (YLD) burden of co-morbid MDD in CL using MDD disability weights at three severity levels. Overall, we identified 29 studies of CL psychological impact from 5 WHO regions, representing 11 of the 50 highest burden countries for CL. We conservatively calculated the disability burden of co-morbid MDD in CL to be 1.9 million YLDs, which equalled the overall (DALY) disease burden (assuming no excess mortality in depressed CL patients). Thus, upon inclusion of co-morbid MDD alone in both active and inactive CL, the DALY burden was seven times higher than the latest 2016 Global Burden of Disease study estimates, which notably omitted both psychological impact and inactive CL.
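The YLD calculation reduces to multiplying prevalent co-morbid cases by severity-specific disability weights. The sketch below illustrates that arithmetic using approximate published GBD disability weights for MDD; the case count and severity mix are hypothetical placeholders, not the study's adjusted estimates.

```python
# YLD = prevalent cases x disability weight, summed over severity levels.
prevalent_cl_with_mdd = 4_000_000   # hypothetical number of co-morbid MDD cases
mdd_weights = {"mild": 0.145, "moderate": 0.396, "severe": 0.658}  # approximate GBD weights
severity_split = {"mild": 0.6, "moderate": 0.3, "severe": 0.1}     # hypothetical severity mix

yld = sum(prevalent_cl_with_mdd * severity_split[s] * mdd_weights[s] for s in mdd_weights)
print(f"Co-morbid MDD burden under these assumptions: {yld / 1e6:.1f} million YLDs")
```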

Conclusions

Failure to include co-morbid MDD and the lasting sequelae of chronic NTDs, as exemplified by CL, leads to large underestimates of overall disease burden.

]]>
<![CDATA[Urea-mediated dissociation alleviate the false-positive Treponema pallidum-specific antibodies detected by ELISA]]> https://www.researchpad.co/article/5c8823e2d5eed0c484639234

The serological detection of antibodies to Treponema pallidum is essential to the diagnosis of syphilis. However, because of cross-reactions, specific antibody tests [e.g., the enzyme-linked immunosorbent assay (ELISA)] can give false-positive results. In this study, we developed and validated a urea dissociation procedure in an attempt to reduce false-positive detection of antibodies to T. pallidum by ELISA. Six serum samples that were false-positive for antibodies to T. pallidum by ELISA, and 16 control serum samples (8 sera positive for both specific IgG and IgM, and 8 IgG-positive and IgM-negative sera), were collected to select the appropriate urea concentration and dissociation time. Our goal was to establish an improved ELISA method based on the original ELISA detection system. The sensitivity of the improved ELISA was evaluated using 275 serum samples with IgM-class antibodies to T. pallidum. After dissociation with 6 mol/L urea for 10 minutes, the 6 false-positive samples were converted to negative, in contrast to samples with true-positive antibodies to T. pallidum. The sensitivity of the improved ELISA was 100% when detecting IgM-class antibodies to T. pallidum in sera of patients with syphilis. Given its importance to the diagnosis of syphilis, positive detection of antibodies to T. pallidum in serum samples should be retested by the improved ELISA method to avoid false-positive results.

]]>
<![CDATA[Projections of Ebola outbreak size and duration with and without vaccine use in Équateur, Democratic Republic of Congo, as of May 27, 2018]]> https://www.researchpad.co/article/5c8accd5d5eed0c4849900f7

As of May 27, 2018, 6 suspected, 13 probable and 35 confirmed cases of Ebola virus disease (EVD) had been reported in Équateur Province, Democratic Republic of Congo. We used reported case counts and time series from prior outbreaks to estimate the total outbreak size and duration with and without vaccine use. We modeled Ebola virus transmission using a stochastic branching process model that included reproduction numbers from past Ebola outbreaks and a particle filtering method to generate a probabilistic projection of the outbreak size and duration conditioned on its reported trajectory to date, modeled using high (62%), low (44%), and zero (0%) estimates of vaccination coverage (after deployment). Additionally, we used the time series for 18 prior Ebola outbreaks from 1976 to 2016 to parameterize a Theil-Sen regression model predicting the outbreak size from the number of observed cases from April 4 to May 27. We used these techniques on probable and confirmed case counts with and without inclusion of suspected cases. Probabilistic projections were scored against the actual outbreak size of 54 EVD cases, using a log-likelihood score. With the stochastic model, using high, low, and zero estimates of vaccination coverage, the median outbreak sizes for probable and confirmed cases were 82 cases (95% prediction interval [PI]: 55, 156), 104 cases (95% PI: 58, 271), and 213 cases (95% PI: 64, 1450), respectively. With the Theil-Sen regression model, the median outbreak size was estimated to be 65.0 probable and confirmed cases (95% PI: 48.8, 119.7). Among our three mathematical models, the stochastic model with suspected cases and high vaccine coverage predicted total outbreak sizes closest to the true outcome. Relatively simple mathematical models updated in real time may inform outbreak response teams with projections of total outbreak size and duration.
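The Theil-Sen step can be illustrated with scipy. In the sketch below, the historical (cases-observed-so-far, final-size) pairs are invented placeholders rather than the 18-outbreak dataset; only the current probable-plus-confirmed count (13 + 35 = 48) is taken from the abstract.

```python
import numpy as np
from scipy.stats import theilslopes

# Hypothetical historical outbreaks: cases observed by an early cut-off date
# versus eventual total outbreak size.
observed_so_far = np.array([10, 25, 40, 60, 90, 150, 200, 320])
final_size      = np.array([20, 60, 95, 120, 250, 310, 400, 600])

# Theil-Sen: the median of pairwise slopes, robust to outlying outbreaks.
slope, intercept, lo_slope, hi_slope = theilslopes(final_size, observed_so_far)

current_count = 48   # probable + confirmed cases reported by May 27, 2018
projection = slope * current_count + intercept
print(f"Projected final size under these placeholder data: {projection:.0f} cases")
```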

]]>
<![CDATA[Long term outcomes and prognostics of visceral leishmaniasis in HIV infected patients with use of pentamidine as secondary prophylaxis based on CD4 level: a prospective cohort study in Ethiopia]]> https://www.researchpad.co/article/5c784fedd5eed0c48400792b

Background

The long-term treatment outcome of visceral leishmaniasis (VL) patients with HIV co-infection is complicated by a high rate of relapse, especially when the CD4 count is low. Although use of secondary prophylaxis is recommended, it is not routinely practiced and data on its effectiveness and safety are limited.

Methods

A prospective cohort study was conducted in Northwest Ethiopia from August 2014 to August 2017 (NCT02011958). HIV-VL patients were followed for up to 12 months. Patients with CD4 cell counts below 200/μL at the end of VL treatment received pentamidine prophylaxis starting one month after parasitological cure, while those with CD4 counts ≥200 cells/μL were followed without secondary prophylaxis. Compliance, safety, and relapse-free survival (estimated using Kaplan-Meier methods to account for variable time at risk) were summarised. Risk factors for relapse or death were analysed.
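A minimal sketch of the relapse-free survival estimation is shown below, assuming the lifelines package; the simulated follow-up times, event indicators, and grouping variable are illustrative only.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical patient-level data with variable follow-up time.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "months_at_risk":   rng.uniform(1, 12, 54),     # variable time at risk
    "relapse_or_death": rng.integers(0, 2, 54),     # event indicator
    "prophylaxis":      rng.integers(0, 2, 54),     # pentamidine group
})

kmf = KaplanMeierFitter()
for group, label in [(1, "pentamidine"), (0, "no prophylaxis")]:
    mask = df["prophylaxis"] == group
    kmf.fit(df.loc[mask, "months_at_risk"],
            df.loc[mask, "relapse_or_death"], label=label)
    # Relapse-free survival probability at 12 months for this group.
    print(label, kmf.predict(12))
```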

Results

Fifty-four HIV patients were followed. The probability of relapse-free survival at one year was 50% (95% confidence interval [CI]: 35–63%): 53% (30–71%) in 22 patients with CD4 ≥200 cells/μL without pentamidine prophylaxis and 46% (26–63%) in 29 with CD4 <200 cells/μL who started pentamidine. Three patients with CD4 <200 cells/μL did not start pentamidine. Amongst those with CD4 ≥200 cells/μL, VL relapse was an independent risk factor for subsequent relapse or death (adjusted rate ratio: 5.42, 95% CI: 1.1–25.8). Except for one case of renal failure which was considered possibly related to pentamidine, there were no drug-related safety concerns.

Conclusion

The relapse-free survival rate for VL patients with HIV was low. Relapse-free survival of patients with CD4 counts <200 cells/μL given pentamidine secondary prophylaxis appeared to be comparable to that of patients with CD4 counts ≥200 cells/μL not given prophylaxis. Patients with relapsed VL are at higher risk for subsequent relapse and should be considered a priority for secondary prophylaxis, irrespective of their CD4 count.

]]>
<![CDATA[Behavior and abundance of Anopheles darlingi in communities living in the Colombian Amazon riverside]]> https://www.researchpad.co/article/5c8acc3ed5eed0c48498f2cc

In the past few years, the relative frequencies of malaria parasite species in communities living along the Colombian Amazon riverside have changed, with Plasmodium vivax (61.4%) and Plasmodium malariae (43.8%) being the most frequent. Given this epidemiological scenario, it is important to determine the anopheline species involved in these parasites’ transmission. This study was carried out in June 2016 in two indigenous communities living close to tributaries of the Amazon River, using protected human bait. The results showed a total abundance of 1,085 mosquitoes, of which 99.2% corresponded to Anopheles darlingi. Only two anopheline species were found, indicating low diversity in the study areas. Molecular confirmation of some individuals was followed by evolutionary analysis using the COI gene. Nested PCR was used to identify the three Plasmodium species circulating in the study areas. Of the two species collected, 21.0% of the An. darlingi mosquitoes were infected with P. malariae, 21.9% with P. vivax and 10.3% with Plasmodium falciparum. An. darlingi exhibited exophilic and exophagic behavior in both study areas, with marked differences in abundance between communities (Tipisca first sampling 49.4%, Tipisca second sampling 39.6% and Doce de Octubre 10.9%). Interestingly, An. mattogrossensis infected by P. vivax was found for the first time in Colombia (in two of the four females collected). Analysis of An. darlingi COI gene diversity indicated a single population maintaining high gene flow between the study areas. The An. darlingi behavior pattern found in both communities represents a risk factor for the region’s inhabitants living and working near these sites. This highlights the need for vector control efforts, such as personal repellents and insecticides for use on cattle, which must be made available in order to reduce this anopheline’s abundance.

]]>
<![CDATA[Implementation of a practical and effective pilot intervention against transmission of Taenia solium by pigs in the Banke district of Nepal]]> https://www.researchpad.co/article/5c7d95d7d5eed0c484734daa

Taenia solium is a zoonotic cestode parasite which causes human neurocysticercosis. Pigs transmit the parasite by acting as the intermediate host. An intervention was implemented to control transmission of T. solium by pigs in Dalit communities of Banke District, Nepal. Every 3 months, pigs were vaccinated with the TSOL18 recombinant vaccine (Cysvax, IIL, India) and, at the same time, given an oral treatment with 30 mg/kg oxfendazole (Paranthic 10%, MCI, Morocco). The prevalence of porcine cysticercosis was determined in both an intervention area and a similar no-intervention control area, among randomly selected, slaughter-age pigs. Post-mortem assessments were undertaken both at the start and at the end of the intervention. Participants conducting the post-mortem assessments were blinded as to the source of the animals being assessed. At the start of the intervention, the prevalence of porcine cysticercosis was 23.6% and 34.5% in the control and intervention areas, respectively. Following the intervention, the prevalence of cysticercosis in pigs from the control area was 16.7% (no significant change), whereas no infection was detected after complete slicing of all muscle tissue and brain in animals from the intervention area (P = 0.004). These findings are discussed in relation to the feasibility and sustainability of T. solium control. The 3-monthly vaccination and drug treatment intervention in pigs used here is suggested as an effective and practical method for reducing T. solium transmission by pigs. The results suggest that applying the intervention over a period of years may ultimately reduce the number of tapeworm carriers and thereby the incidence of neurocysticercosis (NCC).
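The before/after prevalence comparison reported above is the kind of contrast typically tested with Fisher's exact test. The sketch below shows the mechanics only; the counts and denominators are hypothetical, not the study's slaughter-age sample sizes.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: [infected, not infected] for post-intervention vs. baseline.
post_counts     = [0, 20]   # no infection detected after the intervention (hypothetical n)
baseline_counts = [7, 13]   # roughly one-third infected at baseline (hypothetical n)

odds_ratio, p_value = fisher_exact([post_counts, baseline_counts])
print(f"Fisher's exact p-value for these hypothetical counts: {p_value:.3f}")
```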

]]>
<![CDATA[Comparative performance of four rapid Ebola antigen-detection lateral flow immunoassays during the 2014-2016 Ebola epidemic in West Africa]]> https://www.researchpad.co/article/5c8acc8bd5eed0c48498f9b7

Background

Without an effective vaccine, as was the case early in the 2014–2016 Ebola Outbreak in West Africa, disease control depends entirely on interrupting transmission through early disease detection and prompt patient isolation. Lateral Flow Immunoassays (LFI) are a potential supplement to centralized reference laboratory testing for the early diagnosis of Ebola Virus Disease (EVD).

The goal of this study was to assess the performance of commercially available simple and rapid antigen detection LFIs, submitted for review to the WHO via the Emergency Use Assessment and Listing procedure. The study was performed in an Ebola Treatment Centre laboratory involved in EVD testing in Sierra Leone.

In light of the current Ebola outbreak in May 2018 in the Democratic Republic of Congo, which highlights the lack of clarity in the global health community about appropriate Ebola diagnostics, our findings are increasingly critical.

Methods

A cross-sectional study was conducted to assess the comparative performance of four LFIs for detecting EVD. LFIs were assessed against the same 328 plasma samples and 100 whole EDTA blood samples, using the altona RealStar Filovirus Screen real-time RT-PCR as the benchmark assay. The performance of the Public Health England (PHE) in-house Zaire ebolavirus-specific real-time RT-PCR Trombley assay was concurrently assessed. Statistical analysis using generalized estimating equations was conducted to compare LFI performance.

Findings

Sensitivity and specificity varied between the LFIs, with specificity found to be significantly higher for whole EDTA blood samples compared to plasma samples in at least 2 LFIs (P≤0.003). Using the altona RT-PCR assay as the benchmark, sensitivities on plasma samples ranged from 79.53% (101/127, 95% CI: 71.46–86.17%) for the DEDIATEST EBOLA (SD Biosensor) to 98.43% (125/127, 95% CI: 94.43–99.81%) for the One step Ebola test (Intec). Specificities ranged from 80.20% (158/197, 95% CI: 74.07–88.60%) for plasma samples using the ReEBOV Antigen test Kit (Corgenix) to 100.00% (98/98, 95% CI: 96.31–100.00%) for whole blood samples using the DEDIATEST EBOLA (SD Biosensor) and SD Ebola Zaire Ag (SD Biosensor). Results also showed that the Trombley RT-PCR assay had a lower limit of detection than the altona assay, with some LFIs having higher sensitivity than the altona assay when the Trombley assay was the benchmark.
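Point estimates with exact binomial confidence intervals, like the figures quoted above, can be reproduced along the following lines; the use of the Clopper-Pearson method here is an assumption, as the abstract does not state which interval method was used.

```python
from statsmodels.stats.proportion import proportion_confint

# Sensitivity (or specificity) as a simple proportion with an exact
# (Clopper-Pearson) 95% confidence interval.
def diagnostic_accuracy(true_calls, total):
    est = true_calls / total
    lo, hi = proportion_confint(true_calls, total, alpha=0.05, method="beta")
    return est, lo, hi

# Example counts taken from the abstract (One step Ebola test vs. altona RT-PCR).
sens, lo, hi = diagnostic_accuracy(125, 127)
print(f"Sensitivity {sens:.2%} (95% CI {lo:.2%}-{hi:.2%})")
```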

Interpretation

All of the tested EVD LFIs may be considered suitable for use in an outbreak situation (i.e. rule out testing in communities), although they had variable performance characteristics, with none possessing both high sensitivity and specificity. The non-commercial Trombley Zaire ebolavirus RT-PCR assay warrants further investigation, as it appeared more sensitive than the current gold standard, the altona Filovirus Screen RT-PCR assay.

]]>