ResearchPad - infectious-disease-control https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Vaccination and monitoring strategies for epidemic prevention and detection in the Channel Island fox (<i>Urocyon littoralis</i>)]]> https://www.researchpad.co/article/elastic_article_15750 Disease transmission and epidemic prevention are top conservation concerns for wildlife managers, especially for small, isolated populations. Previous studies have shown that the course of an epidemic within a heterogeneous host population is strongly influenced by whether pathogens are introduced to regions of relatively high or low host densities. This raises the question of how disease monitoring and vaccination programs are influenced by spatial heterogeneity in host distributions. We addressed this question by modeling vaccination and monitoring strategies for the Channel Island fox (Urocyon littoralis), which has a history of substantial population decline due to introduced disease. We simulated various strategies to detect and prevent epidemics of rabies and canine distemper using a spatially explicit model, which was parameterized from field studies. Increasing sentinel monitoring frequency and, to a lesser degree, increasing the number of monitored sentinels from 50 to 150 radio-collared animals reduced the time to epidemic detection and the percentage of the fox population infected at the time of detection for both pathogens. Fox density at the location of pathogen introduction had little influence on the time to detection, but a large influence on how many foxes had become infected by the detection day, especially when sentinels were monitored relatively infrequently. The efficacy of different vaccination strategies was heavily influenced by local host density at the site of pathogen entry. Generally, creating a vaccine firewall far away from the site of pathogen entry was the least effective strategy. 
A firewall close to the site of pathogen entry was generally more effective than a random distribution of vaccinated animals when pathogens entered regions of high host density, but not when pathogens entered regions of low host density. These results highlight the importance of considering host densities at likely locations of pathogen invasion when designing disease management plans.

]]>
<![CDATA[Serum and cervicovaginal IgG immune responses against α7 and α9 HPV in non-vaccinated women at risk for cervical cancer: Implication for catch-up prophylactic HPV vaccination]]> https://www.researchpad.co/article/elastic_article_15726 Cervical cancer associated with high-risk human papillomavirus (HR-HPV) infection is becoming one of the most common female cancers in many sub-Saharan African countries. First-generation African immigrant women living in Europe are at risk for cervical cancer, in a context of social vulnerability, with frequent lack of cervical cancer screening and HPV vaccination. Objective: Our objective was to address immunologically the issue of catch-up prophylactic HPV vaccination in first-generation African immigrant women living in France. Methods: IgG immune responses and cross-reactivities to α7 (HPV-18, -45 and -68) and α9 (HPV-16, -31, -33, -35, -52 and -58) HPV types, including 7 HR-HPV types targeted by the Gardasil-9® prophylactic vaccine, were evaluated in paired serum and cervicovaginal secretions (CVS) by HPV L1 virus-like particle-based ELISA. Genital HPV were detected by multiplex real-time PCR (Seegene, Seoul, South Korea). Results: Fifty-one immigrant women (mean age, 41.7 years; 72.5% HIV-infected) were prospectively included. More than two-thirds (68.6%) of them carried genital HPV (group I), while 31.4% were negative (group II). The majority (90.2%) exhibited serum IgG to at least one α7/α9 HR-HPV. Serum HPV-specific IgG were more frequently detected in group I than group II (100% versus 68.7%; P = 0.002). The distribution of serum and genital HPV-specific IgG was similar, but the mean number of IgG reactivities to α7/α9 HR-HPV was higher in serum than in CVS (5.6 IgG per woman in serum versus 3.2 in CVS; P<0.001). Rates of IgG cross-reactivities against HPV types other than the detected cervicovaginal HPV were higher in both serum and CVS in group I than group II. 
Finally, the majority of group I and group II women (68.6% and 68.7%, respectively) exhibited serum or cervicovaginal IgG to Gardasil-9® HR-HPV, with a higher mean number in group I than group II (6.1 Gardasil-9® HR-HPV per woman versus 1.4; P<0.01). One-third (31.2%) of group II women did not show any serum or genital HPV-specific IgG. Conclusions: Around two-thirds of first-generation African immigrant women living in France showed frequent ongoing genital HPV infection and high rates of circulating and genital IgG to α7/α9 HPV, generally cross-reacting, obviating the benefit of catch-up vaccination. Nevertheless, about one-third of women had no evidence of previous HPV infection, or showed only low levels of genital and circulating HR-HPV-specific IgG, and could therefore be eligible for catch-up vaccination. ]]> <![CDATA[Survival of glioblastoma treated with a moderately escalated radiation dose—Results of a retrospective analysis]]> https://www.researchpad.co/article/elastic_article_14700 Glioblastoma (GBM) has the highest fatality rate among primary malignant brain tumors and typically recurs locally, just adjacent to the original tumor site, following surgical resection and adjuvant radiotherapy. We conducted a study to compare survival outcomes between a standard radiation dose (≤ 60 Gy) and a moderately escalated dose (>60 Gy), and to identify prognostic factors for GBM. We retrospectively reviewed the medical records of primary GBM patients diagnosed between 2005 and 2016 in two referral hospitals in Taiwan. They were identified from the cancer registry database and followed up from the date of diagnosis to October 2018. Progression-free survival (PFS) and overall survival (OS) were compared between the two dose groups, and independent factors for survival were analyzed with a Cox proportional hazards model. We also confirmed the results using Cox regression with a least absolute shrinkage and selection operator (LASSO) approach. 
From our cancer registry database, 142 GBM patients were identified, and 84 of them met the inclusion criteria. Of the 84 patients, 52 (62%) were males. The radiation dose ranged from 50.0 Gy to 66.6 Gy, with similar treatment volumes across patients. Fifteen (18%) patients received an escalated dose boost >60.0 Gy. The escalated group had a longer median PFS (15.4 vs. 7.9 months, p = 0.01, log-rank test) and a longer median OS (33.8 vs. 12.5 months, p <0.001) than the reference group. In multivariate analysis, the escalated dose was identified as a significant predictor of good prognosis (PFS: hazard ratio [HR] = 0.48, 95% confidence interval [95%CI]: 0.23–0.98; OS: HR = 0.40, 95%CI: 0.21–0.78). Using the LASSO approach, we found that age > 70 (HR = 1.55), diagnosis after 2010 (HR = 1.42), and a larger radiation volume (≥ 250 ml; HR = 0.81) were predictors of PFS, while the escalated dose (HR = 0.47) and a larger radiation volume (HR = 0.76) were predictors of better OS. In conclusion, a moderate radiation dose escalation (> 60 Gy) was an independent predictor of longer OS in GBM patients. However, prospective studies including more patients with more information, such as molecular markers and completeness of resection, are needed to confirm our findings.
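The median PFS/OS comparison above rests on Kaplan–Meier survival estimates. A minimal, dependency-free sketch of the estimator and the median-survival readout follows; the follow-up times below are synthetic illustrations, not the study's patient records:

```python
# Minimal Kaplan-Meier estimator for right-censored survival data.
# times: follow-up in months; events: 1 = progression/death, 0 = censored.

def kaplan_meier(times, events):
    """Return [(time, survival_probability)] at each event time."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(1 for tt, e in pairs if tt == t and e == 1)
        if deaths:
            surv *= 1 - deaths / at_risk        # product-limit step
            curve.append((t, surv))
        same = sum(1 for tt, _ in pairs if tt == t)
        at_risk -= same                          # drop events and censored
        i += same
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

# Synthetic example (8 patients, mixed events and censoring)
times  = [4, 6, 8, 10, 15, 20, 25, 30]
events = [1, 1, 1, 0,  1,  1,  0,  1]
curve = kaplan_meier(times, events)
print(median_survival(curve))  # -> 15
```

In practice the study's group comparison would additionally use a log-rank test and a Cox model, for which established libraries are the appropriate tools.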

]]>
<![CDATA[Kala-azar elimination in a highly-endemic district of Bihar, India: A success story]]> https://www.researchpad.co/article/elastic_article_14634 The World Health Organization (WHO) has set a target to eliminate visceral leishmaniasis (VL), commonly known as “Kala-azar,” as a public health problem in India by 2020. The elimination target is defined as achieving less than 1 case per 10,000 people at the block level. Although India has made substantial progress in the elimination of the disease since 2012, VL remains a persistent public health problem in four states, including Bihar. Bihar contributes >61% of the total Indian cases annually, and a few districts of the state have reported more than 600 cases per year. In this study, the results indicate that an intensive integrated VL control strategy including epidemiological analysis based on a geographical information system (GIS), hot-spot mapping, active case detection, vector control using the indoor residual spraying (IRS) of chemical insecticides, awareness campaigns, human resource development, close monitoring of control activities, and active epidemiological surveillance and entomological monitoring can achieve the elimination target in a highly endemic region of Bihar. The elimination of VL from highly endemic zones is urgently required to control any new outbreak. Therefore, the implementation of the Vaishali VL control strategy is strongly recommended in all highly endemic districts of Bihar, India.

]]>
<![CDATA[The psychological distress and coping styles in the early stages of the 2019 coronavirus disease (COVID-19) epidemic in the general mainland Chinese population: A web-based survey]]> https://www.researchpad.co/article/elastic_article_14631 During the outbreak of the 2019 coronavirus disease (COVID-19), the general population may experience psychological distress, and evidence suggests that negative coping styles may be related to subsequent mental illness. We therefore investigated the general population's psychological distress and coping styles in the early stages of the COVID-19 outbreak. A cross-sectional battery of surveys was conducted from February 1–4, 2020. The Kessler 6 psychological distress scale, the simplified coping style questionnaire and a general information questionnaire were administered online to a convenience sample of 1,599 respondents in China. A multiple linear regression analysis was performed to identify factors influencing psychological distress. Psychological distress differed significantly by age, marital status, epidemic contact characteristics, concern with media reports, and perceived impacts of the epidemic outbreak (all p <0.001), but not by gender (p = 0.316). Respondents who were younger (F = 102.04), unmarried (t = 15.28), had visited Wuhan in the past month (t = -40.86), had experienced epidemics in their community (t = -10.25), were more concerned with media reports (F = 21.84), perceived greater impacts of the epidemic outbreak (changes in living situation, F = 331.71; emotional control, F = 1863.07; epidemic-related dreams, F = 1642.78) or used a negative coping style (t = 37.41) had higher levels of psychological distress. Multivariate analysis found that marital status, epidemic contact characteristics, perceived impacts of the epidemic and coping style influenced psychological distress (all p <0.001). The COVID-19 epidemic caused high levels of psychological distress. 
In the early stages of the COVID-19 epidemic, mainland Chinese respondents who were unmarried, had visited Wuhan in the past month, perceived greater impacts of the epidemic or used a negative coping style experienced higher levels of psychological distress. Psychological interventions should be implemented early, especially for people with these characteristics.

]]>
<![CDATA[Preliminary study: Health and performance assessment in broiler chicks following application of six different hatching egg disinfection protocols]]> https://www.researchpad.co/article/elastic_article_14617 As part of a Germany-wide project evaluating strategies for the reduction of multi-resistant bacteria along the poultry production chain, the impact of different hatching egg disinfectants on the hatchability and health of broiler chicks was evaluated. Animal trials were conducted with hatching eggs contaminated with extended-spectrum beta-lactamase- (ESBL) producing Escherichia (E.) coli and six disinfection protocols that used formaldehyde, hydrogen peroxide, low-energy electron irradiation, peracetic acid and an essential oil preparation. Each protocol was tested on a group of 50 chicks. Equally sized positive and negative control groups were carried along for each trial. Hatchability, mortality and body weight were recorded as performance parameters. During necropsy of half of the animals in each group on days 7 and 14, respectively, macroscopic abnormalities, body weight, and weights of the liver and intestinal tract were recorded, and a range of tissue samples for histological examination was collected as part of the health assessment. A decrease in hatchability was recorded for spray application of essential oils. Body weight development was overall comparable, and in several groups even superior, to the Ross308 performance objectives, but reduced performance was seen in the hydrogen peroxide group. Histologically, lymphoid follicles were regularly seen in all sampled organs, and no consistent differences were observed between contaminated and non-contaminated groups. Statistically significant differences were seen only infrequently and inconsistently. In conclusion, the notable findings were a decrease in hatchability caused by the essential oils spray application and reduced body weight development in the hydrogen peroxide group. 
Therefore, the essential oils preparation as spray application was deemed inappropriate in practice, while the application of hydrogen peroxide was considered in need of further research. The other trial results indicate that the tested hatching egg disinfectants present a possible alternative to formaldehyde.

]]>
<![CDATA[Herd immunity and a vaccination game: An experimental study]]> https://www.researchpad.co/article/elastic_article_14598 Would the affected communities voluntarily obtain herd immunity if a cure for COVID-19 was available? This paper experimentally investigates people’s vaccination choices in the context of a nonlinear public good game. A “vaccination game” is defined in which costly commitments (vaccination) are required of a fraction of the population to reach the critical level needed for herd immunity, without which defectors are punished by the natural contagion of epidemics. Our experimental implementation of a vaccination game in a controlled laboratory setting reveals that endogenous epidemic punishment is a credible threat, resulting in voluntary vaccination to obtain herd immunity, for which the orthodox principle of positive externalities fails to account. The concave nature of the infection probability plays a key role in facilitating the elimination of an epidemic.
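The critical vaccination level and the concave infection probability described above can be sketched numerically. The sketch below uses the standard SIR herd-immunity threshold p_c = 1 - 1/R0 and the SIR final-size relation z = 1 - exp(-R_eff·z) as a proxy for the defectors' epidemic risk; the R0 values are illustrative assumptions, not parameters from the experiment:

```python
# Herd-immunity threshold and SIR final-size relation (illustrative sketch).
import math

def critical_coverage(r0: float) -> float:
    """Fraction of the population that must vaccinate to reach herd immunity."""
    return max(0.0, 1 - 1 / r0)

def final_size(r_eff: float, tol: float = 1e-12) -> float:
    """Attack rate among the unprotected: fixed point of z = 1 - exp(-r_eff*z)."""
    z = 0.9  # start from a large initial guess and iterate to the fixed point
    for _ in range(10_000):
        z_new = 1 - math.exp(-r_eff * z)
        if abs(z_new - z) < tol:
            break
        z = z_new
    return z

r0 = 2.5
p_c = critical_coverage(r0)  # 60% coverage needed for herd immunity at R0 = 2.5
# Below the threshold, residual risk punishes free-riders; above it, risk vanishes.
for coverage in (0.0, 0.3, 0.6):
    print(coverage, round(final_size(r0 * (1 - coverage)), 3))
```

The concavity the authors highlight shows up here: the attack rate falls slowly at low coverage and collapses to zero as coverage approaches p_c.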

]]>
<![CDATA[Novel malaria antigen <i>Plasmodium yoelii</i> E140 induces antibody-mediated sterile protection in mice against malaria challenge]]> https://www.researchpad.co/article/elastic_article_14592 Only a small fraction of the antigens expressed by malaria parasites have been evaluated as vaccine candidates. A successful malaria subunit vaccine will likely require multiple antigenic targets to achieve broad protection with high protective efficacy. Here we describe protective efficacy of a novel antigen, Plasmodium yoelii (Py) E140 (PyE140), evaluated against P. yoelii challenge of mice. Vaccines targeting PyE140 reproducibly induced up to 100% sterile protection in both inbred and outbred murine challenge models. Although PyE140 immunization induced high frequency and multifunctional CD8+ T cell responses, as well as CD4+ T cell responses, protection was mediated by PyE140 antibodies acting against blood stage parasites. Protection in mice was long-lasting with up to 100% sterile protection at twelve weeks post-immunization and durable high titer anti-PyE140 antibodies. The E140 antigen is expressed in all Plasmodium species, is highly conserved in both P. falciparum lab-adapted strains and endemic circulating parasites, and is thus a promising lead vaccine candidate for future evaluation against human malaria parasite species.

]]>
<![CDATA[Inference on dengue epidemics with Bayesian regime switching models]]> https://www.researchpad.co/article/elastic_article_14505 Dengue, a mosquito-borne infectious disease caused by the dengue viruses, is present in many parts of the tropical and subtropical regions of the world. All four serotypes of dengue viruses are endemic in Singapore, an equatorial city-state. Frequent outbreaks occur, sometimes leading to national epidemics. However, few studies have attempted to characterize the breakpoints which precede large rises in dengue case counts. In this paper, Bayesian regime switching (BRS) models were employed to infer epidemic and endemic regimes of dengue transmission, each containing regime-specific processes which drive the growth and decline of dengue cases, estimated using a custom-built multi-move Gibbs sampling algorithm. Assessments against various baselines showed that BRS performs better in characterizing dengue transmission. The dengue regimes estimated by BRS are characterized by their persistent nature. Climate analysis showed no short- or long-term associations between the classified regimes and climate. Lastly, by fitting BRS to simulated disease data generated from a mechanistic model, we showed links between disease infectivity and the regimes classified using BRS. The proposed model could be applied to other localities and diseases under minimal data requirements, wherever transmission counts over time are collected.
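The generative idea behind regime switching can be illustrated with a toy forward simulation: case counts drift under one of two regimes, and a two-state Markov chain governs the switches. All parameters below are illustrative assumptions; the paper's actual BRS model is fitted with a custom multi-move Gibbs sampler, which is not reproduced here:

```python
# Toy two-regime Markov-switching simulation of weekly dengue-like counts.
import random

random.seed(42)

GROWTH = {"endemic": 0.98, "epidemic": 1.10}   # per-week multiplicative drift
STAY   = {"endemic": 0.95, "epidemic": 0.90}   # probability of remaining in regime

def simulate(weeks=104, start=50.0):
    """Return [(regime, count)] pairs; persistence comes from high STAY values."""
    regime, level = "endemic", start
    series = []
    for _ in range(weeks):
        if random.random() > STAY[regime]:     # Markov switch between regimes
            regime = "epidemic" if regime == "endemic" else "endemic"
        level = max(1.0, level * GROWTH[regime])
        count = random.expovariate(1.0 / level)  # noisy observation (hedged choice)
        series.append((regime, round(count)))
    return series

print(simulate()[:5])
```

Inference inverts this: given only the counts, BRS recovers the hidden regime sequence and the regime-specific growth parameters.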

]]>
<![CDATA[Evidence of recombination of vaccine strains of lumpy skin disease virus with field strains, causing disease]]> https://www.researchpad.co/article/elastic_article_14489 Vaccination against lumpy skin disease (LSD) is crucial for maintaining the health of animals and the economic sustainability of farming. Either homologous vaccines consisting of live attenuated LSD virus (LSDV) or heterologous vaccines consisting of live attenuated sheeppox or goatpox virus (SPPV/GTPV) can be used for control of LSDV. Although SPPV/GTPV-based vaccines exhibit slightly lower efficacy than live attenuated LSDV vaccines, they do not cause the vaccine-induced viremia, fever, and clinical symptoms of the disease that can follow vaccination with replication-competent live attenuated LSDVs. Recombination of capripoxviruses in the field was a long-standing hypothesis until a naturally occurring recombinant vaccine-like LSDV isolate was detected in Russia, where the sheeppox vaccine alone is used. This occurred after the initiation of vaccination campaigns using LSDV vaccines in the neighboring countries in 2017, when the first cases of presumed vaccine-like isolate circulation were documented with concurrent detection of a recombinant vaccine isolate in the field. The follow-up findings presented herein show that during the period from 2015 to 2018, the molecular epidemiology of LSDV in Russia split into two independent waves. The 2015–2016 epidemic was attributable to the field isolate, whereas the 2017 and, in particular, the 2018 epidemics represented novel disease importations that were not genetically linked to the 2015–2016 field-type incursions, demonstrating a new emergence rather than a continuation of the field-type epidemic. Since recombinant vaccine-like LSDV isolates appear to have become established across the country's border, the policy of using certain live vaccines requires revision in the context of the biosafety threat they present.

]]>
<![CDATA[Low response in eliciting neuraminidase inhibition activity of sera among recipients of a split, monovalent pandemic influenza vaccine during the 2009 pandemic]]> https://www.researchpad.co/article/elastic_article_14470 Antibodies against the influenza virus neuraminidase (NA) protein prevent release of the virus from host cells and the spread of infection foci, and are considered the ‘second line of defence’ against influenza. Haemagglutinin inhibition antibody-low responders (HI-LRs) are present among influenza split vaccine recipients. The NA inhibition (NAI) antibody response in vaccinees is worth exploring, especially in the HI-LR population. We collected pre- and post-vaccination sera from 61 recipients of an inactivated, monovalent, split vaccine against A/H1N1pdm09 and acute and convalescent sera from 49 unvaccinated patients naturally infected with the A/H1N1pdm09 virus during the 2009 influenza pandemic. All samples were subjected to haemagglutinin inhibition (HI), NAI and neutralisation assays. Most paired sera from naturally infected patients exhibited marked elevation in NAI activity, with seroconversion rates (SCRs) among HI-LRs and HI-responders (HI-Rs) of 60% and 87%, respectively; however, sera from vaccinees displayed little increase in NAI activity, with SCRs among HI-LRs and HI-Rs of 0% and 12%, respectively. In both HI-LRs and HI-Rs, vaccination with the inactivated, monovalent, split vaccine failed to efficiently elicit NAI activity in the sera of the naive population, compared with natural infection. Hence, improvement of influenza vaccines is warranted to elicit not only HI but also NAI antibodies.

]]>
<![CDATA[Not sick enough to worry? "Influenza-like" symptoms and work-related behavior among healthcare workers and other professionals: Results of a global survey]]> https://www.researchpad.co/article/elastic_article_13852 Healthcare workers (HCWs) and non-HCWs may contribute to the transmission of influenza-like illness (ILI) to colleagues and susceptible patients by working while sick (presenteeism). The present study aimed to explore the views and behavior of HCWs and non-HCWs towards the phenomenon of working while experiencing ILI. Methods: The study was a cross-sectional online survey conducted between October 2018 and January 2019 to explore sickness presenteeism and the behaviour of HCWs and non-HCWs when experiencing ILI. The survey questionnaire was distributed to the members and international networks of the International Society of Antimicrobial Chemotherapy (ISAC) Infection Prevention and Control (IPC) Working Group, as well as via social media platforms, including LinkedIn, Twitter and the IPC Blog. Results: In total, 533 respondents from 49 countries participated (Europe 69.2%, Asia-Pacific 19.1%, the Americas 10.9%, and Africa 0.8%), representing 249 HCWs (46.7%) and 284 non-HCWs (53.2%). Overall, 312 (58.5%; 95% confidence interval [CI], 56.2–64.6) would continue to work when sick with ILI, with no variation between the two categories. Sixty-seven (26.9%) HCWs and forty-six (16.2%) non-HCWs would work with fever alone (p < 0.01). Most HCWs (89.2–99.2%) and non-HCWs (80.0–96.5%) would work with "minor" ILI symptoms, such as sore throat, sinus cold, fatigue, sneezing, runny nose, mild cough and reduced appetite. Conclusion: A future strategy to successfully prevent the transmission of ILI in healthcare settings should address sick-leave policy management, in addition to encouraging the uptake of influenza vaccine. 
]]> <![CDATA[A systematic review of alternative surveillance approaches for lymphatic filariasis in low prevalence settings: Implications for post-validation settings]]> https://www.researchpad.co/article/elastic_article_13802 Lymphatic filariasis (LF) is a mosquito-borne disease, which can result in complications including swelling affecting the limbs (lymphoedema) or scrotum (hydrocele). LF can be eliminated by mass drug administration (MDA) which involves whole communities taking drug treatment at regular intervals. After MDA programmes, country programmes conduct the Transmission Assessment Survey (TAS), which tests school children for LF. It is important to continue testing for LF after elimination because there can be a 10-year period between becoming infected and developing symptoms, but it is thought that the use of TAS in such settings is likely to be too expensive and also not sensitive enough to detect low-level infections. Our study assesses the results from 44 studies in areas of low LF prevalence that have investigated methods of surveillance for LF which differ from the standardised TAS approach. These include both human and mosquito studies. Results show that there is currently no standardised approach to testing, but that surveillance can be made more sensitive through the use of new diagnostic tests, such as antibody testing, and also by targeting higher risk populations. However, further research is needed to understand whether these approaches work in a range of settings and whether they are affordable on the ground.

]]>
<![CDATA[A biological control model to manage the vector and the infection of <i>Xylella fastidiosa</i> on olive trees]]> https://www.researchpad.co/article/elastic_article_11237 Xylella fastidiosa pauca ST53 is the bacterium responsible for the Olive Quick Decline Syndrome that has killed millions of olive trees in Southern Italy. Recent work demonstrated that a rational integration of vector and transmission control measures, into a strategy based on chemical and physical control means, can keep Xylella fastidiosa invasion and impact below an acceptable economic threshold. In the present study, we propose a biological alternative to the chemical control action, which involves the predetermined use of an available natural enemy of Philaenus spumarius, i.e., Zelus renardii, for adult vector population and infection biocontrol. The paper combines two different approaches: a laboratory experiment to test the predation dynamics of Zelus renardii on Philaenus spumarius and its suitability as a candidate for an inundation strategy; and a simulated inundation experiment, to preliminarily test the efficacy of such a strategy before eventually proceeding to in-field experimentation. With this twofold approach we show that an inundation strategy with Zelus renardii has the potential to furnish an efficient and “green” solution to Xylella fastidiosa invasion, with a reduction of pathogen incidence below 10%. The biocontrol model presented here could be promising for containing the impact and spread of Xylella fastidiosa, after in-field validation of the inundation technique. Saving the orchards, production and industry in susceptible areas could thus become an attainable goal, within comfortable parameters for sustainability, environmental safety, and effective plant health protection in organic orchard management.

]]>
<![CDATA[Identification of cholera hotspots in Zambia: A spatiotemporal analysis of cholera data from 2008 to 2017]]> https://www.researchpad.co/article/Nb4ea4681-5c5d-42bd-a1ce-642b56a34f03

The global burden of cholera is increasing, with the majority (60%) of cases occurring in sub-Saharan Africa. In Zambia, widespread cholera outbreaks have occurred since 1977, predominantly in the capital city of Lusaka. During both the 2016 and 2018 outbreaks, the Ministry of Health implemented cholera vaccination, in addition to other preventative and control measures, to stop the spread and control the outbreaks. Given the limitations in vaccine availability and the logistical support required for vaccination, oral cholera vaccine (OCV) is now recommended for use in high-risk areas (“hotspots”) for cholera. Hence, the aim of this study was to identify areas with an increased risk of cholera in Zambia. Retrospective cholera case data from 2008 to 2017 were obtained from the Ministry of Health, Department of Public Health and Disease Surveillance. The Zambian Central Statistical Office provided district-level population data and socioeconomic and water, sanitation and hygiene (WaSH) indicators. To identify districts at high risk, we performed a discrete Poisson-based space-time scan statistic to account for variations in cholera risk across both space and time over the 10-year study period. A zero-inflated negative binomial regression model was employed to identify district-level risk factors for cholera. The risk map was generated by classifying the relative risk of cholera in each district, as obtained from the space-time scan statistic. In total, 34,950 cases of cholera were reported in Zambia between 2008 and 2017. Cholera cases varied spatially by year. During the study period, Lusaka District had the highest burden of cholera, with 29,080 reported cases. The space-time scan statistic identified 16 districts at significantly higher risk of cholera, with relative risks ranging from 1.25 to 78.87 compared to elsewhere in the country. 
Proximity to waterbodies was the only factor associated with increased cholera risk (P<0.05). This study provides a basis for the cholera elimination program in Zambia. Outside Lusaka, the majority of the high-risk districts identified were near the borders with the DRC, Tanzania, Mozambique, and Zimbabwe, suggesting that cholera in Zambia may be linked to the movement of people from neighboring areas of cholera endemicity. A collaborative intervention program implemented in concert with neighboring countries could be an effective strategy for the elimination of cholera in Zambia, while also reducing rates at the regional level.
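The relative risks reported by a scan statistic have a simple closed form: observed over expected inside the candidate cluster, divided by observed over expected outside it. A minimal sketch follows; the numbers are made up for illustration and are not Zambia's surveillance data:

```python
# Relative risk of a candidate cluster, as reported by Poisson scan statistics:
# RR = (c / e) / ((C - c) / (C - e)), where c = observed cases inside the
# cluster, e = expected cases inside under the null, C = total cases.

def cluster_relative_risk(c: float, e: float, total: float) -> float:
    """Observed/expected inside the cluster relative to outside it."""
    inside = c / e
    outside = (total - c) / (total - e)
    return inside / outside

# Illustrative district: 900 observed cases where 300 were expected,
# out of a national total of 35,000 cases.
print(round(cluster_relative_risk(900, 300, 35_000), 2))  # -> 3.05
```

The scan procedure itself additionally maximizes a likelihood ratio over many candidate space-time cylinders and assesses significance by Monte Carlo replication, which dedicated software handles.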

]]>
<![CDATA[Feasibility of establishing an HIV vaccine preparedness cohort in a population of the Uganda Police Force: Lessons learnt from a prospective study]]> https://www.researchpad.co/article/Ne890bb8a-5661-4c39-82f7-6f40a2e69675

Background

Members of uniformed armed forces are considered to be at high risk for HIV infection and have been proposed as suitable candidates for participation in HIV intervention studies. We report on the feasibility of recruitment and follow up of individuals from the community of the Uganda Police Force (UPF) for an HIV vaccine preparedness study.

Methods

HIV-negative volunteers aged 18–49 years were identified from UPF facilities situated in Kampala and Wakiso districts through community HIV counselling and testing. Potential volunteers were referred to the study clinic for screening, enrolment and quarterly visits for one year. HIV incidence and retention rates were estimated, with incidence expressed as cases per 100 person-years of observation (PYO). Rate ratios from Poisson regression models were used to determine factors associated with retention.

Results

We screened 560 volunteers and enrolled 500 between November 2015 and May 2016. One HIV seroconversion occurred over 431 PYO, for an incidence rate of 0.23/100 PYO (95% confidence interval [CI]: 0.03–1.64). Overall, the retention rate was 87% at one year. Retention was independently associated with duration of residence (compared to <1 year: 1 to 5 years, adjusted rate ratio [aRR] = 1.19, 95%CI: 1.00–1.44; >5 years, aRR = 1.34, 95%CI: 0.95–1.37); absence of genital discharge in the last 3 months (aRR = 1.97, 95%CI: 1.38–2.83); absence of genital ulcers (aRR = 1.90, 95%CI: 1.26–2.87); reporting a new sexual partner in the last month (aRR = 0.57, 95%CI: 0.45–0.71); being away from home for more than two nights (aRR = 1.27, 95%CI: 1.04–1.56, compared to those who had not travelled); and absence of knowledge of HIV prevention (aRR = 2.67, 95%CI: 1.62–4.39).
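The incidence figure above is a straightforward person-time calculation, sketched here (the confidence interval requires an exact Poisson method and is not reproduced):

```python
# Incidence expressed as cases per 100 person-years of observation (PYO),
# as in the cohort above: 1 seroconversion over 431 PYO.

def incidence_per_100_pyo(cases: int, pyo: float) -> float:
    """Events divided by accumulated person-time, scaled to 100 person-years."""
    return cases / pyo * 100

print(round(incidence_per_100_pyo(1, 431), 2))  # -> 0.23, matching the report
```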

Conclusions

While our study demonstrates the feasibility of recruiting and retaining individuals from the UPF for HIV research, we observed lower-than-anticipated HIV incidence, perhaps because individuals at lower risk of HIV infection were the first to come forward, or because participants adopted HIV risk-reduction measures. Our findings offer lessons for the recruitment of populations at high risk of HIV infection.

]]>
<![CDATA[Optimization of tissue sampling for Borrelia burgdorferi in white-footed mice (Peromyscus leucopus)]]> https://www.researchpad.co/article/Nff220985-8630-4822-8507-6b577103a931

Peromyscus leucopus (the white-footed mouse) is a known reservoir of the Lyme disease spirochete Borrelia burgdorferi. Sampling of white-footed mice allows for year-round B. burgdorferi surveillance as well as opportunities to establish the diversity of the different variants in a geographic region. This study explores the prevalence of B. burgdorferi infections in the tissues of white-footed mice, investigates the correlations between B. burgdorferi infected tissues, and determines the optimum field methods for surveillance of B. burgdorferi in P. leucopus. A total of 90 mice and 573 tissues (spleen, liver, ear, tongue, tail, heart, and kidney) were screened via nested PCR for B. burgdorferi infections. A large number of infections were found in the 90 mice, as well as multiple infections within individual mice. Infection in a single mouse tissue (spleen, liver, ear, tongue or tail) was predictive of concurrent infection in other tissues of the same mouse at a statistically significant level. Ear tissue accounted for 68.4% of detected infections, which increased to 78.9% of the infected mice with the inclusion of tail samples. The use of ear punch or tail snip samples (individually or in tandem) has multiple advantages over current Lyme disease ecological study and surveillance methodologies, including lower associated costs, minimization of delays, year-round B. burgdorferi testing opportunities, and longitudinal monitoring of B. burgdorferi in defined geographic regions. In the absence of an effective vaccine, personal prevention measures are currently the most effective way to reduce Lyme disease transmission to humans. Thus, the identification and monitoring of environmental reservoirs to inform at-risk populations remains a priority. The sampling methods proposed in this study provide a reasonable estimate of B. burgdorferi prevalence in white-footed mice in a timely and cost-effective manner.

]]>
<![CDATA[Lean back and wait for the alarm? Testing an automated alarm system for nosocomial outbreaks to provide support for infection control professionals]]> https://www.researchpad.co/article/N4571fdc0-2a2e-4467-acc9-eeadc2652757

Introduction

Outbreaks of communicable diseases in hospitals need to be quickly detected in order to enable immediate control. The increasing digitalization of hospital data processing offers potential solutions for automated outbreak detection systems (AODS). Our goal was to assess a newly developed AODS.

Methods

Our AODS was based on the diagnostic results of routine clinical microbiological examinations. The system prospectively counted detections per bacterial pathogen over time for the years 2016 and 2017; the baseline covered the years 2013–2015. The comparative analysis was based on six different mathematical algorithms (normal/Poisson and score prediction intervals, the early aberration reporting system, negative binomial CUSUMs, and the Farrington algorithm). The automatically detected clusters were then compared with the results of our manual outbreak detection system.
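As a rough illustration of the general idea behind such algorithms, the sketch below implements a simple Poisson prediction-interval alarm on weekly isolate counts. It is a simplified stand-in for the six algorithms compared in the study, not their implementation, and the baseline counts are invented:

```python
from scipy.stats import poisson

def poisson_alarm(baseline_counts, current_count, alpha=0.01):
    """Flag a count as aberrant if it exceeds the upper limit of a Poisson
    prediction interval fitted to historical baseline counts."""
    mean = sum(baseline_counts) / len(baseline_counts)
    threshold = poisson.ppf(1 - alpha, mean)
    return bool(current_count > threshold)

# Invented weekly isolate counts for one pathogen during the baseline period
baseline = [1, 0, 2, 1, 0, 1, 2, 1]
print(poisson_alarm(baseline, 6))  # → True: a sudden cluster of 6 trips the alarm
print(poisson_alarm(baseline, 2))  # → False: within the expected range
```

Sporadic pathogens suit this approach well because their baseline mean is near zero, so even small clusters exceed the threshold; for endemic pathogens the threshold sits higher and, as the study notes, pathogen- and setting-specific tuning is needed.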

Results

During the analysis period, 14 different hospital outbreaks were detected by conventional manual outbreak detection. Based on the pathogens’ overall incidence, outbreaks were divided into two categories: outbreaks with rarely detected pathogens (sporadic) and outbreaks with frequently detected pathogens (endemic). For outbreaks with sporadic pathogens, the detection rate of our AODS ranged from 83% to 100%; every algorithm detected 6 of 7 such outbreaks. For outbreaks with endemic pathogens, the AODS achieved detection rates of 33% to 100%, with results varying according to the epidemiological characteristics of each outbreak and pathogen.

Conclusion

AODS for hospitals based on routine microbiological data is feasible and can provide relevant benefits for infection control teams. It offers in-time automated notification of suspected pathogen clusters especially for sporadically occurring pathogens. However, outbreaks of endemically detected pathogens need further individual pathogen-specific and setting-specific adjustments.

]]>
<![CDATA[A mathematical model for assessing the effectiveness of controlling relapse in Plasmodium vivax malaria endemic in the Republic of Korea]]> https://www.researchpad.co/article/Nf3d8dda1-10e2-4286-9776-07d534017a03

Malaria has remained endemic near the Demilitarized Zone in the Republic of Korea since the re-emergence of Plasmodium vivax malaria in 1993. The number of malaria patients has increased recently despite many control tools, one reason being relapse via dormant liver-stage hypnozoites. Tafenoquine, a new drug approved by the United States Food and Drug Administration in 2018, is expected to reduce the rate of relapse from hypnozoites and thereby decrease the prevalence of malaria in the population. In this work, we developed a new transmission model for Plasmodium vivax that takes into account a more realistic intrinsic distribution from the existing literature, in order to quantify current values of the relapse parameters and to evaluate the effectiveness of anti-relapse therapy. The model is especially suitable for estimating parameters near the Demilitarized Zone in Korea, where the disease follows a distinct seasonality. Results show that radical cure could significantly reduce the prevalence of malaria. However, eradication would still take a long time (over 10 years) even if high-level treatment were to persist. In addition, considering that the vector’s behavior is manipulated by the malaria parasite, relapse repression through vector control at the current level may have a negative effect on containing the disease. We conclude that effective drugs should be used together with an increased level of vector control to reduce malaria prevalence.
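The abstract does not give the model’s equations. As a generic illustration of how a hypnozoite (relapse) compartment and a radical-cure term enter such dynamics, here is a toy S-I-L system with invented parameter values; it is not the paper’s model:

```python
import numpy as np
from scipy.integrate import odeint

def sil_model(y, t, beta, gamma, p, r, cure):
    """Toy S-I-L system: S susceptible, I blood-stage infectious, L dormant
    liver-stage (hypnozoite) carriers. All parameter values are invented."""
    S, I, L = y
    dS = -beta * S * I + (1 - p) * gamma * I + cure * L
    dI = beta * S * I - gamma * I + r * L   # r * L: relapse back into blood stage
    dL = p * gamma * I - (r + cure) * L     # cure: radical-cure clearance of L
    return [dS, dI, dL]

def infected(sol):
    """Blood- plus liver-stage prevalence at the end of the run."""
    return sol[-1, 1] + sol[-1, 2]

t = np.linspace(0, 2000, 2001)              # run long enough to equilibrate
y0 = [0.99, 0.01, 0.0]
no_cure = odeint(sil_model, y0, t, args=(0.3, 0.1, 0.5, 0.01, 0.0))
radical = odeint(sil_model, y0, t, args=(0.3, 0.1, 0.5, 0.01, 0.05))
print(bool(infected(radical) < infected(no_cure)))  # → True
```

Even in this crude sketch, adding a radical-cure rate that clears the hypnozoite reservoir lowers the long-run combined (blood- plus liver-stage) prevalence, consistent with the qualitative conclusion above.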

]]>
<![CDATA[Diversity of A(H5N1) clade 2.3.2.1c avian influenza viruses with evidence of reassortment in Cambodia, 2014-2016]]> https://www.researchpad.co/article/Nf51e501a-7d3b-493b-bbd9-bf4dbc27c932

In Cambodia, highly pathogenic avian influenza A(H5N1) subtype viruses circulate endemically, causing poultry outbreaks and zoonotic human cases. To investigate the genomic diversity and development of endemicity of the predominantly circulating clade 2.3.2.1c A(H5N1) viruses, we characterised 68 AIVs detected in poultry, the environment and a single human A(H5N1) case from January 2014 to December 2016. Full genomes were generated for 42 A(H5N1) viruses. Phylogenetic analysis shows that five clade 2.3.2.1c genotypes, designated KH1 to KH5, were circulating in Cambodia during this period. The genotypes arose through multiple reassortment events, with the neuraminidase (NA) and internal genes belonging to H5N1 clade 2.3.2.1a, clade 2.3.2.1b or A(H9N2) lineages. Phylogenies suggest that the Cambodian AIVs were derived from viruses circulating between Cambodian and Vietnamese poultry. Molecular analyses show that these viruses contained the hemagglutinin (HA) gene substitutions D94N, S133A, S155N, T156A, T188I and K189R, known to increase binding to human-type α2,6-linked sialic acid receptors. Two A(H5N1) viruses displayed the M2 gene S31N or A30T substitutions indicative of adamantane resistance; however, susceptibility testing of a subset of thirty clade 2.3.2.1c viruses against the neuraminidase inhibitors oseltamivir, zanamivir, laninamivir and peramivir showed susceptibility to all four drugs. This study shows that A(H5N1) viruses continue to reassort with other A(H5N1) and A(H9N2) viruses that are endemic in the region, highlighting the risk of introduction and emergence of novel A(H5N1) genotypes in Cambodia.

]]>