ResearchPad - signs-and-symptoms https://www.researchpad.co Default RSS Feed en-us © 2020 Newgen KnowledgeWorks <![CDATA[Betanin purification from red beetroots and evaluation of its anti-oxidant and anti-inflammatory activity on LPS-activated microglial cells]]> https://www.researchpad.co/article/elastic_article_13861 Activated microglia can release free radicals and various pro-inflammatory cytokines, which are implicated in the progression of neurodegenerative disease. Therefore, suppression of microglial activation can be an appropriate strategy for combating neurodegenerative diseases. Betanin is a red food dye that acts as a free radical scavenger and can be a promising candidate for this purpose. In this study, purification of betanin from red beetroots was carried out by normal-phase column chromatography, yielding 500 mg of betanin from 100 g of red beetroot. The purified betanin was characterized by TLC, UV-visible spectroscopy, HPLC, ESI-MS and FT-IR spectroscopy. The inhibitory effect of betanin on activated microglia was investigated using primary microglial cultures. The results showed that betanin significantly inhibited lipopolysaccharide-induced microglial functions, including the production of nitric oxide free radicals, reactive oxygen species, tumor necrosis factor-alpha (TNF-α), interleukin-6 (IL-6) and interleukin-1 beta (IL-1β). Moreover, betanin modulated mitochondrial membrane potential, lysosomal membrane permeabilization and adenosine triphosphate levels. We further investigated the interaction of betanin with TNF-α, IL-6 and inducible nitric oxide synthase (iNOS or NOS2) using in silico molecular docking analysis. The docking results demonstrated that betanin has significant negative binding energies against the active sites of TNF-α, IL-6 and iNOS.

]]>
<![CDATA[A prospective study of bloodstream infections among febrile adolescents and adults attending Yangon General Hospital, Yangon, Myanmar]]> https://www.researchpad.co/article/elastic_article_13833 Bloodstream infection (BSI) is common among persons seeking healthcare for severe febrile illness in low- and middle-income countries. Data on community-onset BSI are few for some countries in Asia, including Myanmar. Such data are needed to inform empiric antimicrobial treatment of patients and to monitor and control antimicrobial resistance. We performed a one-year, prospective study collecting information and blood cultures from patients presenting with fever at a tertiary referral hospital in Yangon, Myanmar. We found that almost 10% of participants had a bloodstream infection, and that Salmonella enterica serovars Typhi and Paratyphi A were the most common pathogens. Typhoidal Salmonella were universally resistant to ciprofloxacin. More than half of Escherichia coli and Klebsiella pneumoniae isolates were resistant to extended-spectrum cephalosporins, and resistance to carbapenems was also identified in some isolates. We show that typhoid and paratyphoid fever are common and that fluoroquinolone resistance is widespread. Extended-spectrum cephalosporin resistance is common in E. coli and K. pneumoniae, and carbapenem resistance is present. Our findings inform empiric antimicrobial management of severe febrile illness, underscore the value of routine use of blood cultures, indicate that measures to prevent and control enteric fever are warranted, and suggest a need to monitor and mitigate antimicrobial resistance among community-acquired pathogens.

]]>
<![CDATA[Retrospectively ECG-gated helical vs. non-ECG-synchronized high-pitch CTA of the aortic root for TAVI planning]]> https://www.researchpad.co/article/elastic_article_13825 Multidetector computed tomography (MDCT) plays a key role in patient assessment prior to transcatheter aortic valve implantation (TAVI). However, to date no consensus has been established on the optimal pre-procedural imaging protocol. Variability in pre-TAVI acquisition protocols may lead to discrepancies in aortic annulus measurements and may potentially influence prosthesis size selection.

Purpose

The current study evaluates the magnitude of differences in aortic annulus measurements using max-systolic, end-diastolic, and non-ECG-synchronized imaging, as well as the impact of method on prosthesis size selection.

Material and methods

Fifty consecutive TAVI candidates, who underwent retrospectively ECG-gated CT angiography (CTA) of the aortic root, directly followed by non-ECG-synchronized high-pitch CT of the entire aorta, were retrospectively included. Aortic root dimensions were assessed at each 10% increment of the R-R interval (0–100%) and on the non-ECG-synchronized scan. Dimensional changes within the cardiac cycle were evaluated using a one-way repeated-measures ANOVA. Agreement in measurements between max-systolic, end-diastolic and non-ECG-synchronized scans was assessed with Bland-Altman analysis.

Results

Maximal dimensions of the aortic root structures and minimum annulus-coronary ostia distances were measured during systole. Max-systolic measurements were significantly and substantially larger than end-diastolic (p<0.001) and non-ECG-synchronized measurements (p<0.001). Due to these discrepancies, the three methods resulted in the same prosthesis size selection in only 48–62% of patients.

Conclusions

The systematic differences between max-systolic, end-diastolic and non-ECG-synchronized measurements of relevant aortic annular dimensions are both statistically significant and clinically relevant. Imaging strategy impacts prosthesis size selection in nearly half of TAVI candidates. End-diastolic and non-ECG-synchronized imaging does not provide optimal information for prosthesis size selection. Systolic image acquisition is necessary for assessment of maximal annular dimensions and minimum annulus-coronary ostia distances. ]]> <![CDATA[Not sick enough to worry? "Influenza-like" symptoms and work-related behavior among healthcare workers and other professionals: Results of a global survey]]> https://www.researchpad.co/article/elastic_article_13852 Healthcare workers (HCWs) and non-HCWs may contribute to the transmission of influenza-like illness (ILI) to colleagues and susceptible patients by working while sick (presenteeism). The present study aimed to explore the views and behavior of HCWs and non-HCWs towards the phenomenon of working while experiencing ILI.

Methods

The study was a cross-sectional online survey conducted between October 2018 and January 2019 to explore sickness presenteeism and the behaviour of HCWs and non-HCWs when experiencing ILI.
The survey questionnaire was distributed to the members and international networks of the International Society of Antimicrobial Chemotherapy (ISAC) Infection Prevention and Control (IPC) Working Group, as well as via social media platforms, including LinkedIn, Twitter and the IPC Blog.

Results

In total, 533 respondents from 49 countries participated (Europe 69.2%, Asia-Pacific 19.1%, the Americas 10.9%, and Africa 0.8%), representing 249 HCWs (46.7%) and 284 non-HCWs (53.2%). Overall, 312 (58.5%; 95% confidence interval [CI], 56.2–64.6) would continue to work when sick with ILI, with no variation between the two categories. Sixty-seven (26.9%) HCWs and forty-six (16.2%) non-HCWs would work with fever alone (p<0.01). Most HCWs (89.2–99.2%) and non-HCWs (80%–96.5%) would work with “minor” ILI symptoms, such as sore throat, sinus cold, fatigue, sneezing, runny nose, mild cough and reduced appetite.

Conclusion

A future strategy to successfully prevent the transmission of ILI in healthcare settings should address sick-leave policy management, in addition to encouraging the uptake of influenza vaccine. ]]> <![CDATA[Is transjugular insertion of a temporary pacemaker a safe and effective approach?]]> https://www.researchpad.co/article/elastic_article_13814 Temporary pacemakers (TPMs) are usually inserted in an emergency situation. However, there are few reports available regarding which route of access is best or which approach is currently preferred in tertiary hospitals. This study aimed to compare procedure times, complication rates, and indications for temporary pacing between the transjugular and transfemoral approaches to TPM placement. We analyzed consecutive patients who underwent TPM placement. Indications, procedure times, and rates of complications, including localized infection, any bleeding, and pacing wire repositioning, were analyzed. A total of 732 patients (361 treated via the transjugular approach and 371 treated via the transfemoral approach) were included. Complete atrioventricular block was the most common indication for TPM placement in both groups, but sick sinus syndrome was especially common in the transjugular approach group. Procedure time was significantly shorter in the transjugular approach group (9.0 ± 8.0 minutes vs. 11.9 ± 9.7 minutes; P < 0.001). Overall complication rates were not significantly different between the two groups, and a longer duration of temporary pacing was a risk factor for repositioning. The risk of repositioning was significantly increased when temporary pacing was continued for more than 5 days in the transjugular approach group and more than 3 days in the transfemoral approach group. The transjugular approach should be considered if the TPM is required for more than 3 days.
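As a rough cross-check of the reported procedure-time difference, a two-sample t-test can be reconstructed from the summary statistics quoted above (group means, standard deviations and sizes). The sketch below is illustrative only and is not the authors' analysis; the choice of Welch's test is an assumption.

```python
# Hypothetical re-check of the procedure-time comparison from summary statistics.
# Figures are taken from the abstract; the test choice (Welch) is an assumption.
from scipy.stats import ttest_ind_from_stats

stat, p = ttest_ind_from_stats(
    mean1=9.0, std1=8.0, nobs1=361,    # transjugular group
    mean2=11.9, std2=9.7, nobs2=371,   # transfemoral group
    equal_var=False,                   # Welch's t-test (unequal variances)
)
print(f"t = {stat:.2f}, p = {p:.2e}")  # p is well below 0.001, consistent with the abstract
```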

]]>
<![CDATA[Early budget impact analysis on magnetic seed localization for non-palpable breast cancer surgery]]> https://www.researchpad.co/article/elastic_article_13866 Current localization techniques used in breast-conserving surgery for non-palpable tumors show several disadvantages. Magnetic Seed Localization (MSL) is an innovative localization technique aiming to overcome these disadvantages. This study evaluated the expected budget impact of adopting MSL compared to standard of care.

Methods

Standard of care with Wire-Guided Localization (WGL) and Radioactive Seed Localization (RSL) was compared with a future situation gradually adopting MSL alongside RSL or WGL, from a Dutch national perspective over 5 years (2017–2022). The intervention costs for WGL, RSL and MSL and the implementation costs for RSL and MSL were evaluated using activity-based costing in eight Dutch hospitals. Based on available list prices, the price of the magnetic seed was varied from €100 to €500.

Results

The intervention costs for WGL, RSL and MSL were €2,617, €2,834 and €2,662 per patient, respectively, and the implementation costs were €2,974 and €26,826 for MSL and RSL, respectively. For standard of care, the budget impact increased from €14.7m to €16.9m. Inclusion of MSL with a seed price of €100 showed a budget impact of €16.7m. Above a price of €178, the budget impact increased with adoption of MSL, rising to €17.6m when the seed was priced at €500.

Conclusion

MSL could be a cost-efficient localization technique for resecting non-palpable tumors in the Netherlands. The online calculation model can inform adoption decisions internationally. When determining the retail price of the magnetic seed, cost-effectiveness should be considered. ]]> <![CDATA[Is postural dysfunction related to sarcopenia? A population-based study]]> https://www.researchpad.co/article/elastic_article_7695 Postural dysfunction is one of the most common community health symptoms and a frequent chief complaint in hospitals. Sarcopenia is a syndrome characterized by degenerative loss of skeletal muscle mass, muscle quality, and muscle strength, and is the main contributor to musculoskeletal impairment in the elderly. Previous studies reported that loss of muscle mass is associated with a loss of diverse functional abilities. Meanwhile, there have been limited studies concerning postural dysfunction among older adults with sarcopenia. Although sarcopenia is primarily a disease of the elderly, its development may be associated with conditions that are not exclusively seen in older persons. Also, recent studies recognize that sarcopenia may begin to develop earlier in life. The objective of this paper was to investigate the association between the prevalence of sarcopenia and postural dysfunction in a wide age range of adults, using data from a nationally representative cohort study in Korea. Korea National Health and Nutrition Examination Survey V (KNHANES V, 2010–2012) data from the fifth cross-sectional survey of the South Korean population, performed by the Korean Ministry of Health and Welfare, were used. Appendicular skeletal muscle mass (ASM)/height (ht)2 was used to define sarcopenia, and the Modified Romberg test using a foam pad (“foam balance test”) was performed to evaluate postural dysfunction. ASM/ht2 was lower in women and significantly decreased with age in men. Subjects with sarcopenia were significantly more likely to fail the foam balance test, regardless of sex and age.
Regression analysis showed a significant relationship between sarcopenia and postural dysfunction (OR: 2.544, 95% CI: 1.683–3.846, p<0.001). Multivariate regression analysis revealed that sarcopenia (OR: 1.747, 95% CI: 1.120–2.720, p = 0.014) and age (OR: 1.131, 95% CI: 1.105–1.158, p<0.001) are independent risk factors for postural instability. In middle-aged subjects, the adjusted OR for sarcopenia was 3.344 (95% CI: 1.350–8.285, p = 0.009). The prevalence of postural dysfunction is higher in sarcopenia patients, independent of sex and age.
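For readers who want to reproduce this type of analysis, a minimal logistic-regression sketch is given below. It runs on a small synthetic dataset in statsmodels; the variable names (balance_fail, sarcopenia, age, sex) are hypothetical stand-ins for the KNHANES variables, and the simulated numbers are not the study's data.

```python
# Minimal sketch: odds ratio for postural dysfunction (foam balance test failure)
# by sarcopenia status, adjusted for age and sex. Synthetic data; variable names
# are hypothetical and do not reflect the actual KNHANES coding.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(20, 80, n),
    "sex": rng.choice(["M", "F"], n),
    "sarcopenia": rng.integers(0, 2, n),
})
# Simulate failure odds that increase with age and with sarcopenia.
logit_p = -6 + 0.06 * df["age"] + 0.9 * df["sarcopenia"]
df["balance_fail"] = rng.binomial(1, np.asarray(1 / (1 + np.exp(-logit_p))))

model = smf.logit("balance_fail ~ sarcopenia + age + C(sex)", data=df).fit(disp=False)
print(np.exp(model.params))      # odds ratios, e.g. for sarcopenia
print(np.exp(model.conf_int()))  # 95% CIs on the OR scale
```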

]]>
<![CDATA[Incidence and determinants of Implanon discontinuation: Findings from a prospective cohort study in three health zones in Kinshasa, DRC]]> https://www.researchpad.co/article/elastic_article_7634 Kinshasa is Africa's third largest city and one of the continent’s most rapidly growing urban areas. PMA2020 data showed that Kinshasa had a modern contraceptive prevalence of 26.5% among married women in 2018. In Kinshasa’s method mix, the contraceptive implant recently became the dominant method among contraceptive users who are married or in union. This study provides insight into patterns of implant use in a high-fertility setting by evaluating the 24-month continuation rate for Implanon NXT and identifying the characteristics associated with discontinuation.

Methodology

This community-based, prospective cohort study followed 531 Implanon users aged 18–49 years at 6, 12 and 24 months. The following information was collected: socio-demographic characteristics, Method Information Index (MII) and contraceptive history. The main outcome variable for this study was implant discontinuation. The incidence rate of discontinuation is presented as events per 1,000 person-months (p-m) from the date of enrolment. Cox proportional hazards modelling was used to identify predictors of discontinuation.

Results

A total of 9,158.13 p-m were available for analysis, with an overall incidence rate of 9.06 (95% CI: 9.04–9.08) removals per 1,000 p-m. Of nine possible covariates tested, the likelihood of discontinuation was higher among women who lived in military camps, had fewer than three children, had never used injectables or implants in the past, had experienced heavy/prolonged bleeding, or whose MII score was less than 3.

Conclusion

In addition to four client characteristics that predicted discontinuation, we identified one programmatic factor: quality of counseling, as measured by the Method Information Index. Community providers in similar contexts should pay more attention to clients with fewer than three children, to new adopters, and to clients living in military camps as an underserved population with less access to health facilities. More targeted counselling and follow-up are needed, especially on bleeding patterns. ]]> <![CDATA[Highly efficient serum-free manipulation of miRNA in human NK cells without loss of viability or phenotypic alterations is accomplished with TransIT-TKO]]> https://www.researchpad.co/article/N4e6e8e95-63ae-420d-a6d7-c2f1aa3d99e6

Natural killer (NK) cells are innate lymphocytes with functions that include target cell killing, inflammation and regulation. NK cells integrate incoming activating and inhibitory signals through an array of germline-encoded receptors to gauge the health of neighbouring cells. The reactive potential of NK cells is influenced by microRNA (miRNA), small non-coding sequences that interfere with mRNA expression. miRNAs are highly conserved between species, and a single miRNA can have hundreds to thousands of targets and influence entire cellular programs. Two miRNA species, miR-155-5p and miR-146a-5p are known to be important in controlling NK cell function, but research to best understand the impacts of miRNA species within NK cells has been bottlenecked by a lack of techniques for altering miRNA concentrations efficiently and without off-target effects. Here, we describe a non-viral and straightforward approach for increasing or decreasing expression of miRNA in primary human NK cells. We achieve >90% transfection efficiency without off-target impacts on NK cell viability, education, phenotype or function. This opens the opportunity to study and manipulate NK cell miRNA profiles and their impacts on NK cellular programs which may influence outcomes of cancer, inflammation and autoimmunity.

]]>
<![CDATA[Clustered micronodules as predominant manifestation on CT: A sign of active but indolently evolving pulmonary tuberculosis]]> https://www.researchpad.co/article/N48b20e2f-c3ed-4c3c-a251-c583ed3c8c8a

Objective

To investigate the prevalence, patient characteristics, and natural history of clustered micronodules (CMs) in active pulmonary tuberculosis.

Materials and methods

From January 2013 through July 2018, 833 consecutive patients with bacteriologically or polymerase chain reaction–proven active pulmonary tuberculosis were retrospectively evaluated. CMs were defined as a localized aggregation of multiple dense, discrete micronodules, primarily distributed around small airways distal to the level of the segmental bronchus; small airways surrounded by CMs maintained luminal patency, and the CMs could coalesce into a larger nodule. The patients were dichotomized according to whether the predominant computed tomography (CT) abnormality was CMs. We analyzed radiologic and pathologic findings in patients whose predominant diagnostic CT abnormality was CMs, along with those of incidental pre-diagnostic CT scans, if available. Chi-square, McNemar, Student's t-test and Wilcoxon signed-rank tests were performed.

Results

CMs were the predominant CT abnormality in 2.6% of the patients (22/833; 95% CI, 1.8–4.0%). These patients had lower sputum smear-positivity (4.8% vs 31.0%; p = .010) and a similar proportion of immunocompromised status (40.9% vs 46.0%; p = .637) compared with patients in whom CMs were not the predominant CT abnormality. The time interval for minimal radiologic progression was 6.4 months. The extent of CMs increased with disease progression, frequently accompanied by consolidation and small-airway wall thickening. Pathologically, smaller CMs were non-caseating granulomas confined to the peribronchiolar interstitium, whereas larger CMs were caseating granulomas involving lung parenchyma. Two of the five patients with a pre-diagnostic CT scan obtained more than 50 months before diagnosis showed an incipient stage of CMs, in which the CMs appeared as small peribronchiolar nodules.
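The reported prevalence and its confidence interval can be re-derived from the raw counts (22 of 833). A minimal sketch, assuming a Wilson score interval (the abstract does not state which CI method was used):

```python
# Re-derive the prevalence of CM-predominant disease and a 95% CI from 22/833.
# The Wilson interval is an assumption; the authors' exact CI method is not stated.
from statsmodels.stats.proportion import proportion_confint

count, nobs = 22, 833
prevalence = count / nobs
low, high = proportion_confint(count, nobs, alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.1%}, 95% CI {low:.1%}-{high:.1%}")
# -> roughly 2.6% (1.8%-4.0%), matching the figures quoted above
```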

Conclusion

Active pulmonary tuberculosis manifested predominantly as CMs in 2.6% of patients, with infrequent acid-fast bacilli smear-positivity and no association with impaired host immunity. CMs progressed indolently, accompanied by consolidation and small-airway wall thickening, and originated from small peribronchiolar nodules.

]]>
<![CDATA[Trajectories of fatigue among stroke patients from the acute phase to 18 months post-injury: A latent class analysis]]> https://www.researchpad.co/article/Nc2795b82-f9e4-46cc-9fc3-23c3f213e7d4

Introduction

Post-stroke fatigue (PSF) is a common symptom, affecting 23–75% of stroke survivors. It is associated with an increased risk of institutionalization and death, and many patients consider it among the worst symptoms to cope with after stroke. Longitudinal studies focusing on trajectories of fatigue may contribute to understanding patients’ experience of fatigue over time and its associated factors, yet only a few have been conducted to date.

Objectives

To explore whether subgroups of stroke survivors with distinct trajectories of fatigue in the first 18 months post stroke could be identified and whether these subgroups differ regarding sociodemographic, medical and/or symptom-related characteristics.

Materials and methods

115 patients with first-ever stroke admitted to Oslo University Hospital or Buskerud Hospital were recruited, and data were collected prospectively during the acute phase and at 6, 12 and 18 months post stroke. Data on fatigue (both pre- and post-stroke) and on sociodemographic, medical and symptom-related characteristics were collected through structured interviews, standardized questionnaires and the patients’ medical records.

Growth mixture modeling (GMM) was used to identify latent classes, i.e., subgroups of patients, based on their Fatigue Severity Scale (FSS) scores at the four time points. Differences in sociodemographic, medical, and symptom-related characteristics between the latent classes were evaluated using univariate and multivariable ordinal regression analyses.
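Growth mixture modeling is typically run in specialised packages (e.g. Mplus or the R package lcmm). A much simplified stand-in for the idea, clustering four-time-point FSS trajectories into three latent classes with a plain Gaussian mixture in scikit-learn, is sketched below on synthetic scores; it is not the authors' model, which additionally estimates within-class growth curves.

```python
# Simplified stand-in for growth mixture modeling: cluster FSS trajectories
# (acute phase, 6, 12, 18 months) into three latent classes with a Gaussian
# mixture. Synthetic data; true GMM also models within-class growth over time.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# 115 patients x 4 time points, simulated around three trajectory levels (low/moderate/high).
levels = rng.choice([2.5, 4.0, 5.5], size=115)
fss = levels[:, None] + rng.normal(0, 0.5, size=(115, 4))

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
classes = gmm.fit_predict(fss)       # latent-class assignment per patient
print(np.bincount(classes))          # class sizes
print(gmm.means_.round(2))           # mean FSS per class and time point
```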

Results and their significance

Using GMM, three latent classes of fatigue trajectories over 18 months were identified, characterized by differing levels of fatigue: low, moderate and high. The mean FSS score for each class remained relatively stable across all four time points. In the univariate analyses, age <75, pre-stroke fatigue, multiple comorbidities, current depression, disturbed sleep and some ADL impairment were associated with higher fatigue trajectories. In the multivariable analyses, pre-stroke fatigue (OR 4.92, 95% CI 1.84–13.2), multiple comorbidities (OR 4.52, 95% CI 1.85–11.1) and not working (OR 4.61, 95% CI 1.36–15.7) were the strongest predictors of higher fatigue trajectories. The findings of this study may be helpful for clinicians in identifying patients at risk of developing chronic fatigue after stroke.

]]>
<![CDATA[A new simple brain segmentation method for extracerebral intracranial tumors]]> https://www.researchpad.co/article/Nb837d809-9647-425d-8dfd-2c3174a6dd80

Normal brain segmentation is available via FreeSurfer, VBM, and IBASPM software. However, these software packages cannot segment the brain of patients with brain tumors. Extracerebral tumors damage the brain mainly by pushing and compressing it while leaving its structure intact. Three-dimensional (3D) imaging, augmented reality (AR), and virtual reality (VR) technology have begun to be applied in clinical practice. The free, open-source medical software 3D Slicer allows us to perform 3D simulations on a computer and requires little user interaction. Moreover, 3D Slicer can integrate with the third-party software mentioned above. The relationship between the tumor and surrounding brain tissue can be judged, but accurate brain segmentation cannot be performed using 3D Slicer alone. In this study, we combine 3D Slicer and FreeSurfer to provide a novel brain segmentation method for extracerebral tumors. This method can help surgeons identify the “real” relationship between the lesion and adjacent brain tissue before surgery and improve preoperative planning.
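The FreeSurfer half of such a pipeline is usually driven from the command line; a hedged sketch of how the cortical reconstruction step might be launched from Python is shown below. The subject identifier and input volume are placeholders, FreeSurfer must be installed with SUBJECTS_DIR configured, and the exact options used in the study are not stated in the abstract.

```python
# Hypothetical wrapper around FreeSurfer's recon-all for the segmentation step.
# "patient01" and "T1.nii.gz" are placeholder names; FreeSurfer must be installed
# and SUBJECTS_DIR configured for this to run.
import subprocess

subject_id = "patient01"
t1_volume = "T1.nii.gz"

subprocess.run(
    ["recon-all", "-s", subject_id, "-i", t1_volume, "-all"],
    check=True,  # raise if FreeSurfer reports an error
)
# The resulting surfaces/labels can then be loaded into 3D Slicer alongside the
# tumor model to inspect the lesion-brain relationship described above.
```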

]]>
<![CDATA[Association between boarding in the emergency department and in-hospital mortality: A systematic review]]> https://www.researchpad.co/article/N48ef4c13-827b-4694-911d-7d7581473712

Importance

Boarding in the emergency department (ED) is a critical indicator of quality of care for hospitals. It is defined as the time between the admission decision and departure from the ED. As a result of boarding, patients stay in the ED until inpatient beds are available; moreover, boarding is associated with various adverse events.

Study objective

The objective of our systematic review was to determine whether ED boarding (EDB) time is associated with in-hospital mortality (IHM).

Methods

A systematic search was conducted in academic databases to identify relevant studies: Medline, PubMed, Scopus, Embase, the Cochrane Library, Web of Science, CINAHL and PsycINFO were searched. We included all peer-reviewed studies published up to November 2018. Studies performed in the ED and focused on the association between EDB and IHM as the primary objective were included. Extracted data included study characteristics, prognostic factors, outcomes, and IHM. A search update in PubMed was performed in May 2019 to ensure the inclusion of recent studies before publication.

Results

From the initial 4,321 references found through the systematic search, the manual screening of reference lists and the updated search in PubMed, a total of 12 studies were identified as eligible for descriptive analysis. Overall, six studies found an association between EDB and IHM, while five studies showed no association. The one remaining study included both ICU and non-ICU subgroups and showed conflicting results, with a positive association for non-ICU patients but no association for ICU patients. Overall, a tendency toward an association between EDB and IHM was observed using a pooled random-effects estimate.

Conclusion

Our systematic review did not find strong evidence for an association between ED boarding and IHM, but there is a tendency toward such an association. Further well-controlled, international multicenter studies are needed to demonstrate whether this association exists and whether there is a specific EDB time cut-off that results in increased IHM.

]]>
<![CDATA[Chalcone synthase (CHS) family members analysis from eggplant (Solanum melongena L.) in the flavonoid biosynthetic pathway and expression patterns in response to heat stress]]> https://www.researchpad.co/article/N0c4703df-5c43-4557-a077-ba839b092c8d

Enzymes of the chalcone synthase (CHS) family participate in the synthesis of multiple secondary metabolites in plants, fungi and bacteria. CHS expression shows a significant correlation with anthocyanin accumulation patterns. Peel color, which is primarily determined by anthocyanin content, is an economically important trait for eggplants that is affected by heat stress. A total of 7 putative CHS genes (SmCHS1–7) were identified in a genome-wide analysis of eggplant (S. melongena L.). The SmCHS genes were distributed on 7 scaffolds and were classified into 3 clusters. Phylogenetic analysis showed that 73 CHS genes from 7 Solanaceae species were classified into 10 groups. SmCHS5, SmCHS6 and SmCHS7 were continuously down-regulated under 38°C and 45°C treatment, while SmCHS4 was up-regulated at 38°C but showed little change at 45°C in the peel. Expression profiles of key anthocyanin biosynthesis gene families showed that the PAL, 4CL and AN11 genes were primarily expressed in all five tissues. The CHI, F3H, F3’5’H, DFR, 3GT and bHLH1 genes were expressed in flower and peel. Under heat stress, the expression levels of 52 key genes were reduced. In contrast, eight key genes with expression patterns similar to SmCHS4 were up-regulated after 3 hours of treatment at 38°C. Comparative analysis of putative CHS protein evolutionary relationships, cis-regulatory elements, and regulatory networks indicated that the SmCHS gene family has a conserved gene structure and functional diversification. SmCHS genes showed two or more expression patterns, and the results of this study may facilitate further research to understand the regulatory mechanism governing peel color in eggplants.
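As an illustration of the phylogenetic step, a minimal neighbor-joining sketch with Biopython is given below. The alignment file name is a placeholder, and the tree-building software and parameters actually used in the study are not specified in the abstract.

```python
# Minimal sketch: neighbor-joining tree of CHS protein sequences from a
# pre-computed multiple alignment. "chs_alignment.fasta" is a placeholder file
# containing aligned CHS sequences from the Solanaceae species of interest.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("chs_alignment.fasta", "fasta")     # aligned CHS proteins
distances = DistanceCalculator("blosum62").get_distance(alignment)
tree = DistanceTreeConstructor().nj(distances)               # neighbor-joining tree
Phylo.draw_ascii(tree)                                       # quick text rendering
```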

]]>
<![CDATA[Lateral flow immunoassay (LFIA) for the detection of lethal amatoxins from mushrooms]]> https://www.researchpad.co/article/N089b971a-62b1-4256-a74f-acfba8aef66c

The mushroom poisons that cause the most deaths are the class of toxins known as amatoxins. Current methods to sensitively and selectively detect these toxins are limited by the need for expensive equipment, or they lack accuracy due to cross-reactivity with other chemicals found in mushrooms. In this work, we report the development of a competition-based lateral flow immunoassay (LFIA) for the rapid, portable, selective, and sensitive detection of amatoxins. Our assay clearly indicates the presence of 10 ng/mL of α-AMA or γ-AMA, and the method, including extraction and detection, can be completed in approximately 10 minutes. The test can be easily read by eye and has a presumed shelf-life of at least 1 year. From testing 110 wild mushrooms, the LFIA identified 6 out of 6 species that were known to contain amatoxins. Other poisonous mushrooms known not to contain amatoxins tested negative by LFIA. This LFIA can be used to quickly identify amatoxin-containing mushrooms.

]]>
<![CDATA[Antibiotic use for Australian Aboriginal children in three remote Northern Territory communities]]> https://www.researchpad.co/article/N999fa4e6-a15c-456a-862e-2e1ce88316a9

Objective

To describe antibiotic prescription rates for Australian Aboriginal children aged <2 years living in three remote Northern Territory communities.

Design

A retrospective cohort study using electronic health records.

Setting

Three primary health care centres located in the Katherine East region.

Participants

Consent was obtained from 149 mothers to extract data from 196 child records. There were 124 children born between January 2010 and July 2014 who resided in one of the three chosen communities and had electronic health records for their first two years of life.

Main outcome measures

Antibiotic prescription rates, factors associated with antibiotic prescription and factors associated with appropriate antibiotic prescription.

Results

There were 5,675 Primary Health Care (PHC) encounters for 124 children (median 41, IQR 25.5, 64). Of the 5,675 PHC encounters, 1,542 (27%) recorded at least one infection (total 1,777) and 1,330 (23%) had at least one antibiotic prescription recorded (total 1,468). Children had a median five (IQR 2, 9) prescriptions in both their first and second year of life, with a prescription rate of 5.99/person year (95% CI 5.35, 6.63). Acute otitis media was the most common infection (683 records, 38%) and Amoxycillin was the most commonly prescribed antibiotic (797 prescriptions, 54%). Of the 1,468 recorded prescriptions, 398 (27%) had no infection recorded and 116 (8%) with an infection recorded were not aligned with local treatment guidelines.
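The per-person-year rate can be approximated directly from the counts above. The sketch below approximates person-time as 124 children followed for their first two years and uses an exact Poisson interval; because it ignores the exact follow-up time and clustering of prescriptions within children, it will not exactly reproduce the published estimate or its wider 95% CI.

```python
# Naive re-computation of the antibiotic prescription rate per person-year.
# Person-time is approximated as 124 children x 2 years; the exact Poisson CI
# below ignores clustering within children, so it is narrower than the published CI.
from scipy.stats import chi2

events = 1468            # antibiotic prescriptions recorded
person_years = 124 * 2   # approximate follow-up (first two years of life)

rate = events / person_years
lower = chi2.ppf(0.025, 2 * events) / (2 * person_years)
upper = chi2.ppf(0.975, 2 * (events + 1)) / (2 * person_years)
print(f"rate = {rate:.2f}/person-year (exact Poisson 95% CI {lower:.2f}-{upper:.2f})")
```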

Conclusion

Prescription rates for Australian Aboriginal children in these communities are significantly higher than those reported nationally for non-Aboriginal Australians. Prescriptions predominantly aligned with treatment guidelines in this setting, where there is a high burden of infectious disease.

]]>
<![CDATA[Association between cigarette smoking and the risk of dysmenorrhea: A meta-analysis of observational studies]]> https://www.researchpad.co/article/N299c77b6-7bcf-4190-8f14-766ff39e61a2

Background

Previous studies have reported inconsistent results on the potential relationship between cigarette smoking and dysmenorrhea. Therefore, the aim of this study was to quantitatively synthesize the existing findings on this relationship using meta-analysis.

Methods

Previous studies on the association between cigarette smoking and dysmenorrhea, published no later than November 2019, were systematically searched, using MeSH headings and/or relevant terms, in the electronic databases PubMed, Medline, Web of Science and EMBASE. The I2 statistic was used to assess heterogeneity, whose sources were explored using subgroup analysis. A pooled effect size was obtained using a random-effects model, and sensitivity analysis was performed to assess the consistency of the pooled effect size.
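As an illustration of the pooling step, a minimal DerSimonian-Laird random-effects sketch on log odds ratios is given below. The study-level ORs and standard errors are synthetic placeholders, not the studies included in this meta-analysis, and the authors' actual software and estimator are not stated in the abstract.

```python
# DerSimonian-Laird random-effects pooling of log odds ratios.
# The example ORs and standard errors are synthetic, not the included studies.
import numpy as np

or_values = np.array([1.2, 1.6, 1.4, 1.9, 1.1])       # hypothetical study ORs
se_log_or = np.array([0.15, 0.20, 0.10, 0.25, 0.18])  # SE of log(OR) per study

y = np.log(or_values)
w = 1 / se_log_or**2                                   # inverse-variance (fixed-effect) weights
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)     # Cochran's Q
df = len(y) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                          # between-study variance
i2 = max(0.0, (q - df) / q) * 100                      # I^2 heterogeneity (%)

w_star = 1 / (se_log_or**2 + tau2)                     # random-effects weights
pooled = np.sum(w_star * y) / np.sum(w_star)
se_pooled = np.sqrt(1 / np.sum(w_star))
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)
print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {i2:.0f}%")
```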

Results

After a rigorous screening process, 24 studies involving 27,091 participants were included in this meta-analysis. The results indicated that smokers had 1.45 times the odds of dysmenorrhea compared with non-smokers (odds ratio (OR) = 1.45, 95% confidence interval (CI): 1.30–1.61). In addition, current smokers had 1.50 times the odds of dysmenorrhea compared with never smokers (OR = 1.50, 95% CI: 1.33–1.70), whereas former smokers had 1.31 times the odds compared with never smokers (OR = 1.31, 95% CI: 1.18–1.46). Sensitivity analysis showed that exclusion of any single study did not materially alter the overall combined effect.

Conclusion

The evidence from this meta-analysis indicates a significant association between cigarette smoking (both current and former smoking) and dysmenorrhea. The adverse effects of smoking provide further support for the prevention of dysmenorrhea and emphasize the need to target women in smoking prevention efforts.

]]>
<![CDATA[Nocturnal hypercapnia with daytime normocapnia in patients with advanced pulmonary arterial hypertension awaiting lung transplantation]]> https://www.researchpad.co/article/Nacc6463a-eb28-4f4a-acf0-c81fc9df01f4

Background

Pulmonary arterial hypertension (PAH) is frequently complicated by sleep-disordered breathing (SDB), and previous studies have largely focused on hypoxemic SDB. Even though nocturnal hypercapnia has been shown to exacerbate pulmonary hypertension, its clinical significance among PAH patients has scarcely been investigated.

Method

Seventeen patients with PAH were identified from 246 consecutive patients referred to Kyoto University Hospital for the evaluation of lung transplant registration from January 2010 to December 2017. Included in this study were 13 patients whose nocturnal transcutaneous carbon dioxide partial pressure (PtcCO2) monitoring data were available. Nocturnal hypercapnia was diagnosed according to the guidelines of the American Academy of Sleep Medicine. Associations of nocturnal PtcCO2 measurements with clinical features, the findings of right heart catheterization and pulmonary function parameters were evaluated.

Results

Nocturnal hypercapnia was diagnosed in six patients (46.2%), while no patient had daytime hypercapnia. Of note, nocturnal hypercapnia was found in five of the six patients with idiopathic PAH (83.3%). Mean nocturnal PtcCO2 levels correlated negatively with the percentage of predicted total lung capacity (TLC) and positively with cardiac output and cardiac index.
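With only 13 patients, such correlations are typically simple bivariate tests. A minimal sketch using Spearman's rank correlation on placeholder values is shown below; both the values and the choice of Spearman's coefficient are assumptions, since the abstract does not state the method used.

```python
# Illustrative correlation of mean nocturnal PtcCO2 with % predicted TLC.
# Values are placeholders; the correlation method (Spearman) is an assumption.
from scipy.stats import spearmanr

ptcco2 = [48, 52, 45, 55, 50, 47, 58, 44, 51, 49, 53, 46, 56]      # mmHg, hypothetical
tlc_pct = [95, 80, 102, 72, 88, 98, 65, 105, 85, 90, 78, 100, 70]  # % predicted, hypothetical

rho, p = spearmanr(ptcco2, tlc_pct)
print(f"rho = {rho:.2f}, p = {p:.3f}")  # a negative rho would mirror the reported trend
```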

Conclusion

Nocturnal hypercapnia was prevalent among advanced PAH patients awaiting lung transplantation and was associated with %TLC. Nocturnal hypercapnia was also associated with increased cardiac output, which might potentially worsen pulmonary hypertension, especially during sleep. Further studies are needed to investigate hemodynamics during sleep and to clarify whether nocturnal hypercapnia can be a therapeutic target in PAH patients.

]]>
<![CDATA[Toxin-neutralizing antibodies elicited by naturally acquired cutaneous anthrax are elevated following severe disease and appear to target conformational epitopes]]> https://www.researchpad.co/article/N0733fdcc-4c39-44e4-82cd-032e69d54dbc

Understanding immune responses to native antigens in response to natural infections can lead to improved approaches to vaccination. This study sought to characterize the humoral immune response to anthrax toxin components, capsule and spore antigens in individuals (n = 46) from the Kayseri and Malatya regions of Turkey who had recovered from mild or severe forms of cutaneous anthrax infection, compared to regional healthy controls (n = 20). IgG antibodies to each toxin component, the poly-γ-D-glutamic acid capsule, the Bacillus collagen-like protein of anthracis (BclA) spore antigen, and the spore carbohydrate anthrose, were detected in the cases, with anthrax toxin neutralization and responses to Protective Antigen (PA) and Lethal Factor (LF) being higher following severe forms of the disease. Significant correlative relationships among responses to PA, LF, Edema Factor (EF) and capsule were observed among the cases. Though some regional control sera exhibited binding to a subset of the tested antigens, these samples did not neutralize anthrax toxins and lacked correlative relationships among antigen binding specificities observed in the cases. Comparison of serum binding to overlapping decapeptides covering the entire length of PA, LF and EF proteins in 26 cases compared to 8 regional controls revealed that anthrax toxin-neutralizing antibody responses elicited following natural cutaneous anthrax infection are directed to conformational epitopes. These studies support the concept of vaccination approaches that preserve conformational epitopes.
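The decapeptide-mapping step amounts to tiling each toxin protein with short overlapping windows. A minimal sketch of generating such peptides is shown below; the dummy sequence and the one-residue step size are illustrative assumptions, as the actual peptide offset used in the study is not given in the abstract.

```python
# Generate overlapping decapeptides covering a protein sequence (sliding window).
# The example sequence is a dummy placeholder (not the real PA/LF/EF sequence),
# and the step size of 1 residue is an illustrative assumption.
def overlapping_peptides(sequence: str, length: int = 10, step: int = 1):
    """Yield (1-based start position, peptide) for every window of `length` residues."""
    for start in range(0, len(sequence) - length + 1, step):
        yield start + 1, sequence[start:start + length]

dummy_fragment = "ACDEFGHIKLMNPQRSTVWYACDEFGHIKL"  # placeholder, not an actual toxin sequence
for pos, pep in overlapping_peptides(dummy_fragment):
    print(pos, pep)
```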

]]>
<![CDATA[Host immune responses during Taenia solium Neurocysticercosis infection and treatment]]> https://www.researchpad.co/article/Nc0d0d75e-fba6-45d6-a2e4-1505f9de6f1c

Taenia solium cysticercosis and taeniasis (TSCT), caused by the tapeworm T. solium, is a foodborne and zoonotic disease that has been classified by WHO as a neglected tropical disease since 2010. It has a considerable impact on health and the economy and is one of the leading causes of acquired epilepsy in most endemic countries of Latin America, Sub-Saharan Africa, and Asia. There is some evidence that the prevalence of TSCT in high-income countries has recently increased, mainly due to immigration from endemic areas. In regions endemic for TSCT, human cysticercosis can manifest clinically as neurocysticercosis (NCC), resulting in epileptic seizures and severe progressive headaches, amongst other neurological signs and/or symptoms. The development of these symptoms results from a complex interplay between anatomical cyst localization, environmental factors, the parasite’s infective potential, host genetics, and, especially, host immune responses. Treatment of individuals with active NCC (presence of viable cerebral cysts) with anthelmintic drugs together with steroids is usually effective and, in the majority of cases, reduces the number and/or size of cerebral lesions as well as the neurological symptoms. However, in some cases, anthelmintic treatment may profoundly enhance inflammatory responses, with ensuing symptoms that would otherwise have remained silent as long as the cysts were viable. This intriguing silencing process is not yet fully understood but may involve active modulation of host responses by cyst-derived immunomodulatory components released directly into the surrounding brain tissue, or by the induction of regulatory networks including regulatory T cells (Treg) or regulatory B cells (Breg). These processes might be disturbed once the cysts undergo treatment-induced apoptosis and necrosis, or in a coinfection setting such as HIV. Herein, we review the current literature regarding the immunology and pathogenesis of NCC, with a highlight on the mobilization of immune cells during human NCC and their interaction with viable and degenerating cysticerci. Moreover, the immunological parameters associated with NCC in people living with HIV/AIDS and its treatment are discussed. Finally, we propose open questions to understand the role of the immune system and its impact in this intriguing host–parasite crosstalk.

]]>