Search results
Found 41,535 matches.
A new typhoid vaccine for both adults and children has been shown by Oxford researchers, including Prof Brian Angus, to be safe and effective in preventing the disease.
Acute Plasmodium yoelii 17XNL Infection During BCG Vaccination Limits T Cell Responses and Mycobacterial Growth Inhibition.
Tuberculosis and malaria overlap in many sub-Saharan African countries where Bacillus Calmette-Guérin (BCG) vaccination is routinely administered. The aim of this study was to determine whether the timing of BCG vaccination relative to a malaria infection has implications for BCG vaccine efficacy. Mice were intradermally vaccinated with BCG either 4 weeks before infection with blood-stage Plasmodium yoelii 17XNL, at 13 days post-infection (during an acute blood-stage malaria infection), or 21 days post-infection (after clearance of P. yoelii 17XNL infection). Ex vivo control of mycobacterial growth by splenocytes was used as a surrogate of protective efficacy, and PPD-specific T-cell responses were quantified by flow cytometry. No differences in mycobacterial growth control were detected between mice receiving BCG vaccination alone and groups vaccinated before infection or after clearance of P. yoelii 17XNL infection. Poorer control of mycobacterial growth was observed when BCG was administered during an acute malarial infection compared with BCG vaccination alone or BCG vaccination after blood-stage malaria infection, and mycobacterial growth correlated negatively with the magnitude of total cytokine production from PPD-specific CD4+ T cells.
Vaccine effects on in-hospital COVID-19 outcomes.
Here, we posit that studies comparing outcomes of patients hospitalised with COVID-19 by vaccination status are important descriptive epidemiologic studies, but contrast two groups that are not comparable for the purposes of causal analysis. We use the principal stratification framework to show that these studies can estimate a causal vaccine effect only for the subgroup of individuals who would be hospitalised with or without vaccination. Further, we describe the methodology for, and present sensitivity analyses of, this effect. Using this approach can change the interpretation of studies that only report the standard analyses conditioning on observed hospital admission status, that is, analyses comparing outcomes for all hospitalised COVID-19 patients by vaccination status.
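The principal-stratification argument can be made concrete with a small numerical sketch. Under the additional (assumed) monotonicity condition that vaccination never causes a hospitalisation that would not otherwise have occurred, the "always hospitalised" stratum makes up a fraction pi = P(hosp | vaccinated) / P(hosp | unvaccinated) of the unvaccinated hospitalised group, and trimming-style bounds partially identify their outcome. All numbers below are hypothetical, not taken from the paper:

```python
def trimming_bounds(p_hosp_vax, p_hosp_unvax, death_rate_unvax_hosp):
    """Zhang-Rubin style trimming bounds for a binary outcome.

    Under monotonicity (vaccination never causes hospitalisation),
    every vaccinated hospitalised patient is 'always hospitalised',
    and that stratum makes up a fraction pi of the unvaccinated
    hospitalised group. The death rate among unvaccinated
    always-hospitalised patients is only partially identified;
    trimming the best/worst-case allocation of deaths gives bounds.
    """
    pi = p_hosp_vax / p_hosp_unvax           # stratum share among unvax hospitalised
    d = death_rate_unvax_hosp
    lower = max(0.0, (d - (1.0 - pi)) / pi)  # deaths pushed outside the stratum
    upper = min(1.0, d / pi)                 # deaths concentrated in the stratum
    return lower, upper

# Hypothetical illustration: vaccination halves hospitalisation risk,
# and 20% of unvaccinated hospitalised patients die.
lo, hi = trimming_bounds(p_hosp_vax=0.02, p_hosp_unvax=0.04,
                         death_rate_unvax_hosp=0.20)
# pi = 0.5, so the always-hospitalised death rate among the
# unvaccinated lies between 0.0 and 0.4
```

Comparing the vaccinated hospitalised death rate against this interval is the kind of sensitivity analysis the authors describe; when pi = 1 (vaccination does not affect hospitalisation), the interval collapses to a point and the naive comparison is fully causal.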
Prevalence of hepatitis B virus infection among pregnant women and cord blood hepatitis B surface antigen positive newborns in sub-Saharan Africa and South Asia
Background: Newborns infected with hepatitis B virus (HBV) are at risk of chronic liver disease and hepatocellular carcinoma. Objectives: This study investigated the prevalence of HBV infection among pregnant women and cord blood hepatitis B surface antigen (HBsAg) positivity of their newborns in Bangladesh, Bhutan, India, Ethiopia, Mozambique, Kenya, Nigeria, Mali, and South Africa. Study design: Randomly selected paired maternal and cord blood samples (n = 101 per site) taken at delivery were tested for HBsAg and hepatitis B e antigen (HBeAg) in the women using a chemiluminescent microparticle immunoassay. Cord blood samples from the newborns were similarly assessed for HBsAg reactivity. HBV DNA was quantified using the Xpert® HBV viral load assay, followed by genotyping. Results: The overall prevalence of maternal HBsAg positivity was 5.5% (95% CI: 0.4%–7.1%; n = 50/909). HBsAg positivity was higher in the African countries (7.3%; 95% CI: 5.4%–9.6%; n = 44/606) than in the South Asian countries (2.0%; 95% CI: 0.8%–4.3%; n = 6/303; p = 0.002). Relative to South Africa, the odds of HBsAg seropositivity were higher in women from Mozambique (aOR: 7.7; 95% CI: 1.6–37.8) and Mali (aOR: 5.7; 95% CI: 1.1–29.7). The rate of HBsAg positivity in cord blood of babies born to HBsAg-positive women was 28.0% (95% CI: 17.1%–42.3%; n = 14/50), including 31.8% (95% CI: 19.5%–47.4%; n = 14/44) in the African countries. No cord blood HBsAg positivity was observed in South Asia. Genotypic analysis revealed that HBV genotypes A (41.7%) and E (58.3%) were predominant. Conclusion: The high rate of cord blood HBsAg positivity (28.0%) underscores the urgency of enhancing HBV prevention strategies to meet the World Health Organization's target of a 90% reduction in new HBV infections by 2030.
Using the microbiota to study connectivity at human–animal interfaces
Interfaces between humans, livestock, and wildlife, mediated by the environment, are critical points for the transmission and emergence of infectious pathogens and call for leveraging the One Health approach to understanding disease transmission. Current research on pathogen transmission often focuses on single-pathogen systems, providing a limited understanding of the broader microbial interactions occurring at these interfaces. In this review, we make a case for the study of host-associated microbiota for understanding connectivity between host populations at human–animal interfaces. First, we emphasize the need to understand changes in microbiota composition dynamics from interspecies contact. Then, we explore the potential for microbiota monitoring at such interfaces as a predictive tool for infectious disease transmission and as an early-warning system to inform public health interventions. We discuss the methodological challenges and gaps in knowledge in analyzing microbiota composition dynamics, the functional meaning of these changes, and how to establish causality between microbiota changes and health outcomes. We posit that integrating microbiota science with social-ecological systems modeling is essential for advancing our ability to manage health risks and harness opportunities arising from interspecies interactions.
CD56bright natural killer cells preferentially kill proliferating CD4+ T cells
Human CD56br natural killer (NK) cells represent a small subset of CD56+ NK cells in circulation and are largely tissue-resident. The frequency and number of CD56br NK cells in blood have been shown to increase following administration of low-dose IL-2 (LD-IL-2), a therapy aimed at specifically expanding CD4+ regulatory T cells (Tregs). Given the potential clinical application of LD-IL-2 immunotherapy across several immune diseases, including the autoimmune disease type 1 diabetes, a better understanding of the functional consequences of this expansion is urgently needed. In this study, we developed an in vitro co-culture assay with activated CD4+ T cells to measure NK cell killing efficiency. We show that CD56br and CD56dim NK cells are similarly efficient at killing activated CD4+ conventional T (Tconv) and Treg cell subsets. However, in contrast to CD56dim cells, CD56br NK cells preferentially target highly proliferative cells. We hypothesize that CD56br NK cells play an immunoregulatory role through the elimination of proliferating autoreactive CD4+ Tconv cells that have escaped Treg suppression. These results have implications for the interpretation of current and future trials of LD-IL-2 by providing evidence for a new, possibly beneficial immunomodulatory mechanism of LD-IL-2-expanded CD56br NK cells.
External validation of the LENT and PROMISE prognostic scores for malignant pleural effusion.
BACKGROUND: Accurate survival estimation in malignant pleural effusion is essential to guide clinical management strategies and inform patient discussions. The LENT and PROMISE scores were developed to aid prognostication in malignant pleural effusion; however, their uptake in practice has been limited. We aimed to conduct a detailed external validation of the LENT and PROMISE scores to develop recommendations regarding clinical utility and to highlight factors limiting performance. METHODS: Medical records of patients diagnosed with malignant pleural effusion between 2015 and 2023 at Oxford University Hospitals were retrospectively reviewed to determine length of survival and the LENT and PROMISE scores at diagnosis. Performance of the scores in predicting overall survival and chance of survival at 3, 6 and 12 months was assessed using measures of discrimination, calibration and overall model performance. Kaplan-Meier analysis and Cox models were used to further investigate individual score variables. RESULTS: 773 patients with malignant pleural effusion were included. Both scores showed predictive ability for overall survival; however, median survival estimates lacked precision. Score performance in predicting survival at 3, 6 and 12 months was stronger, with C-indices around 0.8 for both scores at each time point and the models appearing well calibrated. Limited stratification of tumour types and lack of consideration of sensitising mutations were identified as potential factors restricting performance. CONCLUSIONS: Both scores can prognosticate in malignant pleural effusion, and greater use in practice should be considered. However, areas to improve score performance were also highlighted, and these may aid future model development.
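The C-index of around 0.8 reported here is a rank-based measure of discrimination. For right-censored survival data it is commonly computed as Harrell's C: among patient pairs whose survival ordering is known, the fraction in which the higher predicted risk belongs to the shorter survivor. A minimal pure-Python sketch on toy data (not the study's data):

```python
def harrells_c(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter
    follow-up time actually had the event (death observed); the pair
    is concordant when that subject also has the higher risk score.
    Ties in risk score count as half-concordant.
    """
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable

# Hypothetical toy data: follow-up months, event indicator, model risk score.
# Every comparable pair is ordered correctly here, so C = 1.0; a score
# near 0.8, as in the study, means ~80% of comparable pairs rank correctly.
t = [2, 5, 8, 12]; e = [1, 1, 0, 1]; s = [0.9, 0.7, 0.4, 0.2]
```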
In vivo labelling resolves distinct temporal, spatial, and functional properties of tumour macrophages, and identifies subset-specific effects of PD-L1 blockade.
Tumour-associated macrophages (TAMs) are a universal feature of cancers but variably influence outcome and treatment responses. Here, we used a photoconvertible mouse to distinguish newly entering, monocyte-derived (md)TAMs that were enriched at the tumour core, from resident-like (r)TAMs that localised with fibroblasts at the tumour-normal interface. The mdTAM pool was highly dynamic and continually replenished by circulating monocytes. Upon tumour entry, these monocytes differentiated down two divergent fate trajectories distinguished by the expression of MHC class II. MHC-II+ mdTAMs were functionally distinct from MHC-II- mdTAMs, demonstrating increased capacity for endocytosis and FcγR-mediated phagocytosis, as well as pro-inflammatory cytokine production. Both mdTAM subsets showed reduced expression of inflammatory transcripts and increased expression of PD-L1 with increasing tumour dwell-time. Treatment with anti-PD-L1 skewed mdTAM differentiation towards the MHC-II+ fate and attenuated the anti-inflammatory effects of the tumour environment. Anti-PD-L1 enhanced mdTAM-CD4+ T-cell interactions, establishing an IFNγ-CXCL9/10-dependent positive feedback loop. Altogether, these data resolve distinct temporal, spatial and functional properties of TAMs, and provide evidence of subset-specific effects of PD-L1 blockade.
The TyphiNET data visualisation dashboard: unlocking Salmonella Typhi genomics data to support public health
Background: Salmonella enterica subspecies enterica serovar Typhi (abbreviated as ‘Typhi’) is the bacterial agent of typhoid fever. Effective antimicrobial therapy reduces complications and mortality; however, antimicrobial resistance (AMR) is a major problem in many endemic countries. Prevention is possible through vaccination with recently licensed typhoid conjugate vaccines (TCVs). National immunisation programmes are currently being considered or deployed in several countries where AMR prevalence is known to be high, and the Gavi vaccine alliance has provided financial support for their introduction. Pathogen whole-genome sequence data are a rich source of information on Typhi variants (genotypes or lineages), AMR prevalence, and mechanisms. However, this information is currently not readily accessible to non-genomics experts, including those driving vaccine implementation or empirical therapy guidance. Results: We developed TyphiNET (https://www.typhi.net), an interactive online dashboard for exploring Typhi genotype and AMR distributions derived from publicly available pathogen genome sequences. TyphiNET allows users to explore country-level summaries such as the frequency of pathogen lineages, temporal trends in resistance to clinically relevant antimicrobials, and the specific variants and mechanisms underlying emergent AMR trends. User-driven plots and session reports can be downloaded for ease of sharing. Importantly, TyphiNET is populated by high-quality genome data curated by the Global Typhoid Pathogen Genomics Consortium, analysed using the Pathogenwatch platform, and identified as coming from non-targeted sampling frames that are suitable for estimating AMR prevalence amongst Typhi infections (no personal data are included in the platform). As of February 2024, data from a total of n = 11,836 genomes from 101 countries are available in TyphiNET.
We outline case studies illustrating how the dashboard can be used to explore these data and gain insights of relevance to both researchers and public health policy-makers. Conclusions: The TyphiNET dashboard provides an interactive platform for accessing genome-derived data on pathogen variant frequencies to inform typhoid control and intervention strategies. The platform is extensible in terms of both data and features, and provides a model for making complex bacterial genome-derived data accessible to a wide audience.
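The country-level summaries such a dashboard exposes reduce to grouping per-genome records and computing prevalence. The sketch below illustrates that aggregation with a hypothetical record schema; the field names are assumptions, not TyphiNET's actual data model:

```python
from collections import defaultdict

def amr_prevalence_by_country(genomes, drug):
    """Per-country resistance prevalence from per-genome records.

    `genomes` is a list of dicts with (hypothetical) keys:
    'country', and 'resistant' mapping drug name -> bool.
    Returns {country: (n_resistant, n_total, prevalence)}.
    """
    counts = defaultdict(lambda: [0, 0])
    for g in genomes:
        res, tot = counts[g["country"]]
        counts[g["country"]] = [res + g["resistant"].get(drug, False), tot + 1]
    return {c: (r, n, r / n) for c, (r, n) in counts.items()}

# Hypothetical records (not TyphiNET's actual schema)
records = [
    {"country": "A", "resistant": {"ciprofloxacin": True}},
    {"country": "A", "resistant": {"ciprofloxacin": False}},
    {"country": "B", "resistant": {"ciprofloxacin": True}},
]
# amr_prevalence_by_country(records, "ciprofloxacin")
# -> {"A": (1, 2, 0.5), "B": (1, 1, 1.0)}
```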
Unveiling sub-populations in critical care settings: a real-world data approach in COVID-19.
Background: Disease presentation and progression can vary greatly in heterogeneous diseases such as COVID-19, with variability in patient outcomes even within the hospital setting. This variability underscores the need for tailored treatment approaches based on distinct clinical subgroups. Objectives: This study aimed to identify COVID-19 patient subgroups with unique clinical characteristics using real-world data (RWD) from electronic health records (EHRs) to inform individualized treatment plans. Materials and methods: A Factor Analysis of Mixed Data (FAMD)-based agglomerative hierarchical clustering approach was employed to analyze the real-world data, enabling the identification of distinct patient subgroups. Statistical tests evaluated cluster differences, and machine learning models classified the identified subgroups. Results: Three clusters of COVID-19 patients with unique clinical characteristics were identified. The analysis revealed significant differences in hospital stay durations and survival rates among the clusters, with more severe clinical features correlating with worse prognoses and machine learning classifiers achieving high accuracy in subgroup identification. Conclusion: By leveraging RWD and advanced clustering techniques, the study provides insights into the heterogeneity of COVID-19 presentations. The findings support the development of classification models that can inform more individualized and effective treatment plans, improving patient outcomes in the future.
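The pipeline pairs FAMD (which embeds mixed categorical/numeric EHR variables into a numeric space) with bottom-up hierarchical clustering. The sketch below illustrates only the agglomerative step, on toy points assumed to already be FAMD coordinates; the study's actual implementation details are not specified in the abstract:

```python
import math

def agglomerative(points, k):
    """Bottom-up hierarchical clustering (single linkage) to k clusters.

    Starts from singleton clusters and repeatedly merges the two
    closest clusters until only k remain. Single linkage defines
    inter-cluster distance as the closest pair across clusters.
    """
    clusters = [[p] for p in points]

    def dist(a, b):
        return min(math.dist(x, y) for x in a for y in b)

    while len(clusters) > k:
        # find and merge the two closest clusters
        i, j = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: dist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] += clusters.pop(j)
    return clusters

# Two obvious groups in a toy 2-D embedding space
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11)]
# agglomerative(pts, 2) separates {(0,0),(0,1),(1,0)} from {(10,10),(10,11)}
```

In practice the cut height (or k) would be chosen from the dendrogram or a cluster-validity index; the abstract does not state which linkage or criterion the authors used.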
Simplifying medicine dosing for children by harmonising weight bands across therapeutic areas.
Generally, dose recommendations for children are expressed as fixed dosing increments related to bodyweight, known as weight bands. The weight bands recommended in WHO treatment guidelines vary between diseases, leading to complexity and potential dosing errors when treating children for multiple diseases simultaneously. The introduction of a harmonised weight-banding approach for orally administered drugs across disease areas could streamline dosing for young children, but implementing such an approach would require changes to current dosing recommendations. In this Health Policy, we describe the process we conducted to: identify therapeutic areas for harmonisation of weight bands; propose a harmonised weight-banding system to align with the current use of weight bands in antibiotic guidance; and simulate the expected effect of dose adjustments due to weight-band harmonisation. Each step of this process, along with the effect and feasibility of weight-band harmonisation, was discussed with clinical, policy, and pharmacology experts convened by WHO, representing four therapeutic areas: tuberculosis, HIV, malaria, and hepatitis C. Dosing according to harmonised weight bands across the targeted therapeutic areas was found to be feasible and should be considered for implementation by WHO disease programmes through their appropriate normative processes.
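Operationally, weight-band dosing is a lookup from a child's weight to a fixed number of dosing units, which is why divergent bands across guidelines create error-prone double bookkeeping. The bands and unit counts below are invented for illustration only; they are not the WHO-harmonised bands or any drug's actual label:

```python
# Hypothetical harmonised weight bands (kg) and per-band unit counts;
# illustrative only -- not WHO-endorsed bands or real dosing guidance.
BANDS = [((3, 6), 1), ((6, 10), 2), ((10, 15), 3), ((15, 20), 4), ((20, 25), 5)]

def units_for_weight(weight_kg):
    """Return dosing units for a child's weight under the sketch bands.

    Bands are half-open [low, high), so every weight in range maps
    to exactly one band with no gaps or overlaps at the boundaries.
    """
    for (low, high), units in BANDS:
        if low <= weight_kg < high:
            return units
    raise ValueError("weight outside paediatric banding range")

# e.g. a 12 kg child falls in the 10-<15 kg band -> 3 units
```

With one shared `BANDS` table across therapeutic areas, a caregiver treating a child for two diseases at once applies the same band twice, which is the simplification the harmonisation proposal targets.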
Iron deficiency causes aspartate-sensitive dysfunction in CD8+ T cells.
Iron is an irreplaceable co-factor for metabolism. Iron deficiency affects >1 billion people, and decreased iron availability impairs immunity. Nevertheless, how iron deprivation impacts immune cell function remains poorly characterised. We interrogate how physiologically low iron availability affects CD8+ T cell metabolism and function, using multi-omic and metabolic labelling approaches. Iron limitation does not substantially alter initial post-activation increases in cell size and CD25 upregulation. However, low iron profoundly stalls proliferation (without influencing cell viability), alters histone methylation status and gene expression, and disrupts mitochondrial membrane potential. Glucose and glutamine metabolism in the TCA cycle is limited and partially reverses to a reductive trajectory. Previous studies identified mitochondria-derived aspartate as crucial for proliferation of transformed cells. Despite aberrant TCA cycling, aspartate is increased in stalled, iron-deficient CD8+ T cells but is not utilised for nucleotide synthesis, likely due to trapping within depolarised mitochondria. Exogenous aspartate markedly rescues expansion and some functions of severely iron-deficient CD8+ T cells. Overall, iron scarcity creates a mitochondria-located metabolic bottleneck, which can be bypassed by supplying inhibited biochemical processes with aspartate. These findings reveal the molecular consequences of iron deficiency for CD8+ T cell function, providing mechanistic insight into the basis of immune impairment during iron deficiency.
A roadmap of priority evidence gaps for the co-implementation of malaria vaccines and perennial malaria chemoprevention
Progress in malaria control will rely on the deployment and effective targeting of combinations of interventions, including malaria vaccines and perennial malaria chemoprevention (PMC). Several countries with PMC programmes have introduced malaria vaccination into their essential programmes on immunization, but empirical evidence on the impact of combining these two interventions, and on how best to co-implement them, is lacking. At the American Society of Tropical Medicine and Hygiene 2023 annual meeting, a stakeholder meeting was convened to identify key policy, operational and research gaps for co-implementation of malaria vaccines and PMC. Participants from 11 endemic countries attended, including representatives from national malaria and immunization programmes, the World Health Organization, researchers, implementing organizations and funders. Identified evidence gaps were prioritized to select urgent issues to inform co-implementation. The output of these activities is a strategic roadmap of priority malaria vaccine and PMC co-implementation evidence gaps, and solutions to address them. The roadmap was presented to stakeholders for feedback at the 2024 Multilateral Initiative on Malaria meeting and revised accordingly. The roadmap outlines four key areas of work to address urgent evidence gaps for co-implementation: (1) support to the global and national policy process, (2) implementation support and research, (3) clinical studies, and (4) modelling. Together, these areas will provide practical guidance on the co-implementation of the interventions, and robust evidence to inform decision-making on how best to design, optimize and scale up co-implementation in different contexts, including whether and in what contexts co-implementation is cost-effective, and the optimal schedule for co-implementation.
This will work towards supporting the policy process on co-implementation of malaria vaccines and PMC, and achieving the most impactful use of available resources for the prevention of malaria in children.
Handheld Spatially Offset Raman Spectroscopy for rapid non-invasive detection of ethylene glycol and diethylene glycol in medicinal syrups.
We investigate the potential of Spatially Offset Raman Spectroscopy (SORS) as a rapid, non-invasive screening tool deployable in the field to detect diethylene glycol (DEG) and ethylene glycol (EG) in medicinal syrups within closed containers. Measurements were performed on neat propylene glycol (PG) and glycerol, key components of many medicinal syrups, as well as on marketed medicinal syrup formulations spiked with DEG and EG at various concentrations to assess the technique's limit of detection in real-life samples. SORS detected both DEG and EG down to a ∼0.5% concentration level in neat PG and a ∼1% concentration level in neat glycerol. The DEG and EG detection thresholds for the marketed formulations, measured through the original bottles, were ∼1% for Benylin (active ingredient: glycerol) and Piriteze (active ingredient: cetirizine hydrochloride). For Calpol (active ingredient: paracetamol) the detection limit was higher: ∼2% for EG and ∼5% for DEG. Although this does not reach the International Pharmacopoeia's 0.1% detection threshold currently required for purity checks for human consumption, the method can still detect products in which DEG or EG has been wrongly used in place of PG or glycerol, or is present in large quantities. The technique could also be used for raw-material identification testing to ensure no mislabelling has occurred in pre-production stages, and as a screening device in distribution chains to detect major deviations from permitted content in non-diffusely-scattering, clear formulations, helping to prevent serious adverse outcomes such as acute renal failure and deaths.
Virion Structure
Picornaviruses were the first animal viruses whose structure was determined in atomic detail and, as of October 2009, the Protein Data Bank (PDB) registered 53 structure depositions for picornaviruses. These data have contributed significantly to the understanding of picornavirus evolution, assembly, host-cell interaction, host adaptation, and antigenic variation, and are providing the basis for novel therapeutic strategies. The general morphology of FMDV, subsequently classified as a picornavirus, could not be visualized until the advent of the electron microscope, when negatively stained images at a resolution of 4 to 5 nm revealed rather smooth, round particles of ∼30 nm diameter. The current classification of picornaviruses is based on genome and protein sequence properties, which derive from the interplay of the error-prone replication mechanism of the virus with the process of natural selection. Differences in physical properties, such as buoyant density in cesium chloride and pH stability, underpinned the early classification of picornaviruses. Virus capsids recognize susceptible cells by attachment to specific receptors on the host cell membrane, thereby determining the host range and tropism of infection. The majority of antibodies are weak neutralizers that appear to operate by using the two arms of the antibody to cross-link different virus particles, causing aggregation.