Mothers' involvement in providing care for their hospitalised sick newborns in Kenya: a focused ethnographic account.
Introduction
There is growing evidence that parental participation in the care of small and sick newborns benefits both babies and parents. While studies have investigated the roles that mothers play in newborn units in high-income contexts (HIC), there has been little exploration of how contextual factors interact to influence the ways in which mothers participate in the care of their small and sick newborn babies in severely resource-constrained settings, such as those found in many countries in sub-Saharan Africa.
Methods
Ethnographic methods (observations, informal conversations and formal interviews) were used to collect data during 627 h of fieldwork between March 2017 and August 2018 in the neonatal units of one government and one faith-based hospital in Kenya. Data were analysed using a modified grounded theory approach.
Results
There were marked differences between the hospitals in mothers' participation in the care of their sick newborn babies. The timing and types of caring tasks that the mothers undertook were shaped by the structural, economic and social context of the hospitals. In the resource-constrained government-funded hospital, the immediate informal and unplanned delegation of care to mothers was routine. In the faith-based hospital, mothers were initially separated from their babies and introduced to bathing and diaper-change tasks slowly under the close supervision of nurses. In both hospitals appropriate breastfeeding support was lacking, and the needs of the mothers were largely ignored.
Conclusion
In highly resource-constrained hospitals with low nurse-to-baby ratios, mothers are required to provide primary and some specialised care to their sick newborns with little information or support on how to undertake the necessary tasks. In better-resourced hospital settings, most caring tasks are initially performed by nurses, leaving mothers feeling powerless and worried about their capacity to care for their babies after discharge.
Interventions need to focus on how to better equip hospitals and nurses to support mothers in caring for their sick newborns, promoting family centred care.
Initial supplementary dose of dolutegravir in second-line antiretroviral therapy: a non-comparative, double-blind, randomised placebo-controlled trial.
Background
Dolutegravir concentrations are reduced by the induction effect of efavirenz, necessitating twice-daily dolutegravir dosing when the two are co-administered. Efavirenz induction persists for several weeks after stopping, which could potentially select for dolutegravir resistance if switching occurred with unsuppressed HIV-1 RNA levels and standard dolutegravir dosing. We evaluated the need for a lead-in supplementary dolutegravir dose in adults failing first-line tenofovir-emtricitabine-efavirenz (TEE).
Methods
We conducted a randomised, double-blind, placebo-controlled, phase 2 trial in Khayelitsha, South Africa. Eligible patients had virologic failure (two consecutive HIV-1 RNA ≥1000 copies/mL) on first-line TEE. Participants were randomly assigned (1:1) to switch to tenofovir-lamivudine-dolutegravir (TLD) with a supplementary 50 mg dolutegravir dose or placebo taken 12 hours later for 14 days. The primary outcome was the proportion with HIV-1 RNA <50 copies/mL at week 24. The study was not powered to compare arms.
Results
130 participants were randomised (65 to each arm). Median baseline HIV-1 RNA was 4.0 log10 copies/mL, and 76% had baseline resistance to both tenofovir and lamivudine. One participant died and two were lost to follow-up. At week 24, 55/64 (86%; 95% confidence interval [CI], 75-93%) in the supplementary dolutegravir arm and 53/65 (82%; 95% CI, 70-90%) in the placebo arm had HIV-1 RNA <50 copies/mL. Grade 3 or 4 adverse events were similar in frequency between arms. None of the six participants (three in each arm) eligible for resistance testing by 24 weeks developed dolutegravir resistance.
Conclusions
Our findings do not support the need for an initial dolutegravir dose adjustment in patients switching to TLD after failing first-line TEE.
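The arm-level results above are reported as proportions with binomial 95% confidence intervals. As a hedged illustration (the abstract does not name the trial's exact CI method, so figures may differ slightly from the published 75-93%), the Wilson score interval is one common way to reproduce such an interval:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion.
    One common choice of method; the trial's actual CI method is not
    stated in the abstract, so this is an assumption for illustration."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 55 of 64 participants suppressed in the supplementary dolutegravir arm
lo, hi = wilson_ci(55, 64)
print(f"{100 * 55 / 64:.0f}% (95% CI {lo:.1%} to {hi:.1%})")
```

Run on 55/64, this gives roughly 75% to 92%, close to the published interval; small discrepancies in the upper bound would be expected if the authors used a different method such as Clopper-Pearson.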
Type 2 inflammation and biological therapies in asthma: Targeted medicine taking flight
The field of asthma has undergone a dramatic change in recent years. Advances in our understanding of type 2 airway inflammation have driven the discovery of monoclonal antibodies targeting specific aspects of the immune pathway. In landmark trials, these drugs have shown efficacy in reducing asthma attacks and exposure to oral corticosteroids, both important causes of morbidity in people with asthma. Our review explores the key features of type 2 inflammation in asthma, summarizes the clinical trial evidence for the novel monoclonal antibody treatments, and considers future avenues for treatment.
Operationalisation of the Randomized Embedded Multifactorial Adaptive Platform for COVID-19 trials in a low and lower-middle income critical care learning health system.
The Randomized Embedded Multifactorial Adaptive Platform (REMAP-CAP) trial, adapted for COVID-19, is a global adaptive platform trial of hospitalised patients with COVID-19. We describe implementation in three countries under the umbrella of the Wellcome-supported low- and middle-income country (LMIC) critical care network: Collaboration for Research, Implementation and Training in Asia (CCA). The collaboration sought to overcome known barriers to multi-centre clinical trials in resource-limited settings. The methods described focus on six aspects of implementation: (i) strengthening an existing community of practice; (ii) remote study site recruitment, training and support; (iii) harmonising the REMAP-CAP COVID trial with existing care processes; (iv) embedding the REMAP-CAP COVID case report form into the existing CCA registry platform; (v) context-specific adaptation and data management; and (vi) alignment with existing pandemic and critical care research in the CCA. The methods described here may enable other LMIC sites to participate as equal partners in international critical care trials of urgent public health importance, both during this pandemic and beyond.
A national landscaping survey of critical care services in hospitals accredited for training in a lower-middle income country: Pakistan
Purpose
To describe the extent and variation of critical care services in Pakistan.
Materials and methods
A cross-sectional survey was conducted in all critical care units (CCUs) recognised for postgraduate training to determine administration, infrastructure, equipment, staffing, and training.
Results
There were 220 CCUs registered for training, providing 2166 CCU beds and 1473 ventilators. The regional distribution of CCU beds per 100,000 population ranged from 1.0 in Sindh to none in Gilgit Baltistan (median 0.7). A senior clinician trained in critical care was available in 19 (12.1%) of the units, giving a ratio of one trained intensivist for every 82 CCU beds and 0.009 trained intensivists per 100,000 population. A one-to-one nurse-to-bed ratio during the day was available in 84 (53.5%) of the units, dropping to 75 (47.8%) at night. Availability of 1:1 nursing also varied between provinces, ranging from 56.5% in Punjab to 0% in Azad Jammu Kashmir. All CCUs had basic infrastructure (electricity, running water, piped oxygen) and basic equipment (electronic monitoring and infusion pumps).
Conclusion
Pakistan, a lower middle-income country, has an established network of critical care facilities with access to basic equipment, but inequalities in their distribution. Investment in critical care training for doctors and nurses is needed.
Salmonella Combination Vaccines: Moving Beyond Typhoid
There is now a robust pipeline of licensed and World Health Organization (WHO)-prequalified typhoid conjugate vaccines with a steady progression of national introductions. However, typhoid fever is responsible for less than half the total global burden of Salmonella disease, and even less among children aged <5 years. Invasive nontyphoidal Salmonella disease is the dominant clinical presentation of Salmonella in Africa, and over a quarter of enteric fever in Asia is due to paratyphoid A. In this article, we explore the case for combination Salmonella vaccines, review the current pipeline of these vaccines, and discuss key considerations for their development, including geographies of use, age of administration, and pathways to licensure. While a trivalent typhoid/nontyphoidal Salmonella vaccine is attractive for Africa, and a bivalent enteric fever vaccine for Asia, a quadrivalent vaccine covering the 4 main disease-causing serovars of Salmonella enterica would provide a single vaccine option for global Salmonella coverage.
Facilitating Safe Discharge Through Predicting Disease Progression in Moderate Coronavirus Disease 2019 (COVID-19): A Prospective Cohort Study to Develop and Validate a Clinical Prediction Model in Resource-Limited Settings
Background
In locations where few people have received coronavirus disease 2019 (COVID-19) vaccines, health systems remain vulnerable to surges in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections. Tools to identify patients suitable for community-based management are urgently needed.
Methods
We prospectively recruited adults presenting to 2 hospitals in India with moderate symptoms of laboratory-confirmed COVID-19 to develop and validate a clinical prediction model to rule out progression to supplemental oxygen requirement. The primary outcome was defined as any of the following: SpO2 < 94%; respiratory rate > 30 breaths per minute; SpO2/FiO2 < 400; or death. We specified a priori that each model would contain 3 clinical parameters (age, sex, and SpO2) and 1 of 7 shortlisted biochemical biomarkers measurable using commercially available rapid tests (C-reactive protein [CRP], D-dimer, interleukin 6 [IL-6], neutrophil-to-lymphocyte ratio [NLR], procalcitonin [PCT], soluble triggering receptor expressed on myeloid cells-1 [sTREM-1], or soluble urokinase plasminogen activator receptor [suPAR]), to ensure the models would be suitable for resource-limited settings. We evaluated the discrimination, calibration, and clinical utility of the models in a held-out temporal external validation cohort.
Results
In total, 426 participants were recruited, of whom 89 (21.0%) met the primary outcome; 257 participants comprised the development cohort, and 166 comprised the validation cohort. The 3 models containing NLR, suPAR, or IL-6 demonstrated promising discrimination (c-statistics: 0.72–0.74) and calibration (calibration slopes: 1.01–1.05) in the validation cohort and provided greater utility than a model containing the clinical parameters alone.
Conclusions
We present 3 clinical prediction models that could help clinicians identify patients with moderate COVID-19 suitable for community-based management.
The models are readily implementable and of particular relevance for locations with limited resources.
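The discrimination metric reported for these models (c-statistics of 0.72-0.74) is the probability that a randomly chosen patient who met the outcome was assigned a higher predicted risk than a randomly chosen patient who did not. A minimal sketch of that computation, using invented toy data rather than the study's data:

```python
def c_statistic(risk, outcome):
    """Concordance (c-statistic / ROC AUC for binary outcomes):
    the probability that a randomly chosen case has a higher
    predicted risk than a randomly chosen non-case; ties count 0.5."""
    cases = [r for r, y in zip(risk, outcome) if y == 1]
    controls = [r for r, y in zip(risk, outcome) if y == 0]
    concordant = sum((c > k) + 0.5 * (c == k)
                     for c in cases for k in controls)
    return concordant / (len(cases) * len(controls))

# Invented predicted risks and outcomes for seven hypothetical patients
risk = [0.9, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
outcome = [1, 1, 0, 1, 0, 0, 0]
print(round(c_statistic(risk, outcome), 2))  # → 0.92
```

A value of 0.5 indicates no discrimination and 1.0 perfect ranking, so the published 0.72-0.74 sits in the range usually described as acceptable-to-promising.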
Assessment of precision in growth inhibition assay (GIA) using human anti-PfRH5 antibodies.
Background
For blood-stage malaria vaccine development, the in vitro growth inhibition assay (GIA) has been widely used to evaluate the functionality of vaccine-induced antibodies (Ab), and Plasmodium falciparum reticulocyte-binding protein homolog 5 (RH5) is a leading blood-stage antigen. However, precision, also called "error of assay" (EoA), in GIA readouts and the sources of EoA have not been evaluated systematically.
Methods
In the Main GIA experiment, 4 different cultures of P. falciparum 3D7 parasites were prepared with red blood cells (RBC) collected from 4 different donors. For each culture, 7 different anti-RH5 Ab (either monoclonal or polyclonal) were tested by GIA at two concentrations on three different days (168 data points). To evaluate sources of EoA in % inhibition in GIA (%GIA), a linear model was fitted with donor (source of RBC) and day of GIA as independent variables. In addition, 180 human anti-RH5 polyclonal Ab were tested in a Clinical GIA experiment, where each Ab was tested at multiple concentrations in at least 3 independent GIAs using different RBCs (5,093 data points). The standard deviation (sd) in %GIA and in GIA50 (the Ab concentration that gave 50%GIA) readouts, and the impact of repeat assays on the 95% confidence interval (95%CI) of these readouts, were estimated.
Results
The Main GIA experiment revealed that the RBC donor effect was much larger than the day effect, and an obvious donor effect was also observed in the Clinical GIA experiment. Both %GIA and log-transformed GIA50 data reasonably fit a constant-sd model, and the sd of %GIA and log-transformed GIA50 measurements were calculated as 7.54 and 0.206, respectively. Taking the average of three repeat assays (using three different RBCs) reduces the 95%CI width in %GIA or GIA50 measurements by about half compared to a single assay.
Conclusions
The RBC donor effect (donor-to-donor variance on the same day) in GIA was much larger than the day effect (day-to-day variance using the same donor's RBC), at least for the RH5 Ab evaluated in this study; thus, future GIA studies should consider the donor effect. In addition, the 95%CIs for %GIA and GIA50 shown here will help when comparing GIA results from different samples/groups/studies; therefore, this study supports future malaria blood-stage vaccine development.
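The claim that averaging three repeat assays roughly halves the 95%CI width follows directly from the constant-sd model: under a normal approximation, the CI half-width for the mean of n independent assays scales as 1/sqrt(n), and 1/sqrt(3) ≈ 0.58. A sketch using the sd of 7.54 %GIA reported for the constant-sd model (the normal approximation and z = 1.96 are our assumptions):

```python
import math

SD_PCT_GIA = 7.54  # sd of %GIA under the constant-sd model (from the study)

def ci_half_width(sd, n_assays, z=1.96):
    """Normal-approximation 95% CI half-width for the mean of
    n independent assays with a common standard deviation."""
    return z * sd / math.sqrt(n_assays)

single = ci_half_width(SD_PCT_GIA, 1)  # one assay
triple = ci_half_width(SD_PCT_GIA, 3)  # mean of three assays
print(round(single, 1), round(triple, 1), round(triple / single, 2))
```

The ratio is 1/sqrt(3) ≈ 0.58 regardless of the sd, which is what the abstract rounds to "about half".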
Delayed boosting improves human antigen-specific Ig and B cell responses to the RH5.1/AS01B malaria vaccine.
Modifications to vaccine delivery that increase serum antibody longevity are of great interest for maximizing efficacy. We have previously shown that a delayed fractional (DFx) dosing schedule (0-1-6 month) - using AS01B-adjuvanted RH5.1 malaria antigen - substantially improves serum IgG durability as compared with monthly dosing (0-1-2 month; NCT02927145). However, the underlying mechanism and whether there are wider immunological changes with DFx dosing were unclear. Here, PfRH5-specific Ig and B cell responses were analyzed in depth through standardized ELISAs, flow cytometry, systems serology, and single-cell RNA-Seq (scRNA-Seq). Data indicate that DFx dosing increases the magnitude and durability of circulating PfRH5-specific B cells and serum IgG1. At the peak antibody magnitude, DFx dosing was distinguished by a systems serology feature set comprising increased FcRn binding, IgG avidity, and proportion of G2B and G2S2F IgG Fc glycans, alongside decreased IgG3, antibody-dependent complement deposition, and proportion of G1S1F IgG Fc glycan. Concomitantly, scRNA-Seq data show a higher CDR3 percentage of mutation from germline and decreased plasma cell gene expression in circulating PfRH5-specific B cells. Our data, therefore, reveal a profound impact of DFx dosing on the humoral response and suggest plausible mechanisms that could enhance antibody longevity, including improved FcRn binding by serum Ig and a potential shift in the underlying cellular response from circulating short-lived plasma cells to nonperipheral long-lived plasma cells.
The challenges of Plasmodium vivax human malaria infection models for vaccine development.
Controlled Human Malaria Infection models (CHMI) have been critical to advancing new vaccines for malaria. Stringent and safe preparation of a challenge agent is key to the success of any CHMI. Difficulty producing the Plasmodium vivax parasite in vitro has limited production of qualified parasites for CHMI as well as the functional assays required to screen and down-select candidate vaccines for this globally distributed parasite. This and other challenges to P. vivax CHMI (PvCHMI), including scientific, logistical, and ethical obstacles, are common to P. vivax research conducted in both non-endemic and endemic countries, with additional hurdles unique to each. The challenges of using CHMI for P. vivax vaccine development and evaluation, lessons learned from previous and ongoing clinical trials, and the way forward to effectively perform PvCHMI to support vaccine development, are discussed.
RH5.1-CyRPA-Ripr antigen combination vaccine shows little improvement over RH5.1 in a preclinical setting.
Background
RH5 is the leading vaccine candidate for the Plasmodium falciparum blood stage. RH5 binds Ripr and CyRPA at the apical end of the invasive merozoite form, and this complex, designated RCR, is essential for entry into human erythrocytes. RH5 has advanced to human clinical trials, where its impact on parasite growth in the blood was encouraging but modest. This study assessed the potential of a protein-in-adjuvant blood-stage malaria vaccine based on a combination of RH5, Ripr and CyRPA to provide improved neutralizing activity against P. falciparum in vitro.
Methods
Mice were immunized with the individual RCR antigens to down-select the best-performing adjuvant formulation, and rats were immunized with the individual RCR antigens to select the correct antigen dose. A second cohort of rats was immunized with single, double and triple antigen combinations to assess immunogenicity and parasite-neutralizing activity in growth inhibition assays.
Results
The DPX® platform was identified as the formulation best able to potentiate P. falciparum inhibitory antibody responses to these antigens. The three antigens derived from the RH5, Ripr and CyRPA proteins, formulated with DPX, induced highly inhibitory parasite-neutralizing antibodies. Notably, RH5, either as a single antigen or in combination with Ripr and/or CyRPA, induced inhibitory antibodies that outperformed CyRPA and Ripr.
Conclusion
An RCR combination vaccine may not induce substantially improved protective immunity compared with RH5 as a single immunogen in a clinical setting, leaving the development pathway open for other antigens to be combined with RH5 in a next-generation malaria vaccine.
Understanding the role of serological and clinical data in assessing the dynamics of malaria transmission: a case study of Bagamoyo district, Tanzania.
Introduction
Naturally acquired blood-stage malaria antibodies and malaria clinical data have been reported to be useful in monitoring changes in malaria over time and as markers of malaria exposure. This study assessed total immunoglobulin G (IgG) levels to Plasmodium falciparum schizont among infants (5-17 months), estimated malaria incidence using routine health facility-based surveillance data, and examined the relation between trends in anti-schizont antibodies and malaria incidence in Bagamoyo.
Methods
252 serum samples were used for assessment of total IgG by enzyme-linked immunosorbent assay, with results expressed in arbitrary units (AU). 147/252 samples were collected in 2021 during a blood-stage malaria vaccine trial [ClinicalTrials.gov NCT04318002], and 105/252 were archived samples from a malaria vaccine trial conducted in 2012 [ClinicalTrials.gov NCT00866619]. Malaria incidence was calculated from outpatient clinic data on positive malaria rapid tests or blood smears retrieved from the District Health Information Software 2 (DHIS2) between 2013 and 2020. Cross-sectional data from both studies were analysed using STATA version 14.
Results
This study demonstrated a decline in total anti-schizont IgG levels from 490.21 AU in 2012 to 97.07 AU in 2021, which paralleled a fall in incidence from 58.25 cases/1000 person-years in 2013 to 14.28 cases/1000 person-years in 2020. We also observed a significant difference in incidence when comparing high and low malaria transmission areas and by gender. However, we did not observe differences when comparing total anti-schizont antibodies by gender and study year.
Conclusion
Total anti-schizont antibody levels appear to be an important serological marker of exposure for assessing the dynamics of malaria transmission in infants living in malaria-endemic regions.
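The incidence figures above use the standard person-time definition: positive cases divided by person-years of observation, scaled to 1,000. A toy illustration of that arithmetic, with invented numbers rather than the Bagamoyo surveillance data:

```python
def incidence_per_1000_py(cases, person_years):
    """Incidence rate expressed as cases per 1,000 person-years."""
    return 1000 * cases / person_years

# e.g. 583 test-positive cases recorded over 10,000 person-years
# of catchment-population follow-up (hypothetical figures)
print(incidence_per_1000_py(583, 10_000))  # → 58.3
```

Using person-years rather than raw head counts lets incidence be compared across years in which the observed population or follow-up time differed.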
COVID-19 vaccination, risk-compensatory behaviours, and contacts in the UK.
The physiological effects of vaccination against SARS-CoV-2 (COVID-19) are well documented, yet the behavioural effects are not well known. Risk compensation suggests that gains in personal safety, as a result of vaccination, are offset by increases in risky behaviour, such as socialising, commuting and working outside the home. This is potentially important because transmission of SARS-CoV-2 is driven by contacts, which could be amplified by vaccine-related risk compensation. Here, we show that behaviours were overall unrelated to personal vaccination, but, adjusting for variation in mitigation policies, were responsive to the level of vaccination in the wider population: individuals in the UK were risk compensating when rates of vaccination were rising. This effect was observed across the four nations of the UK, each of which varied its policies autonomously.
Exploring the complex realities of nursing work in Kenya and how this shapes role enactment and practice-A qualitative study.
Aim
We explore how nurses navigate competing work demands in resource-constrained settings and how this shapes the enactment of nursing roles.
Design
An exploratory-descriptive qualitative study.
Methods
Using individual in-depth interviews and small group interviews, we interviewed 47 purposively selected nurses and nurse managers. We also conducted 57 hours of non-participant structured observations of nursing work in three public hospitals.
Results
Three major themes arose: (i) Rationalization of prioritization decisions, where nurses described prioritizing technical nursing tasks over routine bedside care, devising their own 'working standards' of care, and informally delegating tasks to cope with work demands. (ii) Bundling of tasks, describing how nurses were sometimes engaged in tasks seen as outside their scope of work or were used to cover for shortages in other professions. (iii) Pursuit of professional ideals, describing how the reality of nursing practice contrasted with nurses' quest for professionalism.
Evaluation of perturbed iron-homeostasis in a prospective cohort of patients with COVID-19.
Background: Marked reductions in serum iron concentrations are commonly induced during the acute phase of infection. This phenomenon, termed hypoferremia of inflammation, leads to inflammatory anaemia but could also have broader pathophysiological implications. In patients with coronavirus disease 2019 (COVID-19), hypoferremia is associated with disease severity and poorer outcomes, although there are few reported cohorts. Methods: In this study, we leverage a well-characterised prospective cohort of hospitalised COVID-19 patients and perform a set of analyses focussing on iron and related biomarkers in relation to both acute severity of COVID-19 and longer-term symptomatology. Results: We observed no associations between acute serum iron and long-term outcomes (including fatigue, breathlessness or quality of life); however, lower haemoglobin was associated with poorer quality of life. We also quantified iron homeostasis-associated parameters, demonstrating that among 50 circulating mediators of inflammation, IL-6 concentrations were strongly associated with serum iron, consistent with its central role in the inflammatory control of iron homeostasis. Surprisingly, we observed no association between serum hepcidin and serum iron concentrations. We also observed elevated erythroferrone concentrations in COVID-19 patients with anaemia of inflammation. Conclusions: These results enhance our understanding of the regulation and pathophysiological consequences of disturbed iron homeostasis during SARS-CoV-2 infection.
Ongoing Strategies to Improve Antimicrobial Utilization in Hospitals across the Middle East and North Africa (MENA): Findings and Implications.
Antimicrobial resistance (AMR) is a growing global concern, increasing costs, morbidity, and mortality. National action plans (NAPs) to minimize AMR are one of several global and national initiatives to slow rising AMR rates. NAPs also help key stakeholders understand current antimicrobial utilization patterns and resistance rates. The Middle East is no exception, with high AMR rates. Antibiotic point prevalence surveys (PPS) provide a better understanding of existing antimicrobial consumption trends in hospitals and assist with the subsequent implementation of antimicrobial stewardship programs (ASPs); both are important NAP activities. We examined current hospital consumption trends across the Middle East along with documented ASPs. A narrative assessment of 24 PPS studies in the region found that, on average, more than 50% of in-patients received antibiotics, with Jordan having the highest rate at 98.1%. Published studies ranged in size from a single hospital to 18 hospitals. The most prescribed antibiotics were ceftriaxone, metronidazole, and penicillin. In addition, prolonged postoperative antibiotic prescribing, lasting up to five days or longer to avoid surgical site infections, was common. These findings have prompted a variety of suggested short-, medium-, and long-term actions among key stakeholders, including governments and healthcare workers, to improve and sustain future antibiotic prescribing and thereby decrease AMR throughout the Middle East.
Neglected tropical diseases in Republic of Guinea: disease endemicity, case burden and the road towards the 2030 target
Neglected tropical diseases (NTDs) predominantly affect vulnerable and marginalized populations in tropical and subtropical areas and affect more than one billion people globally. In Guinea, the burden of NTDs is estimated to be >7.5 disability-adjusted life years per million inhabitants. The current Guinea NTD master plan (2017–2020) identifies eight diseases as public health problems: onchocerciasis, lymphatic filariasis, trachoma, schistosomiasis and soil-transmitted helminthiasis, leprosy, human African trypanosomiasis and Buruli ulcer. In this review we discuss the past and current case burden of the priority NTDs in Guinea, highlight the major milestones and discuss current and future areas of focus for achieving the 2030 target outlined by the World Health Organization.
Community engagement for malaria elimination in the Greater Mekong Sub-region: a qualitative study among malaria researchers and policymakers.
Background
Community engagement has received increasing attention in malaria research and programme interventions, particularly as countries aim for malaria elimination. Although community engagement strategies and activities are constantly developing, little is known about how those who implement research or programmes view community engagement. This article explores the perspectives of researchers and policymakers in the Greater Mekong Sub-region (GMS) on community engagement for malaria control and elimination.
Methods
Semi-structured interviews were conducted with 17 policymakers and 15 senior researchers working in the field of malaria. All interviews were audio-recorded and transcribed in English. Transcribed data were analysed using deductive and inductive approaches in QSR NVivo, and themes and sub-themes were generated.
Results
Researchers and policymakers emphasized the importance of community engagement in promoting participation in malaria research and interventions. Building trust with the community was seen as crucial. Respondents emphasized involving authority and leadership structures and highlighted the need for intense, participatory engagement. Geographic remoteness and social, cultural, and linguistic diversity were identified as barriers to meaningful engagement. Local staff were described as an essential 'connect' between researchers or policymakers and prospective participants. Sharing information with community members through various strategies, including creative and participatory methods, was highlighted.
Conclusions
Policymakers and researchers involved in malaria prevention and control in the GMS viewed community engagement as crucial for promoting participation in research and programmatic interventions. Given the difficulties of the 'last mile' to elimination, sustained investment in community engagement is needed in isolated areas of the GMS where malaria transmission continues.
Involving community-based malaria workers is ever more critical to ensure that elimination efforts engage hard-to-reach populations in remote areas of the GMS.
Artemisinin and multidrug-resistant Plasmodium falciparum - a threat for malaria control and elimination.
Purpose of review
Artemisinin-based combination therapies (ACTs) are globally the first-line treatment for uncomplicated falciparum malaria, and new compounds will not be available within the next few years. Artemisinin-resistant Plasmodium falciparum emerged over a decade ago in the Greater Mekong Subregion (GMS) and, compounded by ACT partner drug resistance, has caused significant ACT treatment failure. This review provides an update on the epidemiology and mechanisms of artemisinin resistance and on approaches to counter multidrug-resistant falciparum malaria.
Recent findings
An aggressive malaria elimination programme in the GMS has helped prevent the spread of drug resistance to neighbouring countries. However, parasites carrying artemisinin resistance-associated mutations in the P. falciparum Kelch13 gene (pfk13) have now emerged independently in multiple locations elsewhere in Asia, Africa and South America. Notably, artemisinin-resistant infections with parasites carrying the pfk13 R561H mutation have emerged and spread in Rwanda.
Summary
Enhancing the geographic coverage of surveillance for resistance will be key to ensuring prompt detection of emerging resistance so that effective countermeasures can be implemented without delay. Treatment strategies designed to prevent the emergence and spread of multidrug resistance must be considered, including the deployment of triple drug combination therapies and multiple first-line therapies.