(Ann) Sarah Walker

Research Area: Bioinformatics & Stats (inc. Modelling and Computational Biology)
Technology Exchange: Medical statistics
Scientific Themes: Clinical Trials & Epidemiology and Immunology & Infectious Disease

Sarah has worked with the Crook group since April 2006, originally on secondment from the MRC Clinical Trials Unit, and part-time with the University of Oxford since December 2011. Her work includes the design and analysis of studies investigating the epidemiology and management of infectious diseases (including healthcare-associated infections) and antimicrobial resistance, with a particular focus on 'big data' from routinely collected electronic health records. With Professors Crook and (Tim) Peto, she co-leads the “Modernising Medical Microbiology” Consortium, which translates new whole-genome sequencing and informatics approaches into microbiology practice and service.

She was instrumental in obtaining ethical and regulatory approvals for a large anonymised linked database of hospital admissions and microbiology/laboratory data (Infections in Oxfordshire Research Database, IORD), and now leads analyses investigating aspects of epidemiology and management of infectious diseases in IORD. She has also made substantial contributions to the design and analyses of a programme of retrospective and evaluative prospective studies designed to move whole genome pathogen sequencing into routine clinical practice, particularly for Clostridium difficile (NEJM 2013), Staphylococcus aureus (CID 2013) and Mycobacterium tuberculosis (Lancet Respiratory Medicine 2012/2013); and also linking large-scale molecular and traditional epidemiology (e.g. for C. difficile (LID 2017) and Escherichia coli (LID 2018)).

Sarah is also an NIHR Senior Investigator, Fellow of the Academy of Medical Sciences and Associate Statistical Editor for the Journal of Infectious Diseases.

Name | Department | Institution | Country
Derrick Crook | Experimental Medicine Division | Oxford University, John Radcliffe Hospital | United Kingdom
Tim Peto | Experimental Medicine Division | Oxford University, John Radcliffe Hospital | United Kingdom
Daniel Wilson | Experimental Medicine Division | Oxford University, John Radcliffe Hospital | United Kingdom
Zamin Iqbal | Wellcome Trust Centre for Human Genetics | Oxford University, Henry Wellcome Building of Genomic Medicine | United Kingdom
Guy Thwaites | Tropical Medicine | Oxford University, Ho Chi Minh City | Vietnam
Philip Bejon | Tropical Medicine | Oxford University, Kilifi | Kenya
Nicole Stoesser (DPhil student) | Experimental Medicine Division | Oxford University, John Radcliffe Hospital | United Kingdom
Professor Mark Wilcox | | Leeds University | United Kingdom
Dr Martin Llewelyn | | Brighton and Sussex Medical School | United Kingdom
Dr Grace Smith | | PHE West Midlands Public Health Laboratory, Birmingham | United Kingdom
Professor Mike Sharland | | St George's University of London | United Kingdom
Professor Lucy Yardley | | University of Southampton | United Kingdom
Professor Chris Butler | | Department of Primary Care and Health Sciences, Oxford | United Kingdom
Dr Sarah Wordsworth | Health Economics Research Centre | University of Oxford | United Kingdom
David Wyllie | Jenner Institute | Oxford University, Henry Wellcome Building for Molecular Physiology | United Kingdom
Professor Mark Bailey | | Centre for Ecology and Hydrology | United Kingdom
Dr Muna Anjum | | Animal and Plant Health Agency, Weybridge | United Kingdom
Professor Robert Sebra | | Mount Sinai Hospital | United States
Professor Angela Kearns | | Public Health England, Colindale | United Kingdom
Professor Alan Johnson | | Public Health England, Colindale | United Kingdom
Professor Neil Woodford | | Public Health England, Colindale | United Kingdom
Dr Julie Robotham | | Public Health England, Colindale | United Kingdom
Professor Jim Davies | Department of Computer Science | Oxford University | United Kingdom
Professor Alison Holmes | | Imperial College London | United Kingdom
Professor Ajit Lalvani | Tuberculosis Research Unit, National Heart and Lung Institute | Imperial College London | United Kingdom
Professor David Clifton | Department of Engineering Science | University of Oxford | United Kingdom
Professor David Moore | | London School of Hygiene and Tropical Medicine | United Kingdom
Professor Daniela Maria Cirillo | | Ospedale San Raffaele Scientific Institute, Milan | Italy
Professor Guangxue He | | Chinese Center for Disease Control and Prevention, Beijing | China
Dr Camilla Rodrigues | | Hinduja Hospital, Mumbai | India
Dr Nazir Ismail | | National Institute for Communicable Diseases, Johannesburg | South Africa
Dr James Posey | | Centers for Disease Control and Prevention, Atlanta | United States
Dr Nerges Mistry | | Foundation for Medical Research, Mumbai | India
Professor Stefan Niemann | | National Reference Centre for Mycobacteria, Borstel | Germany
Najib Rahman | Experimental Medicine Division | Oxford University, Churchill Hospital | United Kingdom
Dr Anna Seale | Tropical Medicine | Oxford University, Kilifi | Kenya
James Berkley MBBS MTropMed MRCP MD FRCPCH FMedSci | Tropical Medicine | Oxford University, Kilifi | Kenya
Philip Fowler | Experimental Medicine Division | Oxford University, John Radcliffe Hospital | United Kingdom
Ronald Geskus | Tropical Medicine | Oxford University, Ho Chi Minh City | Vietnam
Jeremy Day | Tropical Medicine | Oxford University, Ho Chi Minh City | Vietnam
Nicholas Day MA BM BCh DM FRCP FMedSci | Tropical Medicine | Oxford University, Bangkok | Thailand
Nicholas White FRS | Tropical Medicine | Oxford University, Bangkok | Thailand
Ross LL, Walker AS, Lou Y, Tenorio AR, Gibb DM, Double J, Gilks C, McCoig CC, Munderi P, Musoro G et al. 2019. Changes over time in creatinine clearance and comparison of emergent adverse events for HIV-positive adults receiving standard doses (300 mg/day) of lamivudine-containing antiretroviral therapy with baseline creatinine clearance of 30-49 vs ≥50 mL/min. PLoS One, 14 (11), pp. e0225199.

A retrospective analysis of the randomized controlled DART (Development of AntiRetroviral Therapy in Africa; ISRCTN13968779) trial in HIV-1-positive adults initiating antiretroviral therapy with co-formulated zidovudine/lamivudine plus either tenofovir, abacavir, or nevirapine was conducted to evaluate the safety of initiating standard lamivudine dosing in patients with impaired creatinine clearance (CLcr). Safety data collected through 96 weeks were analyzed after stratification by baseline CLcr (estimated using Cockcroft-Gault) of 30-49 mL/min (n = 168) versus ≥50 mL/min (n = 3,132) and treatment regimen. The Grade 3-4 adverse events (AEs) and serious AEs (for hematological, hepatic and gastrointestinal events), maximal toxicities for liver enzymes, serum creatinine and bilirubin and maximum treatment-emergent hematology toxicities were comparable for groups with baseline CLcr 30-49 versus CLcr ≥50 mL/min. No new risks or trends were identified from this dataset. Substantial and similar increases in the mean creatinine clearance (>25 mL/min) were observed from baseline through Week 96 among participants who entered the trial with CLcr 30-49 mL/min, while no increase or smaller median changes in creatinine clearance (<7 mL/min) were observed for participants who entered the trial with CLcr ≥50 mL/min. Substantial increases (>150 cells/mm³) in mean CD4+ cell counts from baseline to Week 96 were also observed for participants who entered the trial with CLcr 30-49 mL/min and those with baseline CLcr ≥50 mL/min. Though these results are descriptive, they suggest that HIV-positive patients with CLcr of 30-49 mL/min would have similar AE risks in comparison to patients with CLcr ≥50 mL/min when initiating antiretroviral therapy delivering doses of 300 mg of lamivudine daily through 96 weeks of treatment. Overall improvements in CLcr were observed for patients with baseline CLcr 30-49 mL/min.
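The stratification above relies on the Cockcroft-Gault estimate of creatinine clearance. As a quick reference, the standard formula can be sketched as follows (function and argument names are illustrative, not from the study):

```python
def cockcroft_gault_clcr(age_years: float, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    """Cockcroft-Gault estimate of creatinine clearance (mL/min).

    CLcr = (140 - age) x weight / (72 x serum creatinine in mg/dL),
    multiplied by 0.85 for women.
    """
    clcr = (140 - age_years) * weight_kg / (72 * serum_creatinine_mg_dl)
    return clcr * 0.85 if female else clcr
```

For example, a 50-year-old man weighing 72 kg with serum creatinine 1.0 mg/dL has an estimated CLcr of 90 mL/min, placing him in the ≥50 mL/min stratum.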

Peto L, Fawcett NJ, Crook DW, Peto TEA, Llewelyn MJ, Walker AS. 2019. Selective culture enrichment and sequencing of feces to enhance detection of antimicrobial resistance genes in third-generation cephalosporin resistant Enterobacteriaceae. PLoS One, 14 (11), pp. e0222831.

Metagenomic sequencing of fecal DNA can usefully characterise an individual's intestinal resistome but is limited by its inability to detect important pathogens that may be present at low abundance, such as carbapenemase or extended-spectrum beta-lactamase producing Enterobacteriaceae. Here we aimed to develop a hybrid protocol to improve detection of resistance genes in Enterobacteriaceae by using a short period of culture enrichment prior to sequencing of DNA extracted directly from the enriched sample. Volunteer feces were spiked with carbapenemase-producing Enterobacteriaceae and incubated in selective broth culture for 6 hours before sequencing. Different DNA extraction methods were compared, including a plasmid extraction protocol to increase the detection of plasmid-associated resistance genes. Although enrichment prior to sequencing increased the detection of carbapenemase genes, the differing growth characteristics of the spike organisms precluded accurate quantification of their concentration prior to culture. Plasmid extraction increased detection of resistance genes present on plasmids, but the effects were heterogeneous and dependent on plasmid size. Our results demonstrate methods of improving the limit of detection of selected resistance mechanisms in a fecal resistome assay, but they also highlight the difficulties in using these techniques for accurate quantification and should inform future efforts to achieve this goal.

Lewandowski K, Xu Y, Pullan ST, Lumley SF, Foster D, Sanderson N, Vaughan A, Morgan M, Bright N, Kavanagh J et al. 2019. Metagenomic Nanopore Sequencing of Influenza Virus Direct from Clinical Respiratory Samples. J Clin Microbiol, 58 (1).

Influenza is a major global public health threat as a result of its highly pathogenic variants, large zoonotic reservoir, and pandemic potential. Metagenomic viral sequencing offers the potential for a diagnostic test for influenza virus which also provides insights on transmission, evolution, and drug resistance and simultaneously detects other viruses. We therefore set out to apply the Oxford Nanopore Technologies sequencing method to metagenomic sequencing of respiratory samples. We generated influenza virus reads down to a limit of detection of 10² to 10³ genome copies/ml in pooled samples, observing a strong relationship between the viral titer and the proportion of influenza virus reads (P = 4.7 × 10⁻⁵). Applying our methods to clinical throat swabs, we generated influenza virus reads for 27/27 samples with mid-to-high viral titers (cycle threshold [CT] values <30) and 6/13 samples with low viral titers (CT values 30 to 40). No false-positive reads were generated from 10 influenza virus-negative samples. Thus, Nanopore sequencing operated with 83% sensitivity (95% confidence interval [CI], 67 to 93%) and 100% specificity (95% CI, 69 to 100%) compared to the current diagnostic standard. Coverage of full-length virus was dependent on sample composition, being negatively influenced by increased host and bacterial reads. However, at high influenza virus titers, we were able to reconstruct >99% complete sequences for all eight gene segments. We also detected a human coronavirus coinfection in one clinical sample. While further optimization is required to improve sensitivity, this approach shows promise for the Nanopore platform to be used in the diagnosis and genetic analysis of influenza virus and other respiratory viruses.
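The sensitivity and specificity quoted above are simple proportions of confusion-matrix counts. A minimal sketch (the point estimates only; confidence intervals need an additional method such as Wilson's):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```

Using the abstract's counts, 33 true positives (27 + 6 detected), 7 false negatives, 10 true negatives and 0 false positives give sensitivity 0.825 (~83%) and specificity 1.0 (100%).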

Eyre DW, Peto TEA, Crook DW, Walker AS, Wilcox MH. 2019. Hash-Based Core Genome Multilocus Sequence Typing for Clostridium difficile. J Clin Microbiol, 58 (1).

Pathogen whole-genome sequencing has huge potential as a tool to better understand infection transmission. However, rapidly identifying closely related genomes among a background of thousands of other genomes is challenging. Here, we describe a refinement to core genome multilocus sequence typing (cgMLST) in which alleles at each gene are reproducibly converted to a unique hash, or short string of letters (hash-cgMLST). This avoids the resource-intensive need for a single centralized database of sequentially numbered alleles. We test the reproducibility and discriminatory power of cgMLST/hash-cgMLST compared to those of mapping-based approaches in Clostridium difficile, using repeated sequencing of the same isolates (replicates) and data from consecutive infection isolates from six English hospitals. Hash-cgMLST provided the same results as standard cgMLST, with minimal performance penalty. Comparing 272 replicate sequence pairs using reference-based mapping, there were 0, 1, or 2 single-nucleotide polymorphisms (SNPs) between 262 (96%), 5 (2%), and 1 (<1%) of the pairs, respectively. Using hash-cgMLST, 218 (80%) of replicate pairs assembled with SPAdes had zero gene differences, and 31 (11%), 5 (2%), and 18 (7%) pairs had 1, 2, and >2 differences, respectively. False gene differences were clustered in specific genes and associated with fragmented assemblies, but were reduced using the SKESA assembler. Considering 412 pairs of infections with ≤2 SNPs, i.e., consistent with recent transmission, 376 (91%) had ≤2 gene differences and 16 (4%) had ≥4. Comparing a genome to 100,000 others took <1 min using hash-cgMLST. Hash-cgMLST is an effective surveillance tool for rapidly identifying clusters of related genomes. However, cgMLST/hash-cgMLST generate more false variants than mapping-based approaches. Follow-up mapping-based analyses are likely required to precisely define close genetic relationships.
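The core idea of hash-cgMLST is that hashing an allele's sequence yields the same identifier in any lab, so profiles can be compared without a central allele-numbering database. A minimal sketch of the concept (the published implementation may use a different hash function and truncation):

```python
import hashlib

def allele_hash(sequence: str, length: int = 16) -> str:
    """Convert an allele's nucleotide sequence to a short, reproducible
    hash string. Identical sequences always hash identically, so no
    centrally assigned allele numbers are needed."""
    return hashlib.md5(sequence.upper().encode()).hexdigest()[:length]

def gene_differences(profile_a: dict, profile_b: dict) -> int:
    """Count genes whose allele hashes differ between two cgMLST
    profiles, considering only genes present in both profiles."""
    shared = profile_a.keys() & profile_b.keys()
    return sum(profile_a[g] != profile_b[g] for g in shared)
```

Two genomes are then compared by counting differing hashes across the core genes, which is why comparing one genome against 100,000 others is fast: it is a set of string comparisons, not alignments.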

Gweon HS, Shaw LP, Swann J, De Maio N, Abuoun M, Niehus R, Hubbard ATM, Bowes MJ, Bailey MJ, Peto TEA et al. 2019. The impact of sequencing depth on the inferred taxonomic composition and AMR gene content of metagenomic samples. Environmental Microbiome, 14 (1).

Background: Shotgun metagenomics is increasingly used to characterise microbial communities, particularly for the investigation of antimicrobial resistance (AMR) in different animal and environmental contexts. There are many different approaches for inferring the taxonomic composition and AMR gene content of complex community samples from shotgun metagenomic data, but there has been little work establishing the optimum sequencing depth, data processing and analysis methods for these samples. In this study we used shotgun metagenomics and sequencing of cultured isolates from the same samples to address these issues. We sampled three potential environmental AMR gene reservoirs (pig caeca, river sediment, effluent) and sequenced samples with shotgun metagenomics at high depth (~200 million reads per sample). Alongside this, we cultured single-colony isolates of Enterobacteriaceae from the same samples and used hybrid sequencing (short- and long-reads) to create high-quality assemblies for comparison to the metagenomic data. To automate data processing, we developed an open-source software pipeline, 'ResPipe'. Results: Taxonomic profiling was much more stable to sequencing depth than AMR gene content. 1 million reads per sample was sufficient to achieve <1% dissimilarity to the full taxonomic composition. However, at least 80 million reads per sample were required to recover the full richness of different AMR gene families present in the sample, and additional allelic diversity of AMR genes was still being discovered in effluent at 200 million reads per sample. Normalising the number of reads mapping to AMR genes using gene length and an exogenous spike of Thermus thermophilus DNA substantially changed the estimated gene abundance distributions. While the majority of genomic content from cultured isolates from effluent was recoverable using shotgun metagenomics, this was not the case for pig caeca or river sediment.
Conclusions: Sequencing depth and profiling method can critically affect the profiling of polymicrobial animal and environmental samples with shotgun metagenomics. Both sequencing of cultured isolates and shotgun metagenomics can recover substantial diversity that is not identified using the other methods. Particular consideration is required when inferring AMR gene content or presence by mapping metagenomic reads to a database. ResPipe, the open-source software pipeline we have developed, is freely available (https://gitlab.com/hsgweon/ResPipe).
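The normalisation described above combines two corrections: dividing read counts by target length (longer genes accrue more reads) and by the exogenous Thermus thermophilus spike signal (to correct for between-sample depth differences). A minimal sketch of that idea, assuming a simple density-ratio form (ResPipe's exact calculation may differ):

```python
def normalised_abundance(gene_reads: int, gene_length_bp: int,
                         spike_reads: int, spike_length_bp: int) -> float:
    """Length- and spike-normalised abundance for an AMR gene.

    Read density (reads per bp) of the gene is divided by the read
    density of the exogenous spike, giving an abundance comparable
    across genes of different lengths and samples of different depths.
    """
    gene_density = gene_reads / gene_length_bp
    spike_density = spike_reads / spike_length_bp
    return gene_density / spike_density
```

For instance, a gene attracting 100 reads over 1,000 bp in a sample where the spike attracted 50 reads over 500 bp has a normalised abundance of 1.0, i.e. the same read density as the spike.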

Maitland K, Olupot-Olupot P, Kiguli S, Chagaluka G, Alaroker F, Opoka RO, Mpoya A, Walsh K, Engoru C, Nteziyaremye J et al. 2019. Co-trimoxazole or multivitamin multimineral supplement for post-discharge outcomes after severe anaemia in African children: a randomised controlled trial. Lancet Glob Health, 7 (10), pp. e1435-e1447.

BACKGROUND: Severe anaemia is a leading cause of paediatric admission to hospital in Africa; post-discharge outcomes remain poor, with high 6-month mortality (8%) and re-admission (17%). We aimed to investigate post-discharge interventions that might improve outcomes. METHODS: Within the two-stratum, open-label, multicentre, factorial randomised TRACT trial, children aged 2 months to 12 years with severe anaemia, defined as haemoglobin of less than 6 g/dL, at admission to one of four hospitals (three in Uganda, one in Malawi) were randomly assigned, using sequentially numbered envelopes linked to a second non-sequentially numbered set of allocations stratified by centre and severity, to enhanced nutritional supplementation with iron and folate-containing multivitamin multimineral supplements versus iron and folate alone at treatment doses (usual care), and to co-trimoxazole versus no co-trimoxazole. All interventions were administered orally and were given for 3 months after discharge from hospital. Separately reported randomisations investigated transfusion management. The primary outcome was 180-day mortality. All analyses were done in the intention-to-treat population; follow-up was 180 days. This trial is registered with the International Standard Randomised Controlled Trial registry, ISRCTN84086586, and follow-up is complete. FINDINGS: From Sept 17, 2014, to May 15, 2017, 3983 eligible children were randomly assigned to treatment, and followed up for 180 days. 164 (4%) were lost to follow-up. 1901 (95%) of 1997 assigned multivitamin multimineral supplement, 1911 (96%) of 1986 assigned iron and folate, and 1922 (96%) of 1994 assigned co-trimoxazole started treatment.
By day 180, 166 (8%) children in the multivitamin multimineral supplement group versus 169 (9%) children in the iron and folate group had died (hazard ratio [HR] 0·97, 95% CI 0·79-1·21; p=0·81) and 172 (9%) who received co-trimoxazole versus 163 (8%) who did not receive co-trimoxazole had died (HR 1·07, 95% CI 0·86-1·32; p=0·56). We found no evidence of interactions between these randomisations or with transfusion randomisations (p>0·2). By day 180, 489 (24%) children in the multivitamin multimineral supplement group versus 509 (26%) children in the iron and folate group (HR 0·95, 95% CI 0·84-1·07; p=0·40), and 500 (25%) children in the co-trimoxazole group versus 498 (25%) children in the no co-trimoxazole group (1·01, 0·89-1·15; p=0·85) had had one or more serious adverse events. Most serious adverse events were re-admissions, occurring in 692 (17%) children (175 [4%] with at least two re-admissions). INTERPRETATION: Neither enhanced supplementation with multivitamin multimineral supplements (versus iron and folate treatment) nor co-trimoxazole prophylaxis improved 6-month survival. High rates of hospital re-admission suggest that novel interventions are urgently required for severe anaemia, given the burden it places on overstretched health services in Africa. FUNDING: Medical Research Council and Department for International Development.

Maitland K, Olupot-Olupot P, Walker AS. 2019. Transfusion Timing and Volume in African Children with Severe Anemia. Reply. N Engl J Med, 381 (17), pp. 1687-1688.

Fawcett N, Young B, Peto L, Quan TP, Gillott R, Wu J, Middlemass C, Weston S, Crook DW, Peto TEA et al. 2019. 'Caveat emptor': the cautionary tale of endocarditis and the potential pitfalls of clinical coding data-an electronic health records study. BMC Med, 17 (1), pp. 169.

BACKGROUND: Diagnostic codes from electronic health records are widely used to assess patterns of disease. Infective endocarditis is an uncommon but serious infection, with objective diagnostic criteria. Electronic health records have been used to explore the impact of changing guidance on antibiotic prophylaxis for dental procedures on incidence, but limited data exist on the accuracy of the diagnostic codes. Endocarditis was used as a clinically relevant case study to investigate the relationship between clinical cases and diagnostic codes, to understand discrepancies and to improve design of future studies. METHODS: Electronic health record data from two UK tertiary care centres were linked with data from a prospectively collected clinical endocarditis service database (Leeds Teaching Hospital) or retrospective clinical audit and microbiology laboratory blood culture results (Oxford University Hospitals Trust). The relationship between diagnostic codes for endocarditis and confirmed clinical cases according to the objective Duke criteria was assessed, along with the impact on estimates of disease incidence and trends. RESULTS: In Leeds 2006-2016, 738/1681 (44%) admissions containing any endocarditis code represented a definite/possible case, whilst 263/1001 (24%) definite/possible endocarditis cases had no endocarditis code assigned. In Oxford 2010-2016, 307/552 (56%) reviewed endocarditis-coded admissions represented a clinical case. Diagnostic codes used by most endocarditis studies had good positive predictive value (PPV) but low sensitivity (e.g. I33-primary 82% and 43% respectively); one (I38-secondary) had PPV under 6%. Estimating endocarditis incidence using raw admission data overestimated incidence trends twofold. Removing records with non-specific codes, very short stays and readmissions improved predictive ability. Estimating incidence of streptococcal endocarditis using secondary codes also overestimated increases in incidence over time.
Reasons for discrepancies included changes in coding behaviour over time, and coding guidance allowing assignment of a code mentioning 'endocarditis' where endocarditis was never mentioned in the clinical notes. CONCLUSIONS: Commonly used diagnostic codes in studies of endocarditis had good predictive ability. Other apparently plausible codes were poorly predictive. Use of diagnostic codes without examining sensitivity and predictive ability can give inaccurate estimations of incidence and trends. Similar considerations may apply to other diseases. Health record studies require validation of diagnostic codes and careful data curation to minimise risk of serious errors.

De Maio N, Shaw LP, Hubbard A, George S, Sanderson ND, Swann J, Wick R, AbuOun M, Stubberfield E, Hoosdally SJ et al. 2019. Comparison of long-read sequencing technologies in the hybrid assembly of complex bacterial genomes. Microb Genom, 5 (9).

Illumina sequencing allows rapid, cheap and accurate whole genome bacterial analyses, but short reads (<300 bp) do not usually enable complete genome assembly. Long-read sequencing greatly assists with resolving complex bacterial genomes, particularly when combined with short-read Illumina data (hybrid assembly). However, it is not clear how different long-read sequencing methods affect hybrid assembly accuracy. Relative automation of the assembly process is also crucial to facilitating high-throughput complete bacterial genome reconstruction, avoiding multiple bespoke filtering and data manipulation steps. In this study, we compared hybrid assemblies for 20 bacterial isolates, including two reference strains, using Illumina sequencing and long reads from either Oxford Nanopore Technologies (ONT) or SMRT Pacific Biosciences (PacBio) sequencing platforms. We chose isolates from the family Enterobacteriaceae, as these frequently have highly plastic, repetitive genetic structures, and complete genome reconstruction for these species is relevant for a precise understanding of the epidemiology of antimicrobial resistance. We de novo assembled genomes using the hybrid assembler Unicycler and compared different read processing strategies, as well as comparing to long-read-only assembly with Flye followed by short-read polishing with Pilon. Hybrid assembly with either PacBio or ONT reads facilitated high-quality genome reconstruction, and was superior to the long-read assembly and polishing approach evaluated with respect to accuracy and completeness. Combining ONT and Illumina reads fully resolved most genomes without additional manual steps, and at a lower consumables cost per isolate in our setting. Automated hybrid assembly is a powerful tool for complete and accurate bacterial genome assembly.

Schweitzer VA, van Werkhoven CH, Rodríguez Baño J, Bielicki J, Harbarth S, Hulscher M, Huttner B, Islam J, Little P, Pulcini C et al. 2020. Optimizing design of research to evaluate antibiotic stewardship interventions: consensus recommendations of a multinational working group. Clin Microbiol Infect, 26 (1), pp. 41-50.

BACKGROUND: Antimicrobial stewardship interventions and programmes aim to ensure effective treatment while minimizing antimicrobial-associated harms including resistance. Practice in this vital area is undermined by the poor quality of research addressing both what specific antimicrobial use interventions are effective and how antimicrobial use improvement strategies can be implemented into practice. In 2016 we established a working party to identify the key design features that limit translation of existing research into practice and then to make recommendations for how future studies in this field should be optimally designed. The first part of this work has been published as a systematic review. Here we present the working group's final recommendations. METHODS: An international working group for design of antimicrobial stewardship intervention evaluations was convened in response to the fourth call for leading expert network proposals by the Joint Programming Initiative on Antimicrobial Resistance (JPIAMR). The group comprised clinical and academic specialists in antimicrobial stewardship and clinical trial design from six European countries. Group members completed a structured questionnaire to establish the scope of work and key issues to develop ahead of a first face-to-face meeting that (a) identified the need for a comprehensive systematic review of study designs in the literature and (b) prioritized key areas where research design considerations restrict translation of findings into practice. The working group's initial outputs were reviewed by independent advisors and additional expertise was sought in specific clinical areas. At a second face-to-face meeting the working group developed a theoretical framework and specific recommendations to support optimal study design. These were finalized by the working group co-ordinators and agreed by all working group members. 
RESULTS: We propose a theoretical framework in which consideration of the intervention rationale, the intervention setting, intervention features and the intervention aims inform selection and prioritization of outcome measures, whether the research sets out to determine superiority or non-inferiority of the intervention measured by its primary outcome(s), the most appropriate study design (e.g. experimental or quasi-experimental) and the detailed design features. We make 18 specific recommendations in three domains: outcomes, objectives and study design. CONCLUSIONS: Researchers, funders and practitioners will be able to draw on our recommendations to most efficiently evaluate antimicrobial stewardship interventions.

Santillo M, Sivyer K, Krusche A, Mowbray F, Jones N, Peto TEA, Walker AS, Llewelyn MJ, Yardley L, ARK-Hospital. 2019. Intervention planning for Antibiotic Review Kit (ARK): a digital and behavioural intervention to safely review and reduce antibiotic prescriptions in acute and general medicine. J Antimicrob Chemother, 74 (11), pp. 3362-3370.

BACKGROUND: Hospital antimicrobial stewardship strategies, such as 'Start Smart, Then Focus' in the UK, balance the need for prompt, effective antibiotic treatment with the need to limit antibiotic overuse using 'review and revise'. However, only a minority of review decisions are to stop antibiotics. Research suggests that this is due to both behavioural and organizational factors. OBJECTIVES: To develop and optimize the Antibiotic Review Kit (ARK) intervention. ARK is a complex digital, organizational and behavioural intervention that supports implementation of 'review and revise' to help healthcare professionals safely stop unnecessary antibiotics. METHODS: A theory-, evidence- and person-based approach was used to develop and optimize ARK and its implementation. This was done through iterative stakeholder consultation and in-depth qualitative research with doctors, nurses and pharmacists in UK hospitals. Barriers to and facilitators of the intervention and its implementation, and ways to address them, were identified and then used to inform the intervention's development. RESULTS: A key barrier to stopping antibiotics was reportedly a lack of information about the original prescriber's rationale for and their degree of certainty about the need for antibiotics. An integral component of ARK was the development and optimization of a Decision Aid and its implementation to increase transparency around initial prescribing decisions. CONCLUSIONS: The key output of this research is a digital and behavioural intervention targeting important barriers to stopping antibiotics at review (see http://bsac-vle.com/ark-the-antibiotic-review-kit/ and http://antibioticreviewkit.org.uk/). ARK will be evaluated in a feasibility study and, if successful, a stepped-wedge cluster-randomized controlled trial at acute hospitals across the NHS.

Szubert A, Bailey SL, Cooke GS, Peto T, Llewelyn MJ, Edgeworth JD, Walker AS, Thwaites GE, United Kingdom Clinical Infection Research Group (UKCIRG). 2019. Predictors of recurrence, early treatment failure and death from Staphylococcus aureus bacteraemia: Observational analyses within the ARREST trial. J Infect, 79 (4), pp. 332-340.

OBJECTIVES: Adjunctive rifampicin did not reduce failure/recurrence/death as a composite endpoint in the ARREST trial of Staphylococcus aureus bacteraemia, but did reduce recurrences. We investigated clinically-defined 14-day treatment failure, and recurrence and S. aureus-attributed/unattributed mortality by 12-weeks to further define their predictors. METHODS: A post-hoc exploratory analysis using competing risks models was conducted to identify sub-groups which might benefit from rifampicin. A points-based recurrence risk score was developed and used to compare rifampicin's benefits. RESULTS: Recurrence was strongly associated with liver and renal failure, diabetes and immune-suppressive drugs (p < 0.005); in contrast, failure and S. aureus-attributed mortality were associated with older age and higher neutrophil counts. Higher SOFA scores predicted mortality; higher Charlson scores and deep-seated initial infection focus predicted failure. Unexpectedly, recurrence risk increased with increasing BMI in placebo (p = 0.04) but not rifampicin (p = 0.60) participants (p for heterogeneity = 0.06). A persistent focus was judged the primary reason for recurrence in 23 (74%). A 5-factor risk score based on BMI, Immunosuppression, Renal disease, Diabetes, Liver disease (BIRDL) strongly predicted recurrence (p < 0.001). CONCLUSIONS: Rifampicin reduces recurrences overall; those with greatest absolute risk reductions were identified using a simple risk score. Source control and adequate duration of antibiotic therapy remain essential to prevent recurrence and improve outcomes.
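A points-based score like BIRDL simply accumulates points for each risk factor present. The abstract does not state the factor weightings, so the equal-weight sketch below is purely illustrative of the mechanism, not the published score:

```python
def birdl_score(factors: dict) -> int:
    """Illustrative points-based recurrence risk score over the five
    BIRDL factors (BMI, Immunosuppression, Renal disease, Diabetes,
    Liver disease): one point per factor present. The published score's
    actual point weightings are assumed, not taken from the paper."""
    components = ("bmi", "immunosuppression", "renal", "diabetes", "liver")
    return sum(1 for c in components if factors.get(c, False))
```

A higher total then corresponds to a higher predicted absolute recurrence risk, which is what allows the score to identify patients with the greatest absolute benefit from rifampicin.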

Cross ELA, Sivyer K, Islam J, Santillo M, Mowbray F, Peto TEA, Walker AS, Yardley L, Llewelyn MJ. 2019. Adaptation and implementation of the ARK (Antibiotic Review Kit) intervention to safely and substantially reduce antibiotic use in hospitals: a feasibility study. J Hosp Infect, 103 (3), pp. 268-275.

BACKGROUND: Antimicrobial stewardship initiatives in secondary care depend on clinicians undertaking antibiotic prescription reviews but decisions to limit antibiotic treatment at review are complex. AIM: To assess the feasibility and acceptability of implementing ARK (Antibiotic Review Kit), a behaviour change intervention made up of four components (brief online tool, prescribing decision aid, regular data collection and feedback process, and patient leaflet) to support stopping antibiotic treatment when it is safe to do so among hospitalized patients; before definitive evaluation through a stepped-wedge cluster-randomized controlled trial. METHODS: Acceptability of the different intervention elements was assessed for a period of 12 weeks by uptake of the online tool, adoption of the decision aid into prescribing practice, and rates of decisions to stop antibiotics at review (assessed through repeated point-prevalence surveys). Patient perceptions of the information leaflet were assessed through a brief questionnaire. FINDINGS: All elements of the intervention were successfully introduced into practice. A total of 132 staff encompassing a broad range of prescribers and non-prescribers completed the online tool (19.4 per 100 acute beds), including 97% (32/33) of the pre-specified essential clinical staff. Among 588 prescription charts evaluated in seven point-prevalence surveys over the 12-week implementation period, 82% overall (76-90% at each survey) used the decision aid. The median antibiotic stop rate post implementation was 36% (range: 29-40% at each survey) compared with 9% pre implementation (P < 0.001). CONCLUSION: ARK provides a feasible and acceptable mechanism to support stopping antibiotics safely at post-prescription reviews in an acute hospital setting.

Maitland K, Olupot-Olupot P, Kiguli S, Chagaluka G, Alaroker F, Opoka RO, Mpoya A, Engoru C, Nteziyaremye J, Mallewa M et al. 2019. Transfusion Volume for Children with Severe Anemia in Africa. N Engl J Med, 381 (5), pp. 420-431.

BACKGROUND: Severe anemia (hemoglobin level, <6 g per deciliter) is a leading cause of hospital admission and death in children in sub-Saharan Africa. The World Health Organization recommends transfusion of 20 ml of whole-blood equivalent per kilogram of body weight for anemia, regardless of hemoglobin level. METHODS: In this factorial, open-label trial, we randomly assigned Ugandan and Malawian children 2 months to 12 years of age with a hemoglobin level of less than 6 g per deciliter and severity features (e.g., respiratory distress or reduced consciousness) to receive immediate blood transfusion with 20 ml per kilogram or 30 ml per kilogram. Three other randomized analyses investigated immediate as compared with no immediate transfusion, the administration of postdischarge micronutrients, and postdischarge prophylaxis with trimethoprim-sulfamethoxazole. The primary outcome was 28-day mortality. RESULTS: A total of 3196 eligible children (median age, 37 months; 2050 [64.1%] with malaria) were assigned to receive a transfusion of 30 ml per kilogram (1598 children) or 20 ml per kilogram (1598 children) and were followed for 180 days. A total of 1592 children (99.6%) in the higher-volume group and 1596 (99.9%) in the lower-volume group started transfusion (median, 1.2 hours after randomization). The mean (±SD) volume of total blood transfused per child was 475±385 ml and 353±348 ml, respectively; 197 children (12.3%) and 300 children (18.8%) in the respective groups received additional transfusions. Overall, 55 children (3.4%) in the higher-volume group and 72 (4.5%) in the lower-volume group died before 28 days (hazard ratio, 0.76; 95% confidence interval [CI], 0.54 to 1.08; P = 0.12 by log-rank test). This finding masked significant heterogeneity in 28-day mortality according to the presence or absence of fever (>37.5°C) at screening (P=0.001 after Sidak correction). 
Among the 1943 children (60.8%) without fever, mortality was lower with a transfusion volume of 30 ml per kilogram than with a volume of 20 ml per kilogram (hazard ratio, 0.43; 95% CI, 0.27 to 0.69). Among the 1253 children (39.2%) with fever, mortality was higher with 30 ml per kilogram than with 20 ml per kilogram (hazard ratio, 1.91; 95% CI, 1.04 to 3.49). There was no evidence of differences between the randomized groups in readmissions, serious adverse events, or hemoglobin recovery at 180 days. CONCLUSIONS: Overall mortality did not differ between the two transfusion strategies. (Funded by the Medical Research Council and Department for International Development, United Kingdom; TRACT Current Controlled Trials number, ISRCTN84086586.).
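The heterogeneity P value above is reported after Šidák correction, which adjusts a raw p-value for m independent comparisons as p_adj = 1 − (1 − p)^m. A small sketch (the number of comparisons corrected for is not stated in the abstract, so the inputs below are illustrative):

```python
def sidak_adjust(p: float, m: int) -> float:
    """Šidák-adjusted p-value for one of m independent comparisons."""
    return 1 - (1 - p) ** m

# e.g. a raw p of 0.01 adjusted for 3 independent comparisons:
print(round(sidak_adjust(0.01, 3), 4))  # 0.0297
```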

Maitland K, Kiguli S, Olupot-Olupot P, Engoru C, Mallewa M, Saramago Goncalves P, Opoka RO, Mpoya A, Alaroker F, Nteziyaremye J et al. 2019. Immediate Transfusion in African Children with Uncomplicated Severe Anemia. N Engl J Med, 381 (5), pp. 407-419.

BACKGROUND: The World Health Organization recommends not performing transfusions in African children hospitalized for uncomplicated severe anemia (hemoglobin level of 4 to 6 g per deciliter and no signs of clinical severity). However, high mortality and readmission rates suggest that less restrictive transfusion strategies might improve outcomes. METHODS: In this factorial, open-label, randomized, controlled trial, we assigned Ugandan and Malawian children 2 months to 12 years of age with uncomplicated severe anemia to immediate transfusion with 20 ml or 30 ml of whole-blood equivalent per kilogram of body weight, as determined in a second simultaneous randomization, or no immediate transfusion (control group), in which transfusion with 20 ml of whole-blood equivalent per kilogram was triggered by new signs of clinical severity or a drop in hemoglobin to below 4 g per deciliter. The primary outcome was 28-day mortality. Three other randomizations investigated transfusion volume, postdischarge supplementation with micronutrients, and postdischarge prophylaxis with trimethoprim-sulfamethoxazole. RESULTS: A total of 1565 children (median age, 26 months) underwent randomization, with 778 assigned to the immediate-transfusion group and 787 to the control group; 984 children (62.9%) had malaria. The children were followed for 180 days, and 71 (4.5%) were lost to follow-up. During the primary hospitalization, transfusion was performed in all the children in the immediate-transfusion group and in 386 (49.0%) in the control group (median time to transfusion, 1.3 hours vs. 24.9 hours after randomization). The mean (±SD) total blood volume transfused per child was 314±228 ml in the immediate-transfusion group and 142±224 ml in the control group. 
Death had occurred by 28 days in 7 children (0.9%) in the immediate-transfusion group and in 13 (1.7%) in the control group (hazard ratio, 0.54; 95% confidence interval [CI], 0.22 to 1.36; P = 0.19) and by 180 days in 35 (4.5%) and 47 (6.0%), respectively (hazard ratio, 0.75; 95% CI, 0.48 to 1.15), without evidence of interaction with other randomizations (P>0.20) or evidence of between-group differences in readmissions, serious adverse events, or hemoglobin recovery at 180 days. The mean length of hospital stay was 0.9 days longer in the control group. CONCLUSIONS: There was no evidence of differences in clinical outcomes over 6 months between the children who received immediate transfusion and those who did not. The triggered-transfusion strategy in the control group resulted in lower blood use; however, the length of hospital stay was longer, and this strategy required clinical and hemoglobin monitoring. (Funded by the Medical Research Council and Department for International Development; TRACT Current Controlled Trials number, ISRCTN84086586.).

Scarborough M, Li HK, Rombach I, Zambellas R, Walker AS, McNally M, Atkins B, Kümin M, Lipsky BA, Hughes H et al. 2019. Oral versus intravenous antibiotics for bone and joint infections: the OVIVA non-inferiority RCT. Health Technol Assess, 23 (38), pp. 1-92.

BACKGROUND: Management of bone and joint infection commonly includes 4-6 weeks of intravenous (IV) antibiotics, but there is little evidence to suggest that oral (PO) therapy results in worse outcomes. OBJECTIVE: To determine whether or not PO antibiotics are non-inferior to IV antibiotics in treating bone and joint infection. DESIGN: Parallel-group, randomised (1 : 1), open-label, non-inferiority trial. The non-inferiority margin was 7.5%. SETTING: Twenty-six NHS hospitals. PARTICIPANTS: Adults with a clinical diagnosis of bone, joint or orthopaedic metalware-associated infection who would ordinarily receive at least 6 weeks of antibiotics, and who had received ≤ 7 days of IV therapy from definitive surgery (or start of planned curative treatment in patients managed non-operatively). INTERVENTIONS: Participants were centrally computer-randomised to PO or IV antibiotics to complete the first 6 weeks of therapy. Follow-on PO therapy was permitted in either arm. MAIN OUTCOME MEASURE: The primary outcome was the proportion of participants experiencing treatment failure within 1 year. An associated cost-effectiveness evaluation assessed health resource use and quality-of-life data. RESULTS: Out of 1054 participants (527 in each arm), end-point data were available for 1015 (96.30%) participants. Treatment failure was identified in 141 out of 1015 (13.89%) participants: 74 out of 506 (14.62%) and 67 out of 509 (13.16%) of those participants randomised to IV and PO therapy, respectively. In the intention-to-treat analysis, using multiple imputation to include all participants, the imputed risk difference between PO and IV therapy for definitive treatment failure was -1.38% (90% confidence interval -4.94% to 2.19%), thus meeting the non-inferiority criterion. A complete-case analysis, a per-protocol analysis and sensitivity analyses for missing data each confirmed this result. With the exception of IV catheter complications [49/523 (9.37%) in the IV arm vs. 
5/523 (0.96%) in the PO arm], there was no significant difference between the two arms in the incidence of serious adverse events. PO therapy was highly cost-effective, yielding a saving of £2740 per patient without any significant difference in quality-adjusted life-years between the two arms of the trial. LIMITATIONS: The OVIVA (Oral Versus IntraVenous Antibiotics) trial was an open-label trial, but bias was limited by assessing all potential end points by a blinded adjudication committee. The population was heterogeneous, which facilitated generalisability but limited the statistical power of subgroup analyses. Participants were only followed up for 1 year so differences in late recurrence cannot be excluded. CONCLUSIONS: PO antibiotic therapy is non-inferior to IV therapy when used during the first 6 weeks in the treatment for bone and joint infection, as assessed by definitive treatment failure within 1 year of randomisation. These findings challenge the current standard of care and provide an opportunity to realise significant benefits for patients, antimicrobial stewardship and the health economy. FUTURE WORK: Further work is required to define the optimal total duration of therapy for bone and joint infection in the context of specific surgical interventions. Currently, wide variation in clinical practice suggests significant redundancy that likely contributes to the excess and unnecessary use of antibiotics. TRIAL REGISTRATION: Current Controlled Trials ISRCTN91566927. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 23, No. 38. See the NIHR Journals Library website for further project information.
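The non-inferiority conclusion above turns on whether the upper bound of the 90% confidence interval for the PO − IV failure risk difference stays below the 7.5% margin. The sketch below reproduces that check with a simple Wald interval from the complete-case counts in the abstract (the trial's primary analysis used multiple imputation, so its figures of -1.38% and 2.19% differ slightly from this arithmetic):

```python
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.6449):
    """Wald 90% CI for the difference in failure proportions (group1 - group2)."""
    p1, p2 = x1 / n1, x2 / n2
    rd = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# Complete-case counts from the abstract: PO 67/509 failures, IV 74/506.
rd, lo, hi = risk_difference_ci(67, 509, 74, 506)
MARGIN = 0.075  # pre-specified 7.5% non-inferiority margin
print(f"RD = {rd:.2%}, 90% CI ({lo:.2%}, {hi:.2%}), non-inferior: {hi < MARGIN}")
```

Because the upper confidence bound (~2.1%) is well below 7.5%, the non-inferiority criterion is met even though the point estimate happens to favour PO therapy.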

Donker T, Smieszek T, Henderson KL, Walker TM, Hope R, Johnson AP, Woodford N, Crook DW, Peto TEA, Walker AS, Robotham JV. 2019. Using hospital network-based surveillance for antimicrobial resistance as a more robust alternative to self-reporting. PLoS One, 14 (7), pp. e0219994.

Hospital performance is often measured using self-reported statistics, such as the incidence of hospital-transmitted micro-organisms or those exhibiting antimicrobial resistance (AMR), encouraging hospitals with high levels to improve their performance. However, hospitals that increase screening efforts will appear to have a higher incidence and perform poorly, undermining comparison between hospitals and disincentivising testing, thus hampering infection control. We propose a surveillance system in which hospitals test patients previously discharged from other hospitals and report observed cases. Using English National Health Service (NHS) Hospital Episode Statistics data, we analysed patient movements across England and assessed the number of hospitals required to participate in such a reporting scheme to deliver robust estimates of incidence. With over 1.2 million admissions to English hospitals previously discharged from other hospitals annually, even when only a fraction of hospitals (41/155) participate (each screening at least 1000 of these admissions), the proposed surveillance system can estimate incidence across all hospitals. By reporting on other hospitals, the reporting of incidence is separated from the task of improving own performance. Therefore the incentives for increasing performance can be aligned to increase (rather than decrease) screening efforts, thus delivering both more comparable figures on the AMR problems across hospitals and improving infection control efforts.

Walker AS, Budgell E, Laskawiec-Szkonter M, Sivyer K, Wordsworth S, Quaddy J, Santillo M, Krusche A, Roope LSJ, Bright N et al. 2019. Antibiotic Review Kit for Hospitals (ARK-Hospital): study protocol for a stepped-wedge cluster-randomised controlled trial. Trials, 20 (1), pp. 421.

BACKGROUND: To ensure patients continue to get early access to antibiotics at admission, while also safely reducing antibiotic use in hospitals, one needs to target the continued need for antibiotics as more diagnostic information becomes available. UK Department of Health guidance promotes an initiative called 'Start Smart then Focus': early effective antibiotics followed by active 'review and revision' 24-72 h later. However, in 2017, <10% of antibiotic prescriptions were discontinued at review, despite studies suggesting that 20-30% of prescriptions could be stopped safely. METHODS/DESIGN: Antibiotic Review Kit for Hospitals (ARK-Hospital) is a complex 'review and revise' behavioural intervention targeting healthcare professionals involved in antibiotic prescribing or administration in inpatients admitted to acute/general medicine (the largest consumers of non-prophylactic antibiotics in hospitals). The primary study objective is to evaluate whether ARK-Hospital can safely reduce the total antibiotic burden in acute/general medical inpatients by at least 15%. The primary hypotheses are therefore that the introduction of the behavioural intervention will be non-inferior in terms of 30-day mortality post-admission (relative margin 5%) for an acute/general medical inpatient, and superior in terms of defined daily doses of antibiotics per acute/general medical admission (co-primary outcomes). The unit of observation is a hospital organisation, a single hospital or group of hospitals organised with one executive board and governance framework (National Health Service trusts in England; health boards in Northern Ireland, Wales and Scotland). The study comprises a feasibility study in one organisation (phase I), an internal pilot trial in three organisations (phase II) and a cluster (organisation)-randomised stepped-wedge trial (phase III) targeting a minimum of 36 organisations in total.
Randomisation will occur over 18 months from November 2017 with a further 12 months of follow-up to assess sustainability. The behavioural intervention will be delivered to healthcare professionals involved in antibiotic prescribing or administration in adult inpatients admitted to acute/general medicine. Outcomes will be assessed in adult inpatients admitted to acute/general medicine, collected through routine electronic health records in all patients. DISCUSSION: ARK-Hospital aims to provide a feasible, sustainable and generalisable mechanism for increasing antibiotic stopping in patients who no longer need to receive them at 'review and revise'. TRIAL REGISTRATION: ISRCTN Current Controlled Trials, ISRCTN12674243. Registered on 10 April 2017.
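In a stepped-wedge cluster-randomised trial such as phase III above, every cluster begins in the control condition and crosses over to the intervention at a randomised step, so that all clusters are exposed by the final period. A minimal sketch of generating such a rollout schedule (the cluster and step counts below are illustrative, not the ARK-Hospital allocation):

```python
import random

def stepped_wedge_schedule(n_clusters, n_steps, seed=0):
    """Assign each cluster a randomised crossover step; return a
    clusters x periods matrix of 0 (control) / 1 (intervention)."""
    rng = random.Random(seed)
    # Spread clusters evenly across steps 1..n_steps, then shuffle
    # to randomise which cluster crosses over at which step.
    steps = [1 + i % n_steps for i in range(n_clusters)]
    rng.shuffle(steps)
    periods = n_steps + 1  # one baseline period before any crossover
    return [[1 if period >= step else 0 for period in range(periods)]
            for step in steps]

for row in stepped_wedge_schedule(n_clusters=6, n_steps=3):
    print(row)  # e.g. [0, 0, 1, 1]: control for two periods, then intervention
```

Each row starts at 0 and, once it switches to 1, stays there: clusters never revert to control, which is what distinguishes a stepped wedge from a crossover design.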

Horne R, Glendinning E, King K, Chalder T, Sabin C, Walker AS, Campbell LJ, Mosweu I, Anderson J, Collins S et al. 2019. Protocol of a two arm randomised, multi-centre, 12-month controlled trial: evaluating the impact of a Cognitive Behavioural Therapy (CBT)-based intervention Supporting UPtake and Adherence to antiretrovirals (SUPA) in adults with HIV. BMC Public Health, 19 (1), pp. 905.

BACKGROUND: Delay to start antiretroviral therapy (ART) and nonadherence compromise the health and wellbeing of people living with HIV (PLWH), raise the cost of care and increase risk of transmission to sexual partners. To date, interventions to improve adherence to ART have had limited success, perhaps because they have failed to systematically elicit and address both perceptual and practical barriers to adherence. The primary aim of this study is to determine the efficacy of the Supporting UPtake and Adherence (SUPA) intervention. METHODS: This study comprises 2 phases. Phase 1 is an observational cohort study, in which PLWH who are ART naïve and recommended to take ART by their clinician complete a questionnaire assessing their beliefs about ART over 12 months. Phase 2 is a randomised controlled trial (RCT) nested within the observational cohort study to investigate the effectiveness of the SUPA intervention on adherence to ART. PLWH at risk of nonadherence (based on their beliefs about ART) will be recruited and randomised 1:1 to the intervention (SUPA intervention + usual care) and control (usual care) arms. The SUPA intervention involves 4 tailored treatment support sessions delivered by a Research Nurse utilising a collaborative Cognitive Behavioural Therapy (CBT) and Motivational Interviewing (MI) approach. Sessions are tailored to individual needs and preferences based on the individual patient's perceptions and practical barriers to ART. An animation series and intervention manual have been developed to communicate a rationale for the personal necessity for ART and illustrate concerns and potential solutions. The primary outcome is adherence to ART measured using Medication Event Monitoring System (MEMS). Three hundred seventy-two patients will be sufficient to detect a 15% difference in adherence with 80% power and an alpha of 0.05. Costs will be compared between intervention and control groups. 
Costs will be combined with the primary outcome in cost-effectiveness analyses. Quality-adjusted life-years (QALYs) will also be estimated over the follow-up period and used in the analyses. DISCUSSION: The findings will enable patients, healthcare providers and policy makers to make informed decisions about the value of the SUPA intervention. TRIAL REGISTRATION: The trial was retrospectively registered on 21/02/2014, ISRCTN35514212.

Pouwels KB, Yin M, Butler CC, Cooper BS, Wordsworth S, Walker AS, Robotham JV. 2019. Optimising trial designs to identify appropriate antibiotic treatment durations. BMC Med, 17 (1), pp. 115.

BACKGROUND: For many infectious conditions, the optimal antibiotic course length remains unclear. The estimation of course length must consider the important trade-off between maximising short- and long-term efficacy and minimising antibiotic resistance and toxicity. MAIN BODY: Evidence on optimal treatment durations should come from randomised controlled trials. However, most antibiotic randomised controlled trials compare two arbitrarily chosen durations. We argue that alternative trial designs, which allow allocation of patients to multiple different treatment durations, are needed to better identify optimal antibiotic durations. There are important considerations when deciding which design is most useful in identifying optimal treatment durations, including the ability to model the duration-response relationship (or duration-response 'curve'), the risk of allocation concealment bias, statistical efficiency, the possibility to rapidly drop arms that are clearly inferior, and the possibility of modelling the trade-off between multiple competing outcomes. CONCLUSION: Multi-arm designs modelling duration-response curves with the possibility to drop inferior arms during the trial could provide more information about the optimal duration of antibiotic therapies than traditional head-to-head comparisons of limited numbers of durations, while minimising the probability of assigning trial participants to an ineffective treatment regimen.
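The duration-response 'curve' mentioned above can be modelled directly when participants are allocated across several durations rather than just two. The sketch below fits a quadratic cure-rate curve by ordinary least squares to hypothetical per-arm data (every duration and cure rate is invented for illustration; a real analysis would model patient-level outcomes, e.g. with fractional polynomials):

```python
# Fit an illustrative quadratic duration-response curve to hypothetical
# per-arm cure proportions by ordinary least squares (normal equations,
# solved with plain Gaussian elimination so the sketch is dependency-free).
# All durations and cure rates below are invented for illustration.

durations = [3, 5, 7, 10, 14]          # days of antibiotics per arm
cure_rate = [0.70, 0.82, 0.88, 0.90, 0.91]

X = [[1.0, d, d * d] for d in durations]  # design matrix: 1, d, d^2

def transpose(m):
    return [list(col) for col in zip(*m)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def solve3(a, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [v] for row, v in zip(a, b)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[piv] = m[piv], m[i]
        for r in range(3):
            if r != i:
                f = m[r][i] / m[i][i]
                m[r] = [x - f * y for x, y in zip(m[r], m[i])]
    return [m[i][3] / m[i][i] for i in range(3)]

Xt = transpose(X)
beta = solve3(matmul(Xt, X),
              [sum(x * y for x, y in zip(col, cure_rate)) for col in Xt])

def predicted_cure(d):
    return beta[0] + beta[1] * d + beta[2] * d * d

print(round(predicted_cure(8), 3))  # interpolated cure rate at 8 days
```

With a fitted curve, the trial can ask "what is the shortest duration keeping the cure rate within x% of the maximum?" instead of only comparing two arbitrary arms, and arms that sit clearly below the curve can be dropped early.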

Shi Y, Thompson J, Walker AS, Paton NI, Cheung YB, EARNEST Trial Team. 2019. Mapping the medical outcomes study HIV health survey (MOS-HIV) to the EuroQoL 5 Dimension (EQ-5D-3L) utility index. Health Qual Life Outcomes, 17 (1), pp. 83.

BACKGROUND: Mapping of health-related quality-of-life measures to health utility values can facilitate cost-utility evaluation. Regression-based methods tend to lead to shrinkage of variance. This study aims to map the Medical Outcomes Study HIV Health Survey (MOS-HIV) to the EuroQoL 5 Dimensions (EQ-5D-3L) utility index, and to characterize the performance of three mapping methods, including ordinary least squares (OLS), equi-percentile method (EPM), and a recently proposed method called Mean Rank Method (MRM). METHODS: This is a secondary analysis of data from a randomized HIV treatment trial. Baseline data from 421 participants were used to develop mapping functions. Follow-up data from 236 participants were used to validate the mapping functions. RESULTS: In the training dataset, MRM and OLS, but not EPM, reproduced the observed mean utility (0.731). MRM, OLS and EPM under-estimated the standard deviation by 0.3%, 26.6% and 1.7%, respectively. MRM had the lowest mean absolute error (0.143) and highest intraclass correlation coefficient (0.723) with the observed utility values, whereas OLS had the lowest mean squared error (0.038) and highest R-squared (0.542). Regressing the MRM- and OLS-mapped utility values upon body mass index and log-viral load gave covariate associations comparable to those estimated from the observed utility data (all P > 0.10). EPM did not achieve this property. Findings from the validation data were similar. CONCLUSIONS: Functions are available for mapping the MOS-HIV to the EQ-5D-3L utility values. MRM and OLS were comparable in terms of agreement with the observed utility values at the individual level. MRM had better performance at the group level in terms of describing the utility distribution. TRIAL REGISTRATION: NCT00988039. Registered 30 September 2009.
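The 'shrinkage of variance' noted above is a generic property of regression-based mapping: OLS predictions are conditional means, so their spread is narrower than that of the observed utilities even when the mean is reproduced exactly. A small simulated demonstration (synthetic data, not the MOS-HIV/EQ-5D-3L dataset):

```python
import random
import statistics

random.seed(1)

# Simulate an observed utility only partly explained by a predictor score
# (standing in for a summary score from an instrument like the MOS-HIV).
n = 1000
score = [random.gauss(50, 10) for _ in range(n)]
utility = [0.4 + 0.006 * s + random.gauss(0, 0.1) for s in score]

# OLS mapping: simple linear regression of utility on score.
mean_s, mean_u = statistics.fmean(score), statistics.fmean(utility)
slope = (sum((s - mean_s) * (u - mean_u) for s, u in zip(score, utility))
         / sum((s - mean_s) ** 2 for s in score))
intercept = mean_u - slope * mean_s
predicted = [intercept + slope * s for s in score]

# The predictions reproduce the mean but under-state the spread.
print(round(statistics.stdev(utility), 3), round(statistics.stdev(predicted), 3))
```

Rank-based approaches such as MRM avoid this by construction, which is why the abstract finds they describe the utility distribution better at the group level.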

Eyre DW, Didelot X, Buckley AM, Freeman J, Moura IB, Crook DW, Peto TEA, Walker AS, Wilcox MH, Dingle KE. 2019. Clostridium difficile trehalose metabolism variants are common and not associated with adverse patient outcomes when variably present in the same lineage. EBioMedicine, 43, pp. 347-355.

BACKGROUND: Clostridium difficile ribotype-027, ribotype-078, and ribotype-017 are virulent and epidemic lineages. Trehalose metabolism variants in these ribotypes, combined with increased human trehalose consumption, have been hypothesised to have contributed to their emergence and virulence. METHODS: 5232 previously whole-genome sequenced C. difficile isolates were analysed. Clinical isolates were used to investigate the impact of trehalose metabolism variants on mortality. Import data were used to estimate changes in dietary trehalose. Ribotype-027 virulence was investigated in a clinically reflective gut model. FINDINGS: Trehalose metabolism variants found in ribotype-027 and ribotype-017 were widely distributed throughout C. difficile clade-2 and clade-4 in 24/29 (83%) and 10/11 (91%) of sequence types (STs), respectively. The four-gene trehalose metabolism cluster described in ribotype-078 was common in genomes from all five clinically-important C. difficile clades (40/167 [24%] STs). The four-gene cluster was variably present in 208 ribotype-015 infections (98 [47%]); 27/208 (13%) of these patients died within 30 days of diagnosis. Adjusting for age, sex, and infecting ST, there was no association between 30-day all-cause mortality and the four-gene cluster (OR 0.36 [95% CI 0.09-1.34, p = 0.13]). Synthetic trehalose imports in the USA, UK, Germany and the EU were <1 g/capita/year during 2000-2006, and <9 g/capita/year during 2007-2012, compared with dietary trehalose from natural sources of ~100 g/capita/year. Trehalose supplementation did not increase ribotype-027 virulence in a clinically-validated gut model. INTERPRETATION: Trehalose metabolism variants are common in C. difficile. Increases in total dietary trehalose during the early-mid 2000s C. difficile epidemic were likely relatively minimal. Alternative explanations are required to explain why ribotype-027, ribotype-078 and ribotype-017 have been successful.
FUNDING: National Institute for Health Research. Gut model experiments only: Hayashibara Co. Ltd.

Rooney CM, Sheppard AE, Clark E, Davies K, Hubbard ATM, Sebra R, Crook DW, Walker AS, Wilcox MH, Chilton CH. 2019. Dissemination of multiple carbapenem resistance genes in an in vitro gut model simulating the human colon. J Antimicrob Chemother, 74 (7), pp. 1876-1883.

BACKGROUND: Carbapenemase-producing Enterobacteriaceae (CPE) pose a major global health risk. Mobile genetic elements account for much of the increasing CPE burden. OBJECTIVES: To investigate CPE colonization and the impact of antibiotic exposure on subsequent resistance gene dissemination within the gut microbiota using a model to simulate the human colon. METHODS: Gut models seeded with CPE-negative human faeces [screened with BioMérieux chromID® CARBA-SMART (Carba-Smart), Cepheid Xpert® Carba-R assay (XCR)] were inoculated with distinct carbapenemase-producing Klebsiella pneumoniae strains (KPC, NDM) and challenged with imipenem or piperacillin/tazobactam then meropenem. Resistant populations were enumerated daily on selective agars (Carba-Smart); CPE genes were confirmed by PCR (XCR, Check-Direct CPE Screen for BD MAX™). CPE gene dissemination was tracked using PacBio long-read sequencing. RESULTS: CPE populations increased during inoculation, plateauing at ~10^5 cfu/mL in both models and persisting throughout the experiments (>65 days), with no evidence of CPE 'washout'. After antibiotic administration, there was evidence of interspecies plasmid transfer of blaKPC-2 (111742 bp IncFII/IncR plasmid, 99% identity to pKpQIL-D2) and blaNDM-1 (~170 kb IncFIB/IncFII plasmid), and CPE populations rose from <0.01% to >45% of the total lactose-fermenting populations in the KPC model. Isolation of a blaNDM-1 K. pneumoniae with one chromosomal single-nucleotide variant compared with the inoculated strain indicated clonal expansion within the model. Antibiotic administration exposed a previously undetected K. pneumoniae encoding blaOXA-232 (KPC model). CONCLUSIONS: CPE exposure can lead to colonization, clonal expansion and resistance gene transfer within intact human colonic microbiota. Furthermore, under antibiotic selective pressure, new resistant populations emerge, emphasizing the need to control exposure to antimicrobials.

Roope LSJ, Smith RD, Pouwels KB, Buchanan J, Abel L, Eibich P, Butler CC, Tan PS, Walker AS, Robotham JV, Wordsworth S. 2019. The challenge of antimicrobial resistance: What economics can contribute. Science, 364 (6435), pp. 41.

As antibiotic consumption grows, bacteria are becoming increasingly resistant to treatment. Antibiotic resistance undermines much of modern health care, which relies on access to effective antibiotics to prevent and treat infections associated with routine medical procedures. The resulting challenges have much in common with those posed by climate change, which economists have responded to with research that has informed and shaped public policy. Drawing on economic concepts such as externalities and the principal-agent relationship, we suggest how economics can help to solve the challenges arising from increasing resistance to antibiotics. We discuss solutions to the key economic issues, from incentivizing the development of effective new antibiotics to improving antibiotic stewardship through financial mechanisms and regulation.

Bourke CD, Gough EK, Pimundu G, Shonhai A, Berejena C, Terry L, Baumard L, Choudhry N, Karmali Y, Bwakura-Dangarembizi M et al. 2019. Cotrimoxazole reduces systemic inflammation in HIV infection by altering the gut microbiome and immune activation. Sci Transl Med, 11 (486).

Long-term cotrimoxazole prophylaxis reduces mortality and morbidity in HIV infection, but the mechanisms underlying these clinical benefits are unclear. Here, we investigate the impact of cotrimoxazole on systemic inflammation, an independent driver of HIV mortality. In HIV-positive Ugandan and Zimbabwean children receiving antiretroviral therapy, we show that plasma inflammatory markers were lower after randomization to continue (n = 144) versus stop (n = 149) cotrimoxazole. This was not explained by clinical illness, HIV progression, or nutritional status. Because subclinical enteropathogen carriage and enteropathy can drive systemic inflammation, we explored cotrimoxazole effects on the gut microbiome and intestinal inflammatory biomarkers. Although global microbiome composition was unchanged, viridans group Streptococci and streptococcal mevalonate pathway enzymes were lower among children continuing (n = 36) versus stopping (n = 36) cotrimoxazole. These changes were associated with lower fecal myeloperoxidase. To isolate direct effects of cotrimoxazole on immune activation from antibiotic effects, we established in vitro models of systemic and intestinal inflammation. In vitro cotrimoxazole had modest but consistent inhibitory effects on proinflammatory cytokine production by blood leukocytes from HIV-positive (n = 16) and HIV-negative (n = 8) UK adults and reduced IL-8 production by gut epithelial cell lines. Collectively we demonstrate that cotrimoxazole reduces systemic and intestinal inflammation both indirectly via antibiotic effects on the microbiome and directly by blunting immune and epithelial cell activation. Synergy between these pathways may explain the clinical benefits of cotrimoxazole despite high antimicrobial resistance, providing further rationale for extending coverage among people living with HIV in sub-Saharan Africa.

Mathers AJ, Crook D, Vaughan A, Barry KE, Vegesana K, Stoesser N, Parikh HI, Sebra R, Kotay S, Walker AS, Sheppard AE. 2019. Klebsiella quasipneumoniae Provides a Window into Carbapenemase Gene Transfer, Plasmid Rearrangements, and Patient Interactions with the Hospital Environment. Antimicrob Agents Chemother, 63 (6).

Several emerging pathogens have arisen as a result of selection pressures exerted by modern health care. Klebsiella quasipneumoniae was recently defined as a new species, yet its prevalence, niche, and propensity to acquire antimicrobial resistance genes are not fully described. We have been tracking inter- and intraspecies transmission of the Klebsiella pneumoniae carbapenemase (KPC) gene, blaKPC, between bacteria isolated from a single institution. We applied a combination of Illumina and PacBio whole-genome sequencing to identify and compare K. quasipneumoniae from patients and the hospital environment over 10- and 5-year periods, respectively. There were 32 blaKPC-positive K. quasipneumoniae isolates, all of which were identified as K. pneumoniae in the clinical microbiology laboratory, from 8 patients and 11 sink drains, with evidence for seven separate blaKPC plasmid acquisitions. Analysis of a single subclade of K. quasipneumoniae subsp. quasipneumoniae (n = 23 isolates) from three patients and six rooms demonstrated seeding of a sink by a patient, subsequent persistence of the strain in the hospital environment, and then possible transmission to another patient. Longitudinal analysis of this strain demonstrated the acquisition of two unique blaKPC plasmids and then subsequent within-strain genetic rearrangement through transposition and homologous recombination. Our analysis highlights the apparent molecular propensity of K. quasipneumoniae to persist in the environment as well as acquire carbapenemase plasmids from other species and enabled an assessment of the genetic rearrangements which may facilitate horizontal transmission of carbapenemases.

van Aartsen JJ, Moore CE, Parry CM, Turner P, Phot N, Mao S, Suy K, Davies T, Giess A, Sheppard AE et al. 2019. Epidemiology of paediatric gastrointestinal colonisation by extended spectrum cephalosporin-resistant Escherichia coli and Klebsiella pneumoniae isolates in north-west Cambodia. BMC Microbiol, 19 (1), pp. 59.

BACKGROUND: Extended-spectrum cephalosporin resistance (ESC-R) in Escherichia coli and Klebsiella pneumoniae is a healthcare threat; high gastrointestinal carriage rates are reported from South-east Asia. Colonisation prevalence data in Cambodia are lacking. The aim of this study was to determine gastrointestinal colonisation prevalence of ESC-resistant E. coli (ESC-R-EC) and K. pneumoniae (ESC-R-KP) in Cambodian children/adolescents and associated socio-demographic risk factors; and to characterise relevant resistance genes, their genetic contexts, and the genetic relatedness of ESC-R strains using whole genome sequencing (WGS). RESULTS: Faeces and questionnaire data were obtained from individuals < 16 years in north-western Cambodia, 2012. WGS of cultured ESC-R-EC/KP was performed (Illumina). Maximum likelihood phylogenies were used to characterise relatedness of isolates; ESC-R-associated resistance genes and their genetic contexts were identified from de novo assemblies using BLASTn and automated/manual annotation. 82/148 (55%) of children/adolescents were ESC-R-EC/KP colonised; 12/148 (8%) were co-colonised with both species. Independent risk factors for colonisation were hospitalisation (OR: 3.12, 95% CI [1.52-6.38]) and intestinal parasites (OR: 3.11 [1.29-7.51]); school attendance conferred decreased risk (OR: 0.44 [0.21-0.92]). ESC-R strains were diverse; the commonest ESC-R mechanisms were blaCTX-M 1 and 9 sub-family variants. Structures flanking these genes were highly variable, and for blaCTX-M-15, -55 and -27 frequently involved IS26. Chromosomal blaCTX-M integration was common in E. coli. CONCLUSIONS: Gastrointestinal ESC-R-EC/KP colonisation is widespread in Cambodian children/adolescents; hospital admission and intestinal parasites are independent risk factors. The genetic contexts of blaCTX-M are highly mosaic, consistent with rapid horizontal exchange. Chromosomal integration of blaCTX-M may result in stable propagation in these community-associated pathogens.
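The odds ratios quoted above come from a multivariable model; as an illustration of the underlying quantity only, a crude (unadjusted) odds ratio with a Woolf-type 95% confidence interval can be computed from a 2×2 table. The counts below are hypothetical, not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed & colonised, b = exposed & not colonised,
    c = unexposed & colonised, d = unexposed & not colonised."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR), Woolf method
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 30/50 colonised among hospitalised children
# vs 52/98 among non-hospitalised children.
or_, lo, hi = odds_ratio_ci(30, 20, 52, 46)
```

A crude calculation like this would not reproduce the study's estimates, which additionally adjust for other covariates.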

Dingle KE, Didelot X, Quan TP, Eyre DW, Stoesser N, Marwick CA, Coia J, Brown D, Buchanan S, Ijaz UZ et al. 2019. A Role for Tetracycline Selection in Recent Evolution of Agriculture-Associated Clostridium difficile PCR Ribotype 078. mBio, 10 (2),

The increasing clinical importance of human infections (frequently severe) caused by Clostridium difficile PCR ribotype 078 (RT078) was first reported in 2008. The severity of symptoms (mortality of ≤30%) and the higher proportion of infections among community and younger patients raised concerns. Farm animals, especially pigs, have been identified as RT078 reservoirs. We aimed to understand the recent changes in RT078 epidemiology by investigating a possible role for antimicrobial selection in its recent evolutionary history. Phylogenetic analysis of international RT078 genomes (isolates from 2006 to 2014, n = 400), using time-scaled, recombination-corrected, maximum likelihood phylogenies, revealed several recent clonal expansions. A common ancestor of each expansion had independently acquired a different allele of the tetracycline resistance gene tetM. Consequently, an unusually high proportion (76.5%) of RT078 genomes were tetM positive. Multiple additional tetracycline resistance determinants were also identified (including the efflux pump tet40), frequently sharing a high level of nucleotide sequence identity (up to 100%) with sequences found in the pig pathogen Streptococcus suis and in other zoonotic pathogens such as Campylobacter jejuni and Campylobacter coli. Each RT078 tetM clonal expansion lacked geographic structure, indicating rapid, recent international spread. Resistance determinants for C. difficile infection-triggering antimicrobials, including fluoroquinolones and clindamycin, were comparatively rare in RT078. Tetracyclines are used intensively in agriculture; this selective pressure, plus rapid, international spread via the food chain, may explain the increased RT078 prevalence in humans. Our work indicates that the use of antimicrobials outside the health care environment has selected for resistant organisms, and in the case of RT078, has contributed to the emergence of a human pathogen. IMPORTANCE: Clostridium difficile PCR ribotype 078 (RT078) has multiple reservoirs; many are agricultural. Since 2005, this genotype has been increasingly associated with human infections in both clinical settings and the community. Investigations of RT078 whole-genome sequences revealed that tetracycline resistance had been acquired on multiple independent occasions. Phylogenetic analysis revealed a rapid, recent increase in numbers of closely related tetracycline-resistant RT078 (clonal expansions), suggesting that tetracycline selection has strongly influenced its recent evolutionary history. We demonstrate recent international spread of emergent, tetracycline-resistant RT078. A similar tetracycline-positive clonal expansion was also identified in unrelated nontoxigenic C. difficile, suggesting that this process may be widespread and may be independent of disease-causing ability. Resistance to typical C. difficile infection-associated antimicrobials (e.g., fluoroquinolones, clindamycin) occurred only sporadically within RT078. Selective pressure from tetracycline appears to be a key factor in the emergence of this human pathogen and the rapid international dissemination that followed, plausibly via the food chain.

Uyoga S, Mpoya A, Olupot-Olupot P, Kiguli S, Opoka RO, Engoru C, Mallewa M, Kennedy N, M'baya B, Kyeyune D et al. 2019. Haematological quality and age of donor blood issued for paediatric transfusion to four hospitals in sub-Saharan Africa. Vox Sang, 114 (4), pp. 340-348.

BACKGROUND AND OBJECTIVES: Paediatric blood transfusion for severe anaemia in hospitals in sub-Saharan Africa remains common. Yet, reports describing the haematological quality of donor blood or storage duration in routine practice are very limited. Both factors are likely to affect transfusion outcomes. MATERIALS AND METHODS: We undertook three audits examining the distribution of pack types, haematological quality and storage duration of donor blood used in a paediatric clinical trial of blood transfusion at four hospitals in Africa (Uganda and Malawi). RESULTS: The overall distribution of whole blood, packed cells (plasma-reduced by centrifugation) and red cell concentrates (RCC) (plasma-reduced by gravity-dependent sedimentation) used in a randomised trial was 40.7% (N = 1215), 22.4% (N = 669) and 36.8% (N = 1099), respectively. The first audit found similar median haematocrits of 57.0% (50.0, 74.0), 64.0% (52.0, 72.5; P = 0.238 vs. whole blood) and 56.0% (48.0, 67.0; P = 0.462) in whole blood, RCC and packed cells, respectively, which resulted from unclear pack labelling by blood transfusion services (BTS). Re-training of the BTS, hospital blood banks and clinical teams led to, in subsequent audits, significant differences in median haematocrit and haemoglobins across the three pack types and values within expected ranges. Median storage duration was 12 days (IQR: 6, 19), with 18.2% (537/2964) over 21 days in storage. Initially, 9 (2.8%) packs were issued past the recommended duration of storage, dropping to 0.3% (N = 7) in the third audit post-training. CONCLUSION: The study highlights the importance of close interactions and education between BTS and clinical services and the importance of haemovigilance to ensure safe transfusion practice.

Pouwels KB, Hopkins S, Llewelyn MJ, Walker AS, McNulty CA, Robotham JV. 2019. Duration of antibiotic treatment for common infections in English primary care: cross sectional analysis and comparison with guidelines. BMJ, 364 pp. l440.

OBJECTIVE: To evaluate the duration of prescriptions for antibiotic treatment for common infections in English primary care and to compare this with guideline recommendations. DESIGN: Cross sectional study. SETTING: General practices contributing to The Health Improvement Network database, 2013-15. PARTICIPANTS: 931 015 consultations that resulted in an antibiotic prescription for one of several indications: acute sinusitis, acute sore throat, acute cough and bronchitis, pneumonia, acute exacerbation of chronic obstructive pulmonary disease (COPD), acute otitis media, acute cystitis, acute prostatitis, pyelonephritis, cellulitis, impetigo, scarlet fever, and gastroenteritis. MAIN OUTCOME MEASURES: The main outcomes were the percentage of antibiotic prescriptions with a duration exceeding the guideline recommendation and the total number of days beyond the recommended duration for each indication. RESULTS: The most common reasons for antibiotics being prescribed were acute cough and bronchitis (386 972, 41.6% of the included consultations), acute sore throat (239 231, 25.7%), acute otitis media (83 054, 8.9%), and acute sinusitis (76 683, 8.2%). Antibiotic treatments for upper respiratory tract indications and acute cough and bronchitis accounted for more than two thirds of the total prescriptions considered, and 80% or more of these treatment courses exceeded guideline recommendations. Notable exceptions were acute sinusitis, where only 9.6% (95% confidence interval 9.4% to 9.9%) of prescriptions exceeded seven days, and acute sore throat, where only 2.1% (2.0% to 2.1%) exceeded 10 days (recent guidance recommends five days). More than half of the antibiotic prescriptions were for longer than guidelines recommend for acute cystitis among females (54.6%, 54.1% to 55.0%). The percentage of antibiotic prescriptions exceeding the recommended duration was lower for most non-respiratory infections. For the 931 015 included consultations resulting in antibiotic prescriptions, about 1.3 million days were beyond the durations recommended by guidelines. CONCLUSION: For most common infections treated in primary care, a substantial proportion of antibiotic prescriptions have durations exceeding those recommended in guidelines. Substantial reductions in antibiotic exposure can be accomplished by aligning antibiotic prescription durations with guidelines.
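The study's two outcome measures (percentage of prescriptions exceeding the guideline duration, and total days beyond it) amount to a simple aggregation. A minimal sketch, using hypothetical guideline durations and prescriptions rather than the study's data:

```python
# Hypothetical guideline durations in days (illustrative only; the real
# analysis used indication-specific English guideline recommendations).
GUIDELINE = {"acute sore throat": 10, "acute sinusitis": 7, "acute cystitis": 3}

def excess_days(prescriptions):
    """Return (% of prescriptions exceeding guideline, total excess days)."""
    over = [(ind, dur - GUIDELINE[ind]) for ind, dur in prescriptions
            if dur > GUIDELINE[ind]]
    pct = 100 * len(over) / len(prescriptions)
    return pct, sum(d for _, d in over)

rx = [("acute sore throat", 10), ("acute sinusitis", 14),
      ("acute cystitis", 7), ("acute cystitis", 3)]
pct, total = excess_days(rx)   # 50.0% of courses, 11 excess days
```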

Prendergast AJ, Berejena C, Pimundu G, Shonhai A, Bwakura-Dangarembizi M, Musiime V, Szubert AJ, Cook AD, Spyer MJ, Nahirya-Ntege P et al. 2019. Inflammatory biomarkers in HIV-infected children hospitalized for severe malnutrition in Uganda and Zimbabwe. AIDS, 33 (9), pp. 1485-1490.

OBJECTIVES: A proportion of HIV-infected children with advanced disease develop severe malnutrition soon after antiretroviral therapy (ART) initiation. We tested the hypothesis that systemic inflammation underlies the pathogenesis of severe malnutrition in HIV-infected children. DESIGN: Cross-sectional laboratory substudy in 613 HIV-infected children initiating ART in Uganda and Zimbabwe. METHODS: We measured C-reactive protein (CRP), TNFα, IL-6 and soluble CD14 by ELISA in cryopreserved plasma at baseline (pre-ART) and week-4 (children with severe malnutrition only). Independent associations between baseline biomarkers and subsequent hospitalization for severe malnutrition were identified using multivariable fractional polynomial logistic regression. RESULTS: Compared with children without severe malnutrition (n = 574, median age 6.3 years, median baseline weight-for-age Z-score -2.2), children hospitalized for severe malnutrition post-ART (n = 39, median age 2.3 years, median baseline weight-for-age Z-score -4.8) had higher baseline CRP [median 13.5 (interquartile range 5.5, 41.1) versus 4.1 (1.4, 14.4) mg/l; P = 0.003] and IL-6 [median 9.2 (6.7, 15.6) versus 5.9 (4.6, 9.3) pg/ml; P < 0.0001], but similar overall TNFα, soluble CD14 and HIV viral load (all P > 0.06). In a multivariable model, higher pre-ART IL-6, lower TNFα and lower weight-for-age were independently associated with subsequent hospitalization for severe malnutrition. Between weeks 0 and 4, there was a significant rise in CRP, IL-6 and soluble CD14, and fall in TNFα and HIV viral load in children hospitalized for severe malnutrition (all P < 0.02). CONCLUSION: Pre-ART IL-6 and TNFα were more strongly associated with hospitalization for severe malnutrition than CD4 cell count or viral load, highlighting the importance of inflammation at the time of ART initiation in HIV-infected children.

Bwakura-Dangarembizi M, Amadi B, Bourke CD, Robertson RC, Mwapenya B, Chandwe K, Kapoma C, Chifunda K, Majo F, Ngosa D et al. 2019. Health Outcomes, Pathogenesis and Epidemiology of Severe Acute Malnutrition (HOPE-SAM): rationale and methods of a longitudinal observational study. BMJ Open, 9 (1), pp. e023077.

INTRODUCTION: Mortality among children hospitalised for complicated severe acute malnutrition (SAM) remains high despite the implementation of WHO guidelines, particularly in settings of high HIV prevalence. Children continue to be at high risk of morbidity, mortality and relapse after discharge from hospital although long-term outcomes are not well documented. Better understanding the pathogenesis of SAM and the factors associated with poor outcomes may inform new therapeutic interventions. METHODS AND ANALYSIS: The Health Outcomes, Pathogenesis and Epidemiology of Severe Acute Malnutrition (HOPE-SAM) study is a longitudinal observational cohort that aims to evaluate the short-term and long-term clinical outcomes of HIV-positive and HIV-negative children with complicated SAM, and to identify the risk factors at admission and discharge from hospital that independently predict poor outcomes. Children aged 0-59 months hospitalised for SAM are being enrolled at three tertiary hospitals in Harare, Zimbabwe and Lusaka, Zambia. Longitudinal mortality, morbidity and nutritional data are being collected at admission, discharge and for 48 weeks post discharge. Nested laboratory substudies are exploring the role of enteropathy, gut microbiota, metabolomics and cellular immune function in the pathogenesis of SAM using stool, urine and blood collected from participants and from well-nourished controls. ETHICS AND DISSEMINATION: The study is approved by the local and international institutional review boards in the participating countries (the Joint Research Ethics Committee of the University of Zimbabwe, Medical Research Council of Zimbabwe and University of Zambia Biomedical Research Ethics Committee) and the study sponsor (Queen Mary University of London). Caregivers provide written informed consent for each participant. Findings will be disseminated through peer-reviewed journals, conference presentations and to caregivers at face-to-face meetings.

Li H-K, Rombach I, Zambellas R, Walker AS, McNally MA, Atkins BL, Lipsky BA, Hughes HC, Bose D, Kümin M et al. 2019. Oral versus Intravenous Antibiotics for Bone and Joint Infection. N Engl J Med, 380 (5), pp. 425-436.

BACKGROUND: The management of complex orthopedic infections usually includes a prolonged course of intravenous antibiotic agents. We investigated whether oral antibiotic therapy is noninferior to intravenous antibiotic therapy for this indication. METHODS: We enrolled adults who were being treated for bone or joint infection at 26 U.K. centers. Within 7 days after surgery (or, if the infection was being managed without surgery, within 7 days after the start of antibiotic treatment), participants were randomly assigned to receive either intravenous or oral antibiotics to complete the first 6 weeks of therapy. Follow-on oral antibiotics were permitted in both groups. The primary end point was definitive treatment failure within 1 year after randomization. In the analysis of the risk of the primary end point, the noninferiority margin was 7.5 percentage points. RESULTS: Among the 1054 participants (527 in each group), end-point data were available for 1015 (96.3%). Treatment failure occurred in 74 of 506 participants (14.6%) in the intravenous group and 67 of 509 participants (13.2%) in the oral group. Missing end-point data (39 participants, 3.7%) were imputed. The intention-to-treat analysis showed a difference in the risk of definitive treatment failure (oral group vs. intravenous group) of -1.4 percentage points (90% confidence interval [CI], -4.9 to 2.2; 95% CI, -5.6 to 2.9), indicating noninferiority. Complete-case, per-protocol, and sensitivity analyses supported this result. The between-group difference in the incidence of serious adverse events was not significant (146 of 527 participants [27.7%] in the intravenous group and 138 of 527 [26.2%] in the oral group; P=0.58). Catheter complications, analyzed as a secondary end point, were more common in the intravenous group (9.4% vs. 1.0%). CONCLUSIONS: Oral antibiotic therapy was noninferior to intravenous antibiotic therapy when used during the first 6 weeks for complex orthopedic infection, as assessed by treatment failure at 1 year. (Funded by the National Institute for Health Research; OVIVA Current Controlled Trials number, ISRCTN91566927.).
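As a rough check on the noninferiority logic, a simple unadjusted Wald interval for the risk difference comes close to the published 95% CI; the trial's own intention-to-treat analysis imputed missing data, so this is only an approximation:

```python
from math import sqrt

def risk_diff_ci(f1, n1, f2, n2, z=1.96):
    """Wald 95% CI for the difference in failure risk (group1 - group2),
    returned in percentage points."""
    p1, p2 = f1 / n1, f2 / n2
    d = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return 100 * d, 100 * (d - z * se), 100 * (d + z * se)

# Oral (67/509) vs intravenous (74/506) treatment failure, as reported.
diff, lo, hi = risk_diff_ci(67, 509, 74, 506)   # ~ -1.5 (-5.7 to 2.8) pp
noninferior = hi < 7.5   # upper CI bound lies below the 7.5 pp margin
```

The design choice here is the standard one for noninferiority: the conclusion rests on the upper confidence bound, not the point estimate.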

Yang Y, Walker TM, Walker AS, Wilson DJ, Peto TEA, Crook DW, Shamout F, CRyPTIC Consortium, Zhu T, Clifton DA. 2019. DeepAMR for predicting co-occurrent resistance of Mycobacterium tuberculosis. Bioinformatics, 35 (18), pp. 3240-3249.

MOTIVATION: Resistance co-occurrence within first-line anti-tuberculosis (TB) drugs is a common phenomenon. Existing methods based on genetic data analysis of Mycobacterium tuberculosis (MTB) have been able to predict resistance of MTB to individual drugs, but have not considered resistance co-occurrence and cannot capture latent structure of genomic data that corresponds to lineages. RESULTS: We used a large cohort of TB patients from 16 countries across six continents, for whom whole-genome sequences were obtained for each isolate and phenotypes to anti-TB drugs were determined using drug susceptibility testing recommended by the World Health Organization. We then proposed an end-to-end multi-task model with a deep denoising auto-encoder (DeepAMR) for multiple drug classification and developed DeepAMR_cluster, a clustering variant based on DeepAMR, for learning clusters in the latent space of the data. The results showed that DeepAMR outperformed the baseline model and four machine learning models, with mean AUROC from 94.4% to 98.7% for predicting resistance to four first-line drugs [i.e. isoniazid (INH), ethambutol (EMB), rifampicin (RIF), pyrazinamide (PZA)], multi-drug resistant TB (MDR-TB) and pan-susceptible TB (PANS-TB: MTB that is susceptible to all four first-line anti-TB drugs). In the case of INH, EMB, PZA and MDR-TB, DeepAMR achieved its best mean sensitivity of 94.3%, 91.5%, 87.3% and 96.3%, respectively. In the case of RIF and PANS-TB, it generated 94.2% and 92.2% sensitivity, which were lower than the baseline model by 0.7% and 1.9%, respectively. t-SNE visualization shows that DeepAMR_cluster captures lineage-related clusters in the latent space. AVAILABILITY AND IMPLEMENTATION: The details of source code are provided at http://www.robots.ox.ac.uk/~davidc/code.php. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
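AUROC, the headline metric above, is equivalent to the probability that a randomly chosen resistant isolate is scored higher than a randomly chosen susceptible one. A minimal rank-based implementation (an illustration of the metric, not the paper's code):

```python
def auroc(labels, scores):
    """Rank-based AUROC: probability that a random positive (resistant)
    example scores higher than a random negative one; ties count half."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predictions for four isolates (1 = resistant).
print(auroc([1, 0, 1, 0], [0.9, 0.8, 0.4, 0.3]))  # 0.75
```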

Tenforde MW, Walker AS, Gibb DM, Manabe YC. 2019. Rapid antiretroviral therapy initiation in low- and middle-income countries: A resource-based approach. PLoS Med, 16 (1), pp. e1002723.

In an Essay, Mark Tenforde and colleagues advocate continued provision of baseline CD4 cell count testing in HIV care in low- and middle-income countries.

Kelly C, Mwandumba HC, Heyderman RS, Jambo K, Kamng'ona R, Chammudzi M, Sheha I, Peterson I, Rapala A, Mallewa J et al. 2019. HIV-Related Arterial Stiffness in Malawian Adults Is Associated With the Proportion of PD-1-Expressing CD8+ T Cells and Reverses With Antiretroviral Therapy. J Infect Dis, 219 (12), pp. 1948-1958.

BACKGROUND: The contribution of immune activation to arterial stiffness and its reversibility in human immunodeficiency virus (HIV)-infected adults in sub-Saharan Africa is unknown. METHODS: HIV-uninfected and HIV-infected Malawian adults initiating antiretroviral therapy (ART) with a CD4+ T-cell count of <100 cells/μL were enrolled and followed for 44 weeks; enrollment of infected adults occurred 2 weeks after ART initiation. We evaluated the relationship between carotid femoral pulse wave velocity (cfPWV) and T-cell activation (defined as HLA-DR+CD38+ T cells), exhaustion (defined as PD-1+ T cells), and senescence (defined as CD57+ T cells) and monocyte subsets, using normal regression. RESULTS: In 279 HIV-infected and 110 HIV-uninfected adults, 142 (37%) had hypertension. HIV was independently associated with a 12% higher cfPWV (P = .02) at baseline and a 14% higher cfPWV at week 10 (P = .02), but the increases resolved by week 22. CD4+ and CD8+ T-cell exhaustion were independently associated with a higher cfPWV at baseline (P = .02). At 44 weeks, arterial stiffness improved more in those with greater decreases in the percentage of CD8+ T cells and the percentage of PD-1+CD8+ T cells (P = .01 and P = .03, respectively). When considering HIV-infected participants alone, the adjusted arterial stiffness at week 44 tended to be lower in those with a higher baseline percentage of PD-1+CD8+ T cells (P = .054). CONCLUSIONS: PD-1+CD8+ T cells are associated with HIV-related arterial stiffness, which remains elevated during the first 3 months of ART. Resources to prevent cardiovascular disease in sub-Saharan Africa should focus on blood pressure reduction and individuals with a low CD4+ T-cell count during early ART.

Thomas R, Velaphi S, Ellis S, Walker AS, Standing JF, Heath P, Sharland M, Donà D. 2019. The use of polymyxins to treat carbapenem resistant infections in neonates and children. Expert Opin Pharmacother, 20 (4), pp. 415-422.

INTRODUCTION: The incidence of healthcare-associated multidrug resistant bacterial infections, particularly due to carbapenem resistant organisms, has been on the rise globally. Among these are the carbapenem resistant Acinetobacter baumannii and Enterobacteriaceae, which have been responsible for numerous outbreaks in neonatal units. The polymyxins (colistin and polymyxin B) are considered to be the last resort antibiotics for treating such infections. However, pharmacokinetic and pharmacodynamic data on the use of polymyxins in neonates and children are very limited, and there are safety concerns. AREAS COVERED: In this review, the authors summarize the global burden of multidrug resistance, particularly carbapenem resistance, in the neonatal and paediatric population, and the potential wider use of polymyxins in treating these infections. EXPERT OPINION: Both colistin and polymyxin B have similar efficacy in treating multidrug resistant infections but have safety concerns. However, polymyxin B appears to be a better therapeutic option, with more rapid and higher steady state concentrations achieved compared to colistin and less reported nephrotoxicity. There is virtually no data in neonates and children currently; there is therefore an urgent need for pharmacokinetic and safety trials in these populations to determine the optimal drug and dosing regimens and provide recommendations for their use against carbapenem resistant infections.

Eyre DW, Shaw R, Adams H, Cooper T, Crook DW, Griffin R-M, Mannion P, Morgan M, Morris T, Perry M et al. 2019. WGS to determine the extent of Clostridioides difficile transmission in a high incidence setting in North Wales in 2015. J Antimicrob Chemother, 74 (4), pp. 1092-1100.

OBJECTIVES: Rates of Clostridioides (Clostridium) difficile infection (CDI) are higher in North Wales than elsewhere in the UK. We used WGS to investigate if this is due to increased healthcare-associated transmission from other cases. METHODS: Healthcare and community C. difficile isolates from patients across North Wales (February-July 2015) from glutamate dehydrogenase (GDH)-positive faecal samples underwent WGS. Data from patient records, hospital management systems and national antimicrobial use surveillance were used. RESULTS: Of the 499 GDH-positive samples, 338 (68%) were sequenced and 299 distinct infections/colonizations were identified, 229/299 (77%) with toxin genes. Only 39/229 (17%) toxigenic isolates were related within ≤2 SNPs to ≥1 infections/colonizations from a previously sampled patient, i.e. demonstrated evidence of possible transmission. Independent predictors of possible transmission included healthcare exposure in the last 12 weeks (P = 0.002, with rates varying by hospital), infection with MLST types ST-1 (ribotype 027) and ST-11 (predominantly ribotype 078) compared with all other toxigenic STs (P < 0.001), and cephalosporin exposure in the potential transmission recipient (P = 0.02). Adjusting for all these factors, there was no additional effect of ward workload (P = 0.54) or failure to meet cleaning targets (P = 0.25). Use of antimicrobials is higher in North Wales compared with England and the rest of Wales. CONCLUSIONS: Levels of transmission detected by WGS were comparable to previously described rates in endemic settings; other explanations, such as variations in antimicrobial use, are required to explain the high levels of CDI. Cephalosporins are a risk factor for infection with C. difficile from another infected or colonized case.
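The ≤2-SNP rule used above to flag possible transmission can be sketched with toy aligned sequences. This is purely illustrative: the real analysis compares mapped whole genomes, with far longer alignments and careful variant calling.

```python
def snp_distance(seq1, seq2):
    """Count differing positions between two equal-length aligned sequences
    (a crude SNP distance)."""
    return sum(a != b for a, b in zip(seq1, seq2))

def plausible_transmission(new, earlier, threshold=2):
    """Is the new isolate within `threshold` SNPs of any earlier isolate?"""
    return any(snp_distance(new, e) <= threshold for e in earlier)

earlier = ["ACGTACGTAC", "ACGAACGTAC"]                # toy aligned sequences
print(plausible_transmission("ACGTACGTAT", earlier))  # True: 1 SNP away
print(plausible_transmission("TTTTACGTAC", earlier))  # False: >=3 SNPs away
```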

Sheppard AE, Stoesser N, German-Mesner I, Vegesana K, Sarah Walker A, Crook DW, Mathers AJ. 2018. TETyper: a bioinformatic pipeline for classifying variation and genetic contexts of transposable elements from short-read whole-genome sequencing data. Microb Genom, 4 (12),

Much of the worldwide dissemination of antibiotic resistance has been driven by resistance gene associations with mobile genetic elements (MGEs), such as plasmids and transposons. Although increasing, our understanding of resistance spread remains relatively limited, as methods for tracking mobile resistance genes through multiple species, strains and plasmids are lacking. We have developed a bioinformatic pipeline for tracking variation within, and mobility of, specific transposable elements (TEs), such as transposons carrying antibiotic-resistance genes. TETyper takes short-read whole-genome sequencing data as input and identifies single-nucleotide mutations and deletions within the TE of interest, to enable tracking of specific sequence variants, as well as the surrounding genetic context(s), to enable identification of transposition events. A major advantage of TETyper over previous methods is that it does not require a genome reference. To investigate global dissemination of Klebsiella pneumoniae carbapenemase (KPC) and its associated transposon Tn4401, we applied TETyper to a collection of over 3000 publicly available Illumina datasets containing blaKPC. This revealed surprising diversity, with over 200 distinct flanking genetic contexts for Tn4401, indicating high levels of transposition. Integration of sample metadata revealed insights into associations between geographic locations, host species, Tn4401 sequence variants and flanking genetic contexts. To demonstrate the ability of TETyper to cope with high-copy-number TEs and to track specific short-term evolutionary changes, we also applied it to the insertion sequence IS26 within a defined K. pneumoniae outbreak. TETyper is implemented in Python and is freely available at https://github.com/aesheppard/TETyper.
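The notion of a "flanking genetic context" can be illustrated with a toy string search over an assembled sequence. This conceptual sketch is not TETyper itself, which works from short reads and handles variation within the TE:

```python
def flanking_contexts(assembly, te, k=12):
    """Return the k bases on each side of every exact TE match; distinct
    flank pairs suggest distinct insertion sites (transposition events)."""
    contexts, start = [], assembly.find(te)
    while start != -1:
        left = assembly[max(0, start - k):start]
        right = assembly[start + len(te):start + len(te) + k]
        contexts.append((left, right))
        start = assembly.find(te, start + len(te))  # next non-overlapping hit
    return contexts

# "TRANSPOSON" is an obvious placeholder for a real TE sequence.
genome = "AAAACCCCTTTT" + "TRANSPOSON" + "GGGGAAAATTTT"
print(flanking_contexts(genome, "TRANSPOSON", k=4))  # [('TTTT', 'GGGG')]
```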

Kityo C, Szubert AJ, Siika A, Heyderman R, Bwakura-Dangarembizi M, Lugemwa A, Mwaringa S, Griffiths A, Nkanya I, Kabahenda S et al. 2018. Raltegravir-intensified initial antiretroviral therapy in advanced HIV disease in Africa: A randomised controlled trial. PLoS Med, 15 (12), pp. e1002706.

BACKGROUND: In sub-Saharan Africa, individuals infected with HIV who are severely immunocompromised have high mortality (about 10%) shortly after starting antiretroviral therapy (ART). This group also has the greatest risk of morbidity and mortality associated with immune reconstitution inflammatory syndrome (IRIS), a paradoxical response to successful ART. Integrase inhibitors lead to significantly more rapid declines in HIV viral load (VL) than all other ART classes. We hypothesised that intensifying standard triple-drug ART with the integrase inhibitor, raltegravir, would reduce HIV VL faster and hence reduce early mortality, although this strategy could also risk more IRIS events. METHODS AND FINDINGS: In a 2×2×2 factorial open-label parallel-group trial, treatment-naive adults, adolescents, and children >5 years old infected with HIV, with cluster of differentiation 4 (CD4) <100 cells/mm3, from eight urban/peri-urban HIV clinics at regional hospitals in Kenya, Malawi, Uganda, and Zimbabwe were randomised 1:1 to initiate standard triple-drug ART, with or without 12-week raltegravir intensification, and followed for 48 weeks. The primary outcome was 24-week mortality, analysed by intention to treat. Of 2,356 individuals screened for eligibility, 1,805 were randomised between 18 June 2013 and 10 April 2015. Of the 1,805 participants, 961 (53.2%) were male, 72 (4.0%) were children/adolescents, median age was 36 years, median CD4 count was 37 cells/mm3, and median plasma viraemia was 249,770 copies/mL. Fifty-six participants (3.1%) were lost to follow-up at 48 weeks. By 24 weeks, 97/902 (10.9%) raltegravir-intensified ART versus 91/903 (10.2%) standard ART participants had died (adjusted hazard ratio [aHR] = 1.10 [95% CI 0.82-1.46], p = 0.53), with no evidence of interaction with other randomisations (p for heterogeneity > 0.7) and despite significantly greater VL suppression with raltegravir-intensified ART at 4 weeks (343/836 [41.0%] versus 113/841 [13.4%] with standard ART, p < 0.001) and 12 weeks (567/789 [71.9%] versus 415/803 [51.7%] with standard ART, p < 0.001). Through 48 weeks, there was no evidence of differences in mortality (aHR = 0.98 [95% CI 0.76-1.28], p = 0.91); in serious (aHR = 0.99 [0.81-1.21], p = 0.88), grade-4 (aHR = 0.88 [0.71-1.09], p = 0.29), or ART-modifying (aHR = 0.90 [0.63-1.27], p = 0.54) adverse events (the latter occurring in 59 [6.5%] participants with raltegravir-intensified ART versus 66 [7.3%] with standard ART); in events judged compatible with IRIS (occurring in 89 [9.9%] participants with raltegravir-intensified ART versus 86 [9.5%] with standard ART, p = 0.79); or in hospitalisations (aHR = 0.94 [95% CI 0.76-1.17], p = 0.59). At 12 weeks, one and two raltegravir-intensified participants had predicted intermediate-level and high-level raltegravir resistance, respectively. At 48 weeks, the nucleoside reverse transcriptase inhibitor (NRTI) mutation K219E/Q (p = 0.004) and the non-nucleoside reverse transcriptase inhibitor (NNRTI) mutations K101E/P (p = 0.03) and P225H (p = 0.007) were less common in virus from participants with raltegravir-intensified ART, with weak evidence of less intermediate- or high-level resistance to tenofovir (p = 0.06), abacavir (p = 0.08), and rilpivirine (p = 0.07). Limitations of the study include limited clinical, radiological, and/or microbiological information for some participants, reflecting available services at the centres, and lack of baseline genotypes. CONCLUSIONS: Although 12 weeks of raltegravir intensification was well tolerated and reduced HIV viraemia significantly faster than standard triple-drug ART during the time of greatest risk for early death, this strategy did not reduce mortality or clinical events in this group and is not warranted. There was no excess of IRIS-compatible events, suggesting that integrase inhibitors can be used safely as part of standard triple-drug first-line therapy in severely immunocompromised individuals. TRIAL REGISTRATION: ClinicalTrials.gov NCT01825031. TRIAL REGISTRATION: International Standard Randomised Controlled Trials Number ISRCTN43622374.

Kouchaki S, Yang Y, Walker TM, Sarah Walker A, Wilson DJ, Peto TEA, Crook DW, CRyPTIC Consortium, Clifton DA. 2019. Application of machine learning techniques to tuberculosis drug resistance analysis. Bioinformatics, 35 (13), pp. 2276-2282.

MOTIVATION: Timely identification of Mycobacterium tuberculosis (MTB) resistance to existing drugs is vital to decrease mortality and prevent the amplification of existing antibiotic resistance. Machine learning methods have been widely applied for timely prediction of MTB resistance to a given drug and for identifying resistance markers. However, they have not been validated on a large cohort of MTB samples from multiple centers across the world in terms of resistance prediction and resistance marker identification. Several machine learning classifiers and linear dimension reduction techniques were developed and compared for a cohort of 13 402 isolates collected from 16 countries across 6 continents and tested against 11 drugs. RESULTS: Compared with a conventional molecular diagnostic test, the area under the curve of the best machine learning classifier increased for all drugs, most notably by 23.11%, 15.22% and 10.14% for pyrazinamide, ciprofloxacin and ofloxacin, respectively (P < 0.01). Logistic regression and gradient tree boosting were found to perform better than other techniques. Moreover, logistic regression/gradient tree boosting with a sparse principal component analysis/non-negative matrix factorization step, compared with the classifier alone, enhanced the best performance in terms of F1-score by 12.54%, 4.61%, 7.45% and 9.58% for amikacin, moxifloxacin, ofloxacin and capreomycin, respectively, as well as increasing the area under the curve for amikacin and capreomycin. Results provided a comprehensive comparison of various techniques and confirmed the applicability of machine learning for better resistance prediction on large, diverse tuberculosis datasets. Furthermore, mutation ranking showed the possibility of finding new resistance/susceptible markers. AVAILABILITY AND IMPLEMENTATION: The source code can be found at http://www.robots.ox.ac.uk/~davidc/code.php. SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
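The pipeline the abstract describes (a linear dimension-reduction step such as non-negative matrix factorization feeding a classifier such as logistic regression) can be sketched roughly as follows. Everything here is simulated: the mutation matrix, the two "marker" mutations, and the sizes are invented stand-ins for illustration, not the study's 13 402-isolate dataset or code.

```python
# Hedged sketch: NMF dimension reduction feeding logistic regression to
# predict drug resistance from a binary mutation matrix (simulated data).
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_isolates, n_mutations = 400, 50
X = rng.integers(0, 2, size=(n_isolates, n_mutations)).astype(float)
# resistance driven by two simulated "marker" mutations plus noise
logits = 2.5 * X[:, 0] + 2.0 * X[:, 1] - 1.5
y = (rng.random(n_isolates) < 1 / (1 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(NMF(n_components=10, init="nndsvd", max_iter=500),
                      LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC on held-out isolates: {auc:.2f}")
```

Swapping the classifier for gradient tree boosting, or NMF for sparse PCA, follows the same pattern.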

Quan TP, Hope R, Clarke T, Moroney R, Butcher L, Knight P, Crook D, Hopkins S, Peto TEA, Johnson AP, Walker AS. 2018. Using linked electronic health records to report healthcare-associated infections. PLoS One, 13 (11), pp. e0206860.

BACKGROUND: Reporting of strategic healthcare-associated infections (HCAIs) to Public Health England is mandatory for all acute hospital trusts in England, via a web-based HCAI Data Capture System (HCAI-DCS). AIM: To investigate the feasibility of automating the current, manual, HCAI reporting using linked electronic health records (linked-EHR), and to assess its level of accuracy. METHODS: All data previously submitted through the HCAI-DCS by the Oxford University Hospitals infection control (IC) team for methicillin-resistant and methicillin-susceptible Staphylococcus aureus (MRSA, MSSA), Clostridium difficile, and Escherichia coli through March 2017 were downloaded and compared to outputs created from linked-EHR, with detailed comparisons between 2013 and 2017. FINDINGS: Total MRSA, MSSA, E. coli and C. difficile cases entered by the IC team vs linked-EHR were 428 vs 432, 795 vs 816, 2454 vs 2450 and 3365 vs 3393, respectively. From 2013-2017, most discrepancies (32/37 (86%)) were likely due to IC recording errors. Patient and specimen identifiers were completed for >98% of cases by both methods, with very high agreement (>97%). Fields relating to the patient at the time the specimen was taken were complete to a similarly high level (>99% IC, >97% linked-EHR), and agreement was fairly good (>80%) except for the main and treatment specialties (57% and 54% respectively) and the patient category (55%). Optional, organism-specific data-fields were less complete, by both methods. Where comparisons were possible, agreement was reasonably high (mostly 70-90%). CONCLUSION: Basic factual information, such as demographic data, is almost certainly better automated, and many other data fields can potentially be populated successfully from linked-EHR. Manual data collection is time-consuming and inefficient; automated electronic data collection would leave healthcare professionals free to focus on clinical rather than administrative work.

Thwaites GE, Scarborough M, Szubert A, Saramago Goncalves P, Soares M, Bostock J, Nsutebu E, Tilley R, Cunningham R, Greig J et al. 2018. Adjunctive rifampicin to reduce early mortality from Staphylococcus aureus bacteraemia: the ARREST RCT. Health Technol Assess, 22 (59), pp. 1-148.

BACKGROUND: Staphylococcus aureus bacteraemia is a common and frequently fatal infection. Adjunctive rifampicin may enhance early S. aureus killing, sterilise infected foci and blood faster, and thereby reduce the risk of dissemination, metastatic infection and death. OBJECTIVES: To determine whether or not adjunctive rifampicin reduces bacteriological (microbiologically confirmed) failure/recurrence or death through 12 weeks from randomisation. Secondary objectives included evaluating the impact of rifampicin on all-cause mortality, clinically defined failure/recurrence or death, toxicity, resistance emergence, and duration of bacteraemia; and assessing the cost-effectiveness of rifampicin. DESIGN: Parallel-group, randomised (1 : 1), blinded, placebo-controlled multicentre trial. SETTING: UK NHS trust hospitals. PARTICIPANTS: Adult inpatients (≥ 18 years) with meticillin-resistant or susceptible S. aureus grown from one or more blood cultures, who had received < 96 hours of antibiotic therapy for the current infection, and without contraindications to rifampicin. INTERVENTIONS: Adjunctive rifampicin (600-900 mg/day, oral or intravenous) or placebo for 14 days in addition to standard antibiotic therapy. Investigators and patients were blinded to trial treatment. Follow-up was for 12 weeks (assessments at 3, 7, 10 and 14 days, weekly until discharge and final assessment at 12 weeks post randomisation). MAIN OUTCOME MEASURES: The primary outcome was all-cause bacteriological (microbiologically confirmed) failure/recurrence or death through 12 weeks from randomisation. RESULTS: Between December 2012 and October 2016, 758 eligible participants from 29 UK hospitals were randomised: 370 to rifampicin and 388 to placebo. The median age was 65 years [interquartile range (IQR) 50-76 years]. A total of 485 (64.0%) infections were community acquired and 132 (17.4%) were nosocomial; 47 (6.2%) were caused by meticillin-resistant S. aureus. 
A total of 301 (39.7%) participants had an initial deep infection focus. Standard antibiotics were given for a median of 29 days (IQR 18-45 days) and 619 (81.7%) participants received flucloxacillin. By 12 weeks, 62 out of 370 (16.8%) patients taking rifampicin versus 71 out of 388 (18.3%) participants taking the placebo experienced bacteriological (microbiologically confirmed) failure/recurrence or died [absolute risk difference -1.4%, 95% confidence interval (CI) -7.0% to 4.3%; hazard ratio 0.96, 95% CI 0.68 to 1.35; p = 0.81]. There were 4 (1.1%) and 5 (1.3%) bacteriological failures (p = 0.82) in the rifampicin and placebo groups, respectively. There were 3 (0.8%) versus 16 (4.1%) bacteriological recurrences (p = 0.01), and 55 (14.9%) versus 50 (12.9%) deaths without bacteriological failure/recurrence (p = 0.30) in the rifampicin and placebo groups, respectively. Over 12 weeks, there was no evidence of differences in clinically defined failure/recurrence/death (p = 0.84), all-cause mortality (p = 0.60), serious (p = 0.17) or grade 3/4 (p = 0.36) adverse events (AEs). However, 63 (17.0%) participants in the rifampicin group versus 39 (10.1%) participants in the placebo group experienced antibiotic or trial drug-modifying AEs (p = 0.004), and 24 (6.5%) participants in the rifampicin group versus 6 (1.5%) participants in the placebo group experienced drug-interactions (p = 0.0005). Evaluation of the costs and health-related quality-of-life impacts revealed that an episode of S. aureus bacteraemia costs an average of £12,197 over 12 weeks. Rifampicin was estimated to save 10% of episode costs (p = 0.14). After adjustment, the effect of rifampicin on total quality-adjusted life-years (QALYs) was positive (0.004 QALYs), but not statistically significant (standard error 0.004 QALYs). CONCLUSIONS: Adjunctive rifampicin provided no overall benefit over standard antibiotic therapy in adults with S. aureus bacteraemia. 
FUTURE WORK: Given the substantial mortality, other antibiotic combinations or improved source management should be investigated. TRIAL REGISTRATIONS: Current Controlled Trials ISRCTN37666216, EudraCT 2012-000344-10 and Clinical Trials Authorisation 00316/0243/001. FUNDING: This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 22, No. 59. See the NIHR Journals Library website for further project information.

Vihta K-D, Stoesser N, Llewelyn MJ, Quan TP, Davies T, Fawcett NJ, Dunn L, Jeffery K, Butler CC, Hayward G et al. 2018. Trends over time in Escherichia coli bloodstream infections, urinary tract infections, and antibiotic susceptibilities in Oxfordshire, UK, 1998-2016: a study of electronic health records. Lancet Infect Dis, 18 (10), pp. 1138-1149.

BACKGROUND: Escherichia coli bloodstream infections are increasing in the UK and internationally. The evidence base to guide interventions against this major public health concern is small. We aimed to investigate possible drivers of changes in the incidence of E coli bloodstream infection and antibiotic susceptibilities in Oxfordshire, UK, over the past two decades, while stratifying for time since hospital exposure. METHODS: In this observational study, we used all available data on E coli bloodstream infections and E coli urinary tract infections (UTIs) from one UK region (Oxfordshire) using anonymised linked microbiological data and hospital electronic health records from the Infections in Oxfordshire Research Database (IORD). We estimated the incidence of infections across a two-decade period and the annual incidence rate ratio (aIRR) in 2016. We modelled the data using negative binomial regression on the basis of microbiological, clinical, and health-care-exposure risk factors. We investigated infection severity, 30-day all-cause mortality, and community and hospital amoxicillin plus clavulanic acid (co-amoxiclav) use to estimate changes in bacterial virulence and the effect of antimicrobial resistance on incidence. FINDINGS: From Jan 1, 1998, to Dec 31, 2016, 5706 E coli bloodstream infections occurred in 5215 patients, and 228 376 E coli UTIs occurred in 137 075 patients. 1365 (24%) E coli bloodstream infections were nosocomial (onset >48 h after hospital admission), 1132 (20%) were quasi-nosocomial (≤30 days after discharge), 1346 (24%) were quasi-community (31-365 days after discharge), and 1863 (33%) were community (>365 days after hospital discharge). The overall incidence increased year on year (aIRR 1·06, 95% CI 1·05-1·06). In 2016, 212 (41%) of 515 E coli bloodstream infections and 3921 (28%) of 13 792 E coli UTIs were co-amoxiclav resistant.
Increases in E coli bloodstream infections were driven by increases in community (aIRR 1·10, 95% CI 1·07-1·13; p<0·0001) and quasi-community (aIRR 1·08, 1·07-1·10; p<0·0001) cases. 30-day mortality associated with E coli bloodstream infection decreased over time in the nosocomial (adjusted rate ratio [RR] 0·98, 95% CI 0·96-1·00; p=0·03) group, and remained stable in the quasi-nosocomial (adjusted RR 0·98, 0·95-1·00; p=0·06), quasi-community (adjusted RR 0·99, 0·96-1·01; p=0·32), and community (adjusted RR 0·99, 0·96-1·01; p=0·21) groups. Mortality was, however, substantial at 14-25% across all hospital-exposure groups. Co-amoxiclav-resistant E coli bloodstream infections increased in all groups across the study period (by 11-18% per year, significantly faster than co-amoxiclav-susceptible E coli bloodstream infections; p-heterogeneity<0·0001), as did co-amoxiclav-resistant E coli UTIs (by 14-29% per year; p-heterogeneity<0·0001). Previous year co-amoxiclav use in primary-care facilities was associated with increased subsequent year community co-amoxiclav-resistant E coli UTIs (p=0·003). INTERPRETATION: Increases in E coli bloodstream infections in Oxfordshire are primarily community associated, with substantial co-amoxiclav resistance; nevertheless, we found little or no change in mortality. Focusing interventions on primary care facilities, particularly those with high co-amoxiclav use, could be effective in reducing the incidence of co-amoxiclav-resistant E coli bloodstream infections, in this region and more generally. FUNDING: National Institute for Health Research.

Eyre DW, Sheppard AE, Madder H, Moir I, Moroney R, Quan TP, Griffiths D, George S, Butcher L, Morgan M et al. 2018. A Candida auris Outbreak and Its Control in an Intensive Care Setting. N Engl J Med, 379 (14), pp. 1322-1331.

BACKGROUND: Candida auris is an emerging and multidrug-resistant pathogen. Here we report the epidemiology of a hospital outbreak of C. auris colonization and infection. METHODS: After identification of a cluster of C. auris infections in the neurosciences intensive care unit (ICU) of the Oxford University Hospitals, United Kingdom, we instituted an intensive patient and environmental screening program and package of interventions. Multivariable logistic regression was used to identify predictors of C. auris colonization and infection. Isolates from patients and from the environment were analyzed by whole-genome sequencing. RESULTS: A total of 70 patients were identified as being colonized or infected with C. auris between February 2, 2015, and August 31, 2017; of these patients, 66 (94%) had been admitted to the neurosciences ICU before diagnosis. Invasive C. auris infections developed in 7 patients. When length of stay in the neurosciences ICU and patient vital signs and laboratory results were controlled for, the predictors of C. auris colonization or infection included the use of reusable skin-surface axillary temperature probes (multivariable odds ratio, 6.80; 95% confidence interval [CI], 2.96 to 15.63; P<0.001) and systemic fluconazole exposure (multivariable odds ratio, 10.34; 95% CI, 1.64 to 65.18; P=0.01). C. auris was rarely detected in the general environment. However, it was detected in isolates from reusable equipment, including multiple axillary skin-surface temperature probes. Despite a bundle of infection-control interventions, the incidence of new cases was reduced only after removal of the temperature probes. All outbreak sequences formed a single genetic cluster within the C. auris South African clade. The sequenced isolates from reusable equipment were genetically related to isolates from the patients. CONCLUSIONS: The transmission of C. 
auris in this hospital outbreak was found to be linked to reusable axillary temperature probes, indicating that this emerging pathogen can persist in the environment and be transmitted in health care settings. (Funded by the National Institute for Health Research Health Protection Research Unit in Healthcare Associated Infections and Antimicrobial Resistance at Oxford University and others.).

CRyPTIC Consortium and the 100,000 Genomes Project, Allix-Béguec C, Arandjelovic I, Bi L, Beckert P, Bonnet M, Bradley P, Cabibbe AM, Cancino-Muñoz I, Caulfield MJ et al. 2018. Prediction of Susceptibility to First-Line Tuberculosis Drugs by DNA Sequencing. N Engl J Med, 379 (15), pp. 1403-1415.

BACKGROUND: The World Health Organization recommends drug-susceptibility testing of Mycobacterium tuberculosis complex for all patients with tuberculosis to guide treatment decisions and improve outcomes. Whether DNA sequencing can be used to accurately predict profiles of susceptibility to first-line antituberculosis drugs has not been clear. METHODS: We obtained whole-genome sequences and associated phenotypes of resistance or susceptibility to the first-line antituberculosis drugs isoniazid, rifampin, ethambutol, and pyrazinamide for isolates from 16 countries across six continents. For each isolate, mutations associated with drug resistance and drug susceptibility were identified across nine genes, and individual phenotypes were predicted unless mutations of unknown association were also present. To identify how whole-genome sequencing might direct first-line drug therapy, complete susceptibility profiles were predicted. These profiles were predicted to be susceptible to all four drugs (i.e., pansusceptible) if they were predicted to be susceptible to isoniazid and to the other drugs or if they contained mutations of unknown association in genes that affect susceptibility to the other drugs. We simulated the way in which the negative predictive value changed with the prevalence of drug resistance. RESULTS: A total of 10,209 isolates were analyzed. The largest proportion of phenotypes was predicted for rifampin (9660 [95.4%] of 10,130) and the smallest was predicted for ethambutol (8794 [89.8%] of 9794). Resistance to isoniazid, rifampin, ethambutol, and pyrazinamide was correctly predicted with 97.1%, 97.5%, 94.6%, and 91.3% sensitivity, respectively, and susceptibility to these drugs was correctly predicted with 99.0%, 98.8%, 93.6%, and 96.8% specificity. Of the 7516 isolates with complete phenotypic drug-susceptibility profiles, 5865 (78.0%) had complete genotypic predictions, among which 5250 profiles (89.5%) were correctly predicted. 
Among the 4037 phenotypic profiles that were predicted to be pansusceptible, 3952 (97.9%) were correctly predicted. CONCLUSIONS: Genotypic predictions of the susceptibility of M. tuberculosis to first-line drugs were found to be correlated with phenotypic susceptibility to these drugs. (Funded by the Bill and Melinda Gates Foundation and others.).
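As a reminder of how the reported accuracy figures are defined, sensitivity and specificity reduce to simple ratios over the phenotypic truth. The counts below are invented round numbers for illustration, not the paper's raw data.

```python
# Sensitivity: correctly predicted resistant / all phenotypically resistant.
# Specificity: correctly predicted susceptible / all phenotypically susceptible.
def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

# e.g. a method calling 971 of 1000 truly resistant isolates resistant,
# and 990 of 1000 truly susceptible isolates susceptible (invented counts)
print(round(sensitivity(971, 29), 3))   # 0.971
print(round(specificity(990, 10), 3))   # 0.99
```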

Decraene V, Phan HTT, George R, Wyllie DH, Akinremi O, Aiken Z, Cleary P, Dodgson A, Pankhurst L, Crook DW et al. 2018. A Large, Refractory Nosocomial Outbreak of Klebsiella pneumoniae Carbapenemase-Producing Escherichia coli Demonstrates Carbapenemase Gene Outbreaks Involving Sink Sites Require Novel Approaches to Infection Control. Antimicrob Agents Chemother, 62 (12).

Carbapenem-resistant Enterobacteriaceae (CRE) represent a health threat, but effective control interventions remain unclear. Hospital wastewater sites are increasingly being highlighted as important potential reservoirs. We investigated a large Klebsiella pneumoniae carbapenemase (KPC)-producing Escherichia coli outbreak and wider CRE incidence trends in the Central Manchester University Hospital NHS Foundation Trust (CMFT) (United Kingdom) over 8 years, to determine the impact of infection prevention and control measures. Bacteriology and patient administration data (2009 to 2017) were linked, and a subset of CMFT or regional hospital KPC-producing E. coli isolates (n = 268) were sequenced. Control interventions followed international guidelines and included cohorting, rectal screening (n = 184,539 screens), environmental sampling, enhanced cleaning, and ward closure and plumbing replacement. Segmented regression of time trends for CRE detections was used to evaluate the impact of interventions on CRE incidence. Genomic analysis (n = 268 isolates) identified the spread of a KPC-producing E. coli outbreak clone (strain A, sequence type 216 [ST216]; n = 125) among patients and in the environment, particularly on 2 cardiac wards (wards 3 and 4), despite control measures. ST216 strain A had caused an antecedent outbreak and shared its KPC plasmids with other E. coli lineages and Enterobacteriaceae species. CRE acquisition incidence declined after closure of wards 3 and 4 and plumbing replacement, suggesting an environmental contribution. However, ward 3/ward 4 wastewater sites were rapidly recolonized with CRE and patient CRE acquisitions recurred, albeit at lower rates. Patient relocation and plumbing replacement were associated with control of a clonal KPC-producing E. coli outbreak; however, environmental contamination with CRE and patient CRE acquisitions recurred rapidly following this intervention. The large numbers of cases and the persistence of blaKPC in E. 
coli, including pathogenic lineages, are of concern.

Wyllie DH, Robinson E, Peto T, Crook DW, Ajileye A, Rathod P, Allen R, Jarrett L, Smith EG, Walker AS. 2018. Identifying Mixed Mycobacterium tuberculosis Infection and Laboratory Cross-Contamination during Mycobacterial Sequencing Programs. J Clin Microbiol, 56 (11).

The detection of laboratory cross-contamination and mixed tuberculosis infections is an important goal of clinical mycobacteriology laboratories. The objective of this study was to develop a method to detect mixtures of different Mycobacterium tuberculosis lineages in laboratories performing mycobacterial next-generation sequencing (NGS). The setting was the Public Health England National Mycobacteriology Laboratory Birmingham, which performs Illumina sequencing on DNA extracted from positive mycobacterial growth indicator tubes. We analyzed 4,156 samples yielding M. tuberculosis from 663 MiSeq runs, which were obtained during development and production use of a diagnostic process using NGS. The counts of the most common (major) variant and all other variants (nonmajor variants) were determined from reads mapping to positions defining M. tuberculosis lineages. Expected variation was estimated during process development. For each sample, we determined the nonmajor variant proportions at 55 sets of lineage-defining positions. The nonmajor variant proportion in the two most mixed lineage-defining sets (F2 metric) was compared with that of the 47 least-mixed lineage-defining sets (F47 metric). The following three patterns were observed: (i) not mixed by either metric; (ii) high F47 metric, suggesting mixtures of multiple lineages; and (iii) samples compatible with mixtures of two lineages, detected by differential F2 metric elevations relative to F47. Pattern ii was observed in batches, with similar patterns in the M. tuberculosis H37Rv control present in each run, and is likely to reflect cross-contamination. During production, the proportions of samples in the patterns were 97%, 2.8%, and 0.001%, respectively. The F2 and F47 metrics described could be used for laboratory process control in laboratories sequencing M. tuberculosis genomes.
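The F2/F47 comparison can be sketched as follows: compute the non-major variant proportion per lineage-defining set, then contrast the two most-mixed sets with the rest. The set count, read depths, mixture proportion, and decision threshold below are toy values, not the paper's.

```python
# Toy illustration of the F2-vs-rest mixture metric on simulated read counts.
import numpy as np

rng = np.random.default_rng(3)
n_sets, depth = 49, 100
# background sequencing error at most lineage-defining sets...
nonmajor = rng.binomial(depth, 0.005, size=n_sets) / depth
# ...but elevated non-major proportions at the two sets defining the
# second lineage of a simulated two-lineage mixture
nonmajor[:2] = rng.binomial(depth, 0.2, size=2) / depth

ranked = np.sort(nonmajor)[::-1]
f2 = ranked[:2].mean()      # mean over the two most-mixed sets
f_rest = ranked[2:].mean()  # mean over the remaining, least-mixed sets
print(f"F2={f2:.3f}, rest={f_rest:.3f}, two-lineage mix={f2 > 10 * f_rest}")
```

A differential elevation of F2 over the remaining sets is the pattern the paper associates with a two-lineage mixture; a uniform elevation across many sets instead suggests cross-contamination.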

Nambiar K, Seifert H, Rieg S, Kern WV, Scarborough M, Gordon NC, Kim HB, Song K-H, Tilley R, Gott H et al. 2018. Survival following Staphylococcus aureus bloodstream infection: A prospective multinational cohort study assessing the impact of place of care. J Infect, 77 (6), pp. 516-525.

BACKGROUND: Staphylococcus aureus bloodstream infection (SAB) is a common, life-threatening infection with a high mortality. Survival can be improved by implementing quality of care bundles in hospitals. We previously observed marked differences in mortality between hospitals and now assessed whether mortality could serve as a valid and easy to implement quality of care outcome measure. METHODS: We conducted a prospective observational study between January 2013 and April 2015 on consecutive, adult patients with SAB from 11 tertiary care centers in Germany, South Korea, Spain, Taiwan, and the United Kingdom. Factors associated with mortality at 90 days were analyzed by Cox proportional hazards regression and flexible parametric models. RESULTS: 1851 patients with a median age of 66 years (64% male) were analyzed. Crude 90-day mortality differed significantly between hospitals (range 23-39%). Significant variation between centers was observed for methicillin-resistant S. aureus, community-acquisition, infective foci, as well as measures of comorbidities, and severity of disease. In multivariable analysis, factors independently associated with mortality at 90 days were age, nosocomial acquisition, unknown infective focus, pneumonia, Charlson comorbidity index, SOFA score, and study center. The risk of death varied over time differently for each infective focus. Crude mortality differed markedly from adjusted mortality. DISCUSSION: We observed significant differences in adjusted mortality between hospitals, suggesting differences in quality of care. However, mortality is strongly influenced by patient mix and thus, crude mortality is not a suitable quality indicator.

Thwaites GE, Szubert A, Walker AS. 2018. Rifampicin in treating S aureus bacteraemia - Authors' reply. Lancet, 392 (10147), pp. 555-556.

Fitzgerald FC, Lhomme E, Harris K, Kenny J, Doyle R, Kityo C, Shaw LP, Abongomera G, Musiime V, Cook A et al. 2019. Microbial Translocation Does Not Drive Immune Activation in Ugandan Children Infected With HIV. J Infect Dis, 219 (1), pp. 89-100.

Objective: Immune activation is associated with morbidity and mortality during human immunodeficiency virus (HIV) infection, despite receipt of antiretroviral therapy (ART). We investigated whether microbial translocation drives immune activation in HIV-infected Ugandan children. Methods: Nineteen markers of immune activation and inflammation were measured over 96 weeks in HIV-infected Ugandan children in the CHAPAS-3 Trial and HIV-uninfected age-matched controls. Microbial translocation was assessed using molecular techniques, including next-generation sequencing. Results: Of 249 children included, 142 were infected with HIV; of these, 120 were ART naive, with a median age of 2.8 years (interquartile range [IQR], 1.7-4.0 years) and a median baseline CD4+ T-cell percentage of 20% (IQR, 14%-24%), and 22 were ART experienced, with a median age of 6.5 years (IQR, 5.9-9.2 years) and a median baseline CD4+ T-cell percentage of 35% (IQR, 31%-39%). The control group comprised 107 children without HIV infection. The median increase in the CD4+ T-cell percentage was 17 percentage points (IQR, 12-22 percentage points) at week 96 among ART-naive children, and the viral load was <100 copies/mL in 76% of ART-naive children and 91% of ART-experienced children. Immune activation decreased with ART use. Children could be divided on the basis of immune activation markers into the following 3 clusters: in cluster 1, the majority of children were HIV uninfected; cluster 2 comprised a mix of HIV-uninfected children and HIV-infected ART-naive or ART-experienced children; and in cluster 3, the majority were ART naive. Immune activation was low in cluster 1, decreased in cluster 3, and persisted in cluster 2. Blood microbial DNA levels were negative or very low across groups, with no difference between clusters except for Enterobacteriaceae organisms (the level was higher in cluster 1; P < .0001). 
Conclusion: Immune activation decreased with ART use, with marker clustering indicating different activation patterns according to HIV and ART status. Levels of bacterial DNA in blood were low regardless of HIV status, ART status, and immune activation status. Microbial translocation did not drive immune activation in this setting. Clinical Trials Registration: ISRCTN69078957.

Thompson JA, Kityo C, Dunn D, Hoppe A, Ndashimye E, Hakim J, Kambugu A, van Oosterhout JJ, Arribas J, Mugyenyi P et al. 2019. Evolution of Protease Inhibitor Resistance in Human Immunodeficiency Virus Type 1 Infected Patients Failing Protease Inhibitor Monotherapy as Second-line Therapy in Low-income Countries: An Observational Analysis Within the EARNEST Randomized Trial. Clin Infect Dis, 68 (7), pp. 1184-1192.

BACKGROUND: Limited viral load (VL) testing in human immunodeficiency virus (HIV) treatment programs in low-income countries often delays detection of treatment failure. The impact of remaining on failing protease inhibitor (PI)-containing regimens is unclear. METHODS: We retrospectively tested VL in 2164 stored plasma samples from 386 patients randomized to receive lopinavir monotherapy (after initial raltegravir induction) in the Europe-Africa Research Network for Evaluation of Second-line Therapy (EARNEST) trial. Protease genotypic resistance testing was performed when VL >1000 copies/mL. We assessed evolution of PI resistance mutations from virological failure (confirmed VL >1000 copies/mL) until PI monotherapy discontinuation and examined associations using mixed-effects models. RESULTS: Median post-failure follow-up (in 118 patients) was 68 (interquartile range, 48-88) weeks. At failure, 20% had intermediate/high-level resistance to lopinavir. At 40-48 weeks post-failure, 68% and 51% had intermediate/high-level resistance to lopinavir and atazanavir; 17% had intermediate-level resistance (none high) to darunavir. Common PI mutations were M46I, I54V, and V82A. On average, 1.7 (95% confidence interval 1.5-2.0) PI mutations developed per year; increasing after the first mutation; decreasing with subsequent mutations (P < .0001). VL changes were modest, mainly driven by nonadherence (P = .006) and PI mutation development (P = .0002); I47A was associated with a larger increase in VL than other mutations (P = .05). CONCLUSIONS: Most patients develop intermediate/high-level lopinavir resistance within 1 year of ongoing viral replication on monotherapy but retain susceptibility to darunavir. Viral load increased slowly after failure, driven by non-adherence and PI mutation development. CLINICAL TRIALS REGISTRATION: NCT00988039.

Roope LSJ, Tonkin-Crine S, Butler CC, Crook D, Peto T, Peters M, Walker AS, Wordsworth S. 2018. Reducing demand for antibiotic prescriptions: evidence from an online survey of the general public on the interaction between preferences, beliefs and information, United Kingdom, 2015. Euro Surveill, 23 (25), pp. 13-23.

BACKGROUND: Antimicrobial resistance (AMR), a major public health threat, is strongly associated with human antibiotic consumption. Influenza-like illnesses (ILI) account for substantial inappropriate antibiotic use; patient understanding and expectations probably play an important role. AIM: This study aimed to investigate what drives patient expectations of antibiotics for ILI and particularly whether AMR awareness, risk preferences (attitudes to taking risks with health) or time preferences (the extent to which people prioritise good health today over good health in the future) play a role. METHODS: In 2015, a representative online panel survey of 2,064 adults in the United Kingdom was asked about antibiotic use and effectiveness for ILI. Explanatory variables in multivariable regression included AMR awareness, risk and time preferences and covariates. RESULTS: The tendency not to prioritise immediate gain over later reward was independently strongly associated with greater awareness that antibiotics are inappropriate for ILI. Independently, believing antibiotics were effective for ILI and low AMR awareness significantly predicted reported antibiotic use. However, 272 (39%) of those with low AMR awareness said that the AMR information we provided would lead them to ask a doctor for antibiotics more often, significantly more than would do so less often, and in contrast to those with high AMR awareness (p < 0.0001). CONCLUSION: Information campaigns to reduce AMR may risk a paradoxical consequence of actually increasing public demand for antibiotics. Public antibiotic stewardship campaigns should be tested on a small scale before wider adoption.

Mason A, Foster D, Bradley P, Golubchik T, Doumith M, Gordon NC, Pichon B, Iqbal Z, Staves P, Crook D et al. 2018. Accuracy of Different Bioinformatics Methods in Detecting Antibiotic Resistance and Virulence Factors from Staphylococcus aureus Whole-Genome Sequences. J Clin Microbiol, 56 (9).

In principle, whole-genome sequencing (WGS) can predict phenotypic resistance directly from a genotype, replacing laboratory-based tests. However, the contribution of different bioinformatics methods to genotype-phenotype discrepancies has not been systematically explored to date. We compared three WGS-based bioinformatics methods (Genefinder [read based], Mykrobe [de Bruijn graph based], and Typewriter [BLAST based]) for predicting the presence/absence of 83 different resistance determinants and virulence genes and overall antimicrobial susceptibility in 1,379 Staphylococcus aureus isolates previously characterized by standard laboratory methods (disc diffusion, broth and/or agar dilution, and PCR). In total, 99.5% (113,830/114,457) of individual resistance-determinant/virulence gene predictions were identical between all three methods, with only 627 (0.5%) discordant predictions, demonstrating high overall agreement (Fleiss' kappa = 0.98, P < 0.0001). Discrepancies, where identified, occurred in only one of the three methods for all genes except the cassette recombinase ccrC(b). The genotypic antimicrobial susceptibility prediction matched the laboratory phenotype in 98.3% (14,224/14,464) of cases (2,720 [18.8%] resistant, 11,504 [79.5%] susceptible). There was greater disagreement between the laboratory phenotypes and the combined genotypic predictions (97 [0.7%] phenotypically susceptible, but all bioinformatics methods reported resistance; 89 [0.6%] phenotypically resistant, but all bioinformatics methods reported susceptibility) than within the three bioinformatics methods (54 [0.4%] cases: 16 phenotypically resistant, 38 phenotypically susceptible). However, in 36/54 (67%) cases, the consensus genotype matched the laboratory phenotype. In this study, the choice between these three specific bioinformatics methods to identify resistance determinants or other genes in S. aureus did not prove critical, with all demonstrating high concordance with each other and with phenotypic/molecular methods. However, each has some limitations; therefore, consensus methods provide some assurance.
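The consensus idea in the abstract above — taking the call agreed by the majority of the three bioinformatics methods for each resistance determinant — can be sketched as follows. This is an illustrative reconstruction, not the study's code; the `consensus_call` helper and the example gene call are hypothetical.

```python
from collections import Counter

def consensus_call(calls):
    """Majority vote across per-method presence/absence calls for one gene.

    `calls` maps method name -> bool (gene detected or not).
    Returns the call agreed by the majority of methods.
    """
    votes = Counter(calls.values())
    return votes.most_common(1)[0][0]

# Hypothetical predictions for one isolate and one resistance gene
calls = {"genefinder": True, "mykrobe": True, "typewriter": False}
print(consensus_call(calls))  # → True (two of three methods detect the gene)
```

With three voters a majority always exists, so the consensus is well defined even when one method disagrees, which matches the observation that most discrepancies occurred in only one of the three methods.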

Wyllie DH, Sanderson N, Myers R, Peto T, Robinson E, Crook DW, Smith EG, Walker AS. 2018. Control of Artifactual Variation in Reported Intersample Relatedness during Clinical Use of a Mycobacterium tuberculosis Sequencing Pipeline. J Clin Microbiol, 56 (8).

Contact tracing requires reliable identification of closely related bacterial isolates. When we noticed the reporting of artifactual variation between Mycobacterium tuberculosis isolates during routine next-generation sequencing of Mycobacterium spp., we investigated its basis in 2,018 consecutive M. tuberculosis isolates. In the routine process used, clinical samples were decontaminated and inoculated into broth cultures; from positive broth cultures DNA was extracted and sequenced, reads were mapped, and consensus sequences were determined. We investigated the process of consensus sequence determination, which selects the most common nucleotide at each position. Having determined the high-quality read depth and depth of minor variants across 8,006 M. tuberculosis genomic regions, we quantified the relationship between the minor variant depth and the amount of nonmycobacterial bacterial DNA, which originates from commensal microbes killed during sample decontamination. In the presence of nonmycobacterial bacterial DNA, we found significant increases in minor variant frequencies, of more than 1.5-fold, in 242 regions covering 5.1% of the M. tuberculosis genome. Included within these were four high-variation regions strongly influenced by the amount of nonmycobacterial bacterial DNA. Excluding these four regions from pairwise distance comparisons reduced biologically implausible variation from 5.2% to 0% in an independent validation set derived from 226 individuals. Thus, we demonstrated an approach identifying critical genomic regions contributing to clinically relevant artifactual variation in bacterial similarity searches. The approach described monitors the outputs of the complex multistep laboratory and bioinformatics process, allows periodic process adjustments, and will have application to quality control of routine bacterial genomics.
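Two steps of the pipeline described above — consensus-sequence determination (selecting the most common nucleotide at each position) and pairwise distance comparison after excluding high-variation regions — can be sketched as below. This is a minimal illustration with hypothetical helper names and toy inputs, not the production pipeline.

```python
def consensus_base(depths):
    """Most common nucleotide at one genomic position, given per-base read depths."""
    return max(depths, key=depths.get)

def snp_distance(seq_a, seq_b, masked=frozenset()):
    """Pairwise SNP distance, skipping positions inside excluded high-variation regions."""
    return sum(1 for i, (a, b) in enumerate(zip(seq_a, seq_b))
               if i not in masked and a != b)

print(consensus_base({"A": 38, "C": 4, "G": 1, "T": 0}))  # → A
print(snp_distance("ACGT", "ACTT"))                       # → 1
print(snp_distance("ACGT", "ACTT", masked={2}))           # → 0 (differing site masked)
```

Masking the identified regions before computing distances is what removed the biologically implausible variation in the study's validation set.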

Islam J, Ashiru-Oredope D, Budd E, Howard P, Walker AS, Hopkins S, Llewelyn MJ. 2018. A national quality incentive scheme to reduce antibiotic overuse in hospitals: evaluation of perceptions and impact. J Antimicrob Chemother, 73 (6), pp. 1708-1713.

Background: In 2016/2017, a financially linked antibiotic prescribing quality improvement initiative, Commissioning for Quality and Innovation (AMR-CQUIN), was introduced across acute hospitals in England. This aimed for >1% reductions in DDDs/1000 admissions of total antibiotics, piperacillin/tazobactam and carbapenems compared with 2013/2014, and improved review of empirical antibiotic prescriptions. Objectives: To assess perceptions of staff leading antimicrobial stewardship activity regarding the AMR-CQUIN, the investments made by hospitals to achieve it, and how these related to achieving reductions in antibiotic use. Methods: We invited antimicrobial stewardship leads at acute hospitals across England to complete a web-based survey. Antibiotic prescribing data were downloaded from the PHE Antimicrobial Resistance Local Indicators resource. Results: Responses were received from 116/155 (75%) acute hospitals. Owing to yearly increases in antibiotic use, most trusts needed to make >5% reductions in antibiotic consumption to achieve the AMR-CQUIN goal of a 1% reduction. Additional funding was made available at 23/113 (20%) trusts and, in 18 (78%), this was <10% of the AMR-CQUIN value. Nationally, the annual trend for increased antibiotic use reversed in 2016/2017. In 2014/2015, year-on-year changes were +3.7% (IQR -0.8%, +8.4%), +9.4% (+0.2%, +19.5%) and +5.8% (-6.2%, +18.2%) for total antibiotics, piperacillin/tazobactam and carbapenems, respectively, and +0.1% (-5.4%, +4.0%), -4.8% (-16.9%, +3.2%) and -8.0% (-20.2%, +4.0%) in 2016/2017. Hospitals where staff believed they could reduce antibiotic use were more likely to do so (P < 0.001). Conclusions: Introducing the AMR-CQUIN was associated with a reduction in antibiotic use. For individual hospitals, achieving the AMR-CQUIN was associated with favourable staff perceptions, not with the availability of funding.

Quartagno M, Walker AS, Carpenter JR, Phillips PP, Parmar MK. 2018. Rethinking non-inferiority: a practical trial design for optimising treatment duration. Clin Trials, 15 (5), pp. 477-488.

Background: Trials to identify the minimal effective treatment duration are needed in different therapeutic areas, including bacterial infections, tuberculosis and hepatitis C. However, standard non-inferiority designs have several limitations, including arbitrariness of non-inferiority margins, choice of research arms and very large sample sizes. Methods: We recast the problem of finding an appropriate non-inferior treatment duration in terms of modelling the entire duration-response curve within a pre-specified range. We propose a multi-arm randomised trial design, allocating patients to different treatment durations. We use fractional polynomials and spline-based methods to flexibly model the duration-response curve. We call this a 'Durations design'. We compare different methods in terms of a scaled version of the area between true and estimated prediction curves. We evaluate sensitivity to key design parameters, including sample size, number and position of arms. Results: A total sample size of ~500 patients divided into a moderate number of equidistant arms (5-7) is sufficient to estimate the duration-response curve within a 5% error margin in 95% of the simulations. Fractional polynomials provide similar or better results than spline-based methods in most scenarios. Conclusion: Our proposed practical randomised trial 'Durations design' shows promising performance in the estimation of the duration-response curve; subject to a pending careful investigation of its inferential properties, it provides a potential alternative to standard non-inferiority designs, avoiding many of their limitations, and yet being fairly robust to different possible duration-response curves. The trial outcome is the whole duration-response curve, which may be used by clinicians and policymakers to make informed decisions, facilitating a move away from a forced binary hypothesis testing paradigm.
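The curve-fitting idea behind the 'Durations design' — estimating the whole duration-response curve flexibly rather than testing a single non-inferiority margin — can be illustrated with a degree-2 fractional polynomial fit by least squares. This is a simplified sketch on made-up data; the paper's estimation and simulation framework are more involved, and `fit_fp2` is a hypothetical helper.

```python
import numpy as np

POWERS = [-2, -1, -0.5, 0, 0.5, 1, 2, 3]  # conventional fractional-polynomial powers

def fp_term(x, p):
    """One fractional-polynomial basis term; power 0 denotes log(x) by convention."""
    return np.log(x) if p == 0 else x ** p

def fit_fp2(x, y):
    """Fit the best degree-2 fractional polynomial by least squares,
    searching all (p1, p2) power pairs; repeated powers use the x^p * log(x) rule."""
    best = None
    for i, p1 in enumerate(POWERS):
        for p2 in POWERS[i:]:
            t2 = fp_term(x, p2) * (np.log(x) if p1 == p2 else 1.0)
            X = np.column_stack([np.ones_like(x), fp_term(x, p1), t2])
            beta = np.linalg.lstsq(X, y, rcond=None)[0]
            rss = np.sum((X @ beta - y) ** 2)
            if best is None or rss < best[0]:
                best = (rss, (p1, p2), beta)
    return best

# Hypothetical cure rates observed at five equidistant treatment durations (weeks)
duration = np.array([4.0, 6.0, 8.0, 10.0, 12.0])
cure_rate = np.array([0.62, 0.78, 0.86, 0.90, 0.92])
rss, powers, beta = fit_fp2(duration, cure_rate)
print(powers, round(rss, 6))
```

The fitted curve, rather than a binary non-inferior/inferior verdict, is the quantity reported to decision-makers.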

Kong LY, Eyre DW, Corbeil J, Raymond F, Walker AS, Wilcox MH, Crook DW, Michaud S, Toye B, Frost E et al. 2019. Clostridium difficile: Investigating Transmission Patterns Between Infected and Colonized Patients Using Whole Genome Sequencing. Clin Infect Dis, 68 (2), pp. 204-209.

Background: Whole genome sequencing (WGS) studies can enhance our understanding of the role of patients with asymptomatic Clostridium difficile colonization in transmission. Methods: Isolates obtained from patients with Clostridium difficile infection (CDI) and colonization identified in a study conducted during 2006-2007 at 6 Canadian hospitals underwent typing by pulsed-field gel electrophoresis, multilocus sequence typing, and WGS. Isolates from incident CDI cases not in the initial study were also sequenced where possible. Ward movement and typing data were combined to identify plausible donors for each CDI case, as defined by shared time and space within predefined limits. Proportions of plausible donors for CDI cases that were colonized, infected, or both were examined. Results: Five hundred fifty-four isolates were sequenced successfully, 353 from colonized patients and 201 from CDI cases. The NAP1/027/ST1 strain was the most common strain, found in 124 (62%) of infected and 92 (26%) of colonized patients. A donor with a plausible ward link was found for 81 CDI cases (40%) using WGS with a threshold of ≤2 single nucleotide polymorphisms to determine relatedness. Sixty-five (32%) CDI cases could be linked to both infected and colonized donors. Exclusive linkages to infected and colonized donors were found for 28 (14%) and 12 (6%) CDI cases, respectively. Conclusions: Colonized patients contribute to transmission, but CDI cases are more likely linked to other infected patients than colonized patients in this cohort with high rates of the NAP1/027/ST1 strain, highlighting the importance of local prevalence of virulent strains in determining transmission dynamics.
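The donor-identification rule used above — a prior patient counts as a plausible donor if their isolate is within 2 SNPs of the case's isolate and the two patients shared time and space on a ward — can be sketched as below. The data structures and names are hypothetical; the study's actual spatiotemporal limits were more detailed.

```python
SNP_THRESHOLD = 2  # relatedness cutoff used in the study

def plausible_donors(case, candidates, snp_dist, ward_link):
    """Candidates who plausibly transmitted to `case`: genetically related
    (<= 2 SNPs) AND sharing ward time/space (boolean link flag).

    `snp_dist` and `ward_link` are hypothetical lookups keyed by (case, candidate).
    """
    return [c for c in candidates
            if snp_dist[(case, c)] <= SNP_THRESHOLD and ward_link[(case, c)]]

# Toy example: three prior patients, only one satisfies both criteria
dist = {("case1", "donorA"): 1, ("case1", "donorB"): 0, ("case1", "donorC"): 14}
ward = {("case1", "donorA"): True, ("case1", "donorB"): False, ("case1", "donorC"): True}
print(plausible_donors("case1", ["donorA", "donorB", "donorC"], dist, ward))
# → ['donorA']: donorB is genetically identical but never shared a ward
```

Requiring both genetic and epidemiological links is what lets the study attribute each CDI case to infected versus colonized donors.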

Yang Y, Niehaus KE, Walker TM, Iqbal Z, Walker AS, Wilson DJ, Peto TEA, Crook DW, Smith EG, Zhu T, Clifton DA. 2018. Machine learning for classifying tuberculosis drug-resistance from DNA sequencing data. Bioinformatics, 34 (10), pp. 1666-1671.

Motivation: Correct and rapid determination of Mycobacterium tuberculosis (MTB) resistance against available tuberculosis (TB) drugs is essential for the control and management of TB. Conventional molecular diagnostic tests assume that the presence of any well-studied single nucleotide polymorphism is sufficient to cause resistance, which yields low sensitivity for resistance classification. Summary: Given the availability of DNA sequencing data from MTB, we developed machine learning models for a cohort of 1839 UK bacterial isolates to classify MTB resistance against eight anti-TB drugs (isoniazid, rifampicin, ethambutol, pyrazinamide, ciprofloxacin, moxifloxacin, ofloxacin, streptomycin) and to classify multi-drug resistance. Results: Compared to the previous rules-based approach, the sensitivities from the best-performing models increased by 2-4% to 97% for isoniazid, rifampicin and ethambutol (P < 0.01); for ciprofloxacin and multi-drug resistant TB, they increased to 96%. For moxifloxacin and ofloxacin, sensitivities increased by 12 and 15%, from 83 and 81% based on existing known resistance alleles to 95% and 96% (P < 0.01), respectively. In particular, our models improved sensitivities compared to the previous rules-based approach by 15 and 24%, to 84 and 87%, for pyrazinamide and streptomycin (P < 0.01), respectively. The best-performing models increased the area under the ROC curve by 10% for pyrazinamide and streptomycin (P < 0.01), and by 4-8% for the other drugs (P < 0.01). Availability and implementation: The details of source code are provided at http://www.robots.ox.ac.uk/~davidc/code.php. Contact: david.clifton@eng.ox.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online.
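The contrast drawn above — a rules-based call that flags resistance only when a known mutation is present, versus a model learned from the full variant set — can be sketched with a toy logistic regression. This is an illustrative stand-in (plain gradient descent on made-up data), not the paper's models, features, or cohort.

```python
import numpy as np

def rules_based(x, known_sites):
    """Resistant if any well-studied resistance mutation is present."""
    return x[known_sites].any()

def logistic_fit(X, y, lr=0.5, steps=2000):
    """Tiny logistic-regression fit by gradient descent (hypothetical stand-in
    for the paper's machine-learning models)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Toy mutation matrix: rows = isolates, columns = candidate sites.
# Site 0 is a "known" resistance mutation; site 2 also confers resistance,
# so a rules-based test watching only site 0 misses the last isolate.
X = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 0, 1],
              [0, 0, 1]], dtype=float)
y = np.array([1.0, 0.0, 1.0, 1.0])  # phenotypic resistance labels
w = logistic_fit(X, y)
ml_calls = (1.0 / (1.0 + np.exp(-(X @ w))) > 0.5)
print([bool(rules_based(row.astype(bool), [0])) for row in X])  # → [True, False, True, False]
print(ml_calls.tolist())                                        # → [True, False, True, True]
```

The learned weights pick up the unlisted resistance site, which is the mechanism behind the sensitivity gains reported for drugs such as pyrazinamide and streptomycin.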

Martin JSH, Eyre DW, Fawley WN, Griffiths D, Davies K, Mawer DPC, Peto TEA, Crook DW, Walker AS, Wilcox MH. 2018. Patient and Strain Characteristics Associated With Clostridium difficile Transmission and Adverse Outcomes. Clin Infect Dis, 67 (9), pp. 1379-1387.

Background: No study has used whole-genome sequencing (WGS) to investigate risk factors for Clostridium difficile (CD) transmission between cases, or assessed the impact of recent acquisition on patient outcome. Methods: This 20 month retrospective cohort study included consecutive cytotoxin-positive diarrheal samples, which underwent culture, ribotyping, and WGS (Illumina). Sequenced isolates were compared using single nucleotide variants (SNVs). Independent predictors of acquisition from another case, onward transmission, 120-day recurrence, and 30-day mortality were identified using logistic regression with backwards elimination. Results: Of 660 CD cases, 640 (97%) were sequenced, of which 567 (89%) shared a ribotype with a prior case, but only 227 (35%) were ≤2 SNVs from a prior case, supporting recent acquisition. Plausible (<2 SNVs) recent ward-based acquisition from a symptomatic case was more frequent in certain ribotypes; 64% (67/105) for ribotype-027 cases, compared with 11% (6/57) for ribotype-078. Independent risk factors (adjusted P < .05) for CD acquisition included older age, longer inpatient duration, and ribotype; these factors, and male sex, increased onward transmission. Patients with a plausible donor had a greater risk of recurrence (adjusted P = .001) and trended towards greater 30-day mortality (adjusted P = .06). Ribotype had no additional mortality or recurrence impact after adjusting for acquisition (P > .1). Conclusions: Greater transmission of certain lineages suggests CD may have different reservoirs and modes of transmission. Acquiring CD from a recent case is associated with poorer clinical outcomes. Clinical characteristics associated with increased healthcare-associated CD transmission could be used to target preventative interventions.

Mallewa J, Szubert AJ, Mugyenyi P, Chidziva E, Thomason MJ, Chepkorir P, Abongomera G, Baleeta K, Etyang A, Warambwa C et al. 2018. Effect of ready-to-use supplementary food on mortality in severely immunocompromised HIV-infected individuals in Africa initiating antiretroviral therapy (REALITY): an open-label, parallel-group, randomised controlled trial. Lancet HIV, 5 (5), pp. e231-e240.

BACKGROUND: In sub-Saharan Africa, severely immunocompromised HIV-infected individuals have a high risk of mortality during the first few months after starting antiretroviral therapy (ART). We hypothesise that universally providing ready-to-use supplementary food (RUSF) would increase early weight gain, thereby reducing early mortality compared with current guidelines recommending ready-to-use therapeutic food (RUTF) for severely malnourished individuals only. METHODS: We did a 2 × 2 × 2 factorial, open-label, parallel-group trial at inpatient and outpatient facilities in eight urban or periurban regional hospitals in Kenya, Malawi, Uganda, and Zimbabwe. Eligible participants were ART-naive adults and children aged at least 5 years with confirmed HIV infection and a CD4 cell count of fewer than 100 cells per μL, who were initiating ART at the facilities. We randomly assigned participants (1:1) to initiate ART either with (RUSF) or without (no-RUSF) 12 weeks of peanut-based RUSF containing 1000 kcal per day and micronutrients, given as two 92 g packets per day for adults and one packet (500 kcal per day) for children aged 5-12 years, regardless of nutritional status. In both groups, individuals received supplementation with RUTF only when severely malnourished (ie, body-mass index [BMI] <16-18 kg/m2 or BMI-for-age Z scores <-3 for children). We did the randomisation with computer-generated, sequentially numbered tables with different block sizes incorporated within an online database. Randomisation was stratified by centre, age, and the two other factorial randomisations, to 12-week adjunctive raltegravir and enhanced anti-infection prophylaxis (reported elsewhere). Clinic visits were scheduled at weeks 2, 4, 8, 12, 18, 24, 36, and 48, and included nurse assessment of vital status and symptoms and dispensing of all medication including ART and RUSF. The primary outcome was mortality at week 24, analysed by intention to treat.
Secondary outcomes included absolute changes in weight, BMI, and mid-upper-arm circumference (MUAC). Safety was analysed in all randomly assigned participants. Follow-up was 48 weeks. This trial is registered with ClinicalTrials.gov (NCT01825031) and the ISRCTN registry (43622374). FINDINGS: Between June 18, 2013, and April 10, 2015, we randomly assigned 1805 participants to treatment: 897 to RUSF and 908 to no-RUSF. 56 (3%) were lost to follow-up. 96 (10·9%, 95% CI 9·0-13·1) participants allocated to RUSF and 92 (10·3%, 8·5-12·5) to no-RUSF died within 24 weeks (hazard ratio 1·05, 95% CI 0·79-1·40; log-rank p=0·75), with no evidence of interaction with the other randomisations (both p>0·7). Through 48 weeks, adults and adolescents aged 13 years and older in the RUSF group had significantly greater gains in weight, BMI, and MUAC than the no-RUSF group (p=0·004, 0·004, and 0·03, respectively). The most common type of serious adverse event was specific infections, occurring in 90 (10%) of 897 participants assigned RUSF and 87 (10%) of 908 assigned no-RUSF. By week 48, 205 participants had serious adverse events in both groups (p=0·81), and 181 had grade 4 adverse events in the RUSF group compared with 172 in the no-RUSF group (p=0·45). INTERPRETATION: In severely immunocompromised HIV-infected individuals, providing RUSF universally at ART initiation, compared with providing RUTF to severely malnourished individuals only, improved short-term weight gain but not mortality. A change in policy to provide nutritional supplementation to all severely immunocompromised HIV-infected individuals starting ART is therefore not warranted at present. FUNDING: Joint Global Health Trials Scheme (UK Medical Research Council, UK Department for International Development, and Wellcome Trust).

Eyre DW, Davies KA, Davis G, Fawley WN, Dingle KE, De Maio N, Karas A, Crook DW, Peto TEA, Walker AS et al. 2018. Two Distinct Patterns of Clostridium difficile Diversity Across Europe Indicating Contrasting Routes of Spread. Clin Infect Dis, 67 (7), pp. 1035-1044.

Background: Rates of Clostridium difficile infection vary widely across Europe, as do prevalent ribotypes. The extent of Europe-wide diversity within each ribotype, however, is unknown. Methods: Inpatient diarrheal fecal samples submitted on a single day in summer and winter (2012-2013) to laboratories in 482 European hospitals were cultured for C. difficile, and isolates of the 10 most prevalent ribotypes were whole-genome sequenced. Within each ribotype, country-based sequence clustering was assessed using the ratio of the median number of single-nucleotide polymorphisms between isolates within versus across different countries, using permutation tests. Time-scaled Bayesian phylogenies were used to reconstruct the historical location of each lineage. Results: Sequenced isolates (n = 624) were from 19 countries. Five ribotypes had within-country clustering: ribotype 356, only in Italy; ribotype 018, predominantly in Italy; ribotype 176, with distinct Czech and German clades; ribotype 001/072, including distinct German, Slovakian, and Spanish clades; and ribotype 027, with multiple predominantly country-specific clades including in Hungary, Italy, Germany, Romania, and Poland. By contrast, we found no within-country clustering for ribotypes 078, 015, 002, 014, and 020, consistent with a Europe-wide distribution. Fluoroquinolone resistance was significantly more common in within-country clustered ribotypes (P = .009). Fluoroquinolone-resistant isolates were also more tightly clustered geographically, with a median (interquartile range) of 43 (0-213) miles between each isolate and the most closely genetically related isolate, versus 421 (204-680) miles in nonresistant pairs (P < .001). Conclusions: Two distinct patterns of C. difficile ribotype spread were observed, consistent with either predominantly healthcare-associated acquisition or Europe-wide dissemination via other routes/sources, for example, the food chain.
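The clustering statistic described above — the ratio of median pairwise SNP distance within versus across countries, assessed by permuting country labels — can be sketched as follows. This is a simplified toy version under hypothetical data; the study's permutation scheme and distance computation over real genomes were more involved.

```python
import itertools
import random
from statistics import median

def within_across_ratio(countries, dist):
    """Ratio of median pairwise SNP distance within vs across countries."""
    within, across = [], []
    for i, j in itertools.combinations(range(len(countries)), 2):
        (within if countries[i] == countries[j] else across).append(dist[i][j])
    return median(within) / median(across)

def permutation_test(countries, dist, n_perm=500, seed=7):
    """Shuffle country labels over isolates; count how often the shuffled
    within/across ratio is as small as (or smaller than) the observed one."""
    rng = random.Random(seed)
    obs = within_across_ratio(countries, dist)
    labels = list(countries)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(labels)
        if within_across_ratio(labels, dist) <= obs:
            hits += 1
    return obs, hits / n_perm

# Toy data: isolates 0-3 from country P, 4-7 from country Q; distances are
# small within a country and large across, i.e. strong country clustering.
countries = ["P"] * 4 + ["Q"] * 4
n = len(countries)
dist = [[0 if i == j else (3 if countries[i] == countries[j] else 300)
         for j in range(n)] for i in range(n)]
obs, p = permutation_test(countries, dist)
print(round(obs, 3), p)  # a ratio well below 1 indicates within-country clustering
```

A small ratio with a small permutation p-value corresponds to the "within-country clustered" ribotypes in the study; ratios near 1 correspond to the Europe-wide pattern.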

Siika A, McCabe L, Bwakura-Dangarembizi M, Kityo C, Mallewa J, Berkley J, Maitland K, Griffiths A, Baleeta K, Mudzingwa S et al. 2018. Late Presentation With HIV in Africa: Phenotypes, Risk, and Risk Stratification in the REALITY Trial. Clin Infect Dis, 66 (suppl_2), pp. S140-S146.

Background: Severely immunocompromised human immunodeficiency virus (HIV)-infected individuals have high mortality shortly after starting antiretroviral therapy (ART). We investigated predictors of early mortality and "late presenter" phenotypes. Methods: The Reduction of EArly MortaLITY (REALITY) trial enrolled ART-naive adults and children ≥5 years of age with CD4 counts <100 cells/µL initiating ART in Uganda, Zimbabwe, Malawi, and Kenya. Baseline predictors of mortality through 48 weeks were identified using Cox regression with backwards elimination (exit P > .1). Results: Among 1711 included participants, 203 (12%) died. Mortality was independently higher with older age; lower CD4 count, albumin, hemoglobin, and grip strength; presence of World Health Organization stage 3/4 weight loss, fever, or vomiting; and problems with mobility or self-care at baseline (all P < .04). Receiving enhanced antimicrobial prophylaxis independently reduced mortality (P = .02). Of five late-presenter phenotypes, Group 1 (n = 355) had highest mortality (25%; median CD4 count, 28 cells/µL), with high symptom burden, weight loss, poor mobility, and low albumin and hemoglobin. Group 2 (n = 394; 11% mortality; 43 cells/µL) also had weight loss, with high white cell, platelet, and neutrophil counts suggesting underlying inflammation/infection. Group 3 (n = 218; 10% mortality) had low CD4 counts (27 cells/µL), but low symptom burden and maintained fat mass. The remaining groups had 4%-6% mortality. Conclusions: Clinical and laboratory features identified groups with highest mortality following ART initiation. A screening tool could identify patients with low CD4 counts for prioritizing same-day ART initiation, enhanced prophylaxis, and intensive follow-up. Clinical Trials Registration: ISRCTN43622374.

Onakpoya IJ, Walker AS, Tan PS, Spencer EA, Gbinigie OA, Cook J, Llewelyn MJ, Butler CC. 2018. Overview of systematic reviews assessing the evidence for shorter versus longer duration antibiotic treatment for bacterial infections in secondary care. PLoS One, 13 (3), pp. e0194858.

Our objective was to assess the clinical effectiveness of shorter versus longer duration antibiotics for treatment of bacterial infections in adults and children in secondary care settings, using the evidence from published systematic reviews. We conducted electronic searches in MEDLINE, Embase, Cochrane, and CINAHL. Our primary outcome was clinical resolution. The quality of included reviews was assessed using the AMSTAR criteria, and the quality of the evidence was rated using the GRADE criteria. We included 6 systematic reviews (n = 3,162). Four reviews were rated high quality, and two were rated moderate quality. In adults, there was no difference between shorter versus longer duration in clinical resolution rates for peritonitis (RR 1.03, 95% CI 0.98 to 1.09, I2 = 0%), ventilator-associated pneumonia (RR 0.93; 95% CI 0.81 to 1.08, I2 = 24%), or acute pyelonephritis and septic UTI (clinical failure: RR 1.00, 95% CI 0.46 to 2.18). The quality of the evidence was very low to moderate. In children, there was no difference in clinical resolution rates for pneumonia (RR 0.98, 95% CI 0.91 to 1.04, I2 = 48%), pyelonephritis (RR 0.95, 95% CI 0.88 to 1.04) and confirmed bacterial meningitis (RR 1.02, 95% CI 0.93 to 1.11, I2 = 0%). The quality of the evidence was low to moderate. In conclusion, there is currently a limited body of evidence with which to clearly assess the clinical benefits of shorter versus longer duration antibiotics in secondary care. High quality trials assessing strategies to shorten antibiotic treatment duration for bacterial infections in secondary care settings should now be a priority.

Post FA, Szubert AJ, Prendergast AJ, Johnston V, Lyall H, Fitzgerald F, Musiime V, Musoro G, Chepkorir P, Agutu C et al. 2018. Causes and Timing of Mortality and Morbidity Among Late Presenters Starting Antiretroviral Therapy in the REALITY Trial. Clin Infect Dis, 66 (suppl_2), pp. S132-S139.

Background: In sub-Saharan Africa, 20%-25% of people starting antiretroviral therapy (ART) have severe immunosuppression; approximately 10% die within 3 months. In the Reduction of EArly mortaLITY (REALITY) randomized trial, a broad enhanced anti-infection prophylaxis bundle reduced mortality vs cotrimoxazole. We investigate the contribution and timing of different causes of mortality/morbidity. Methods: Participants started ART with a CD4 count <100 cells/µL; enhanced prophylaxis comprised cotrimoxazole plus 12 weeks of isoniazid + fluconazole, single-dose albendazole, and 5 days of azithromycin. A blinded committee adjudicated events and causes of death as (non-mutually exclusively) tuberculosis, cryptococcosis, severe bacterial infection (SBI), other potentially azithromycin-responsive infections, other events, and unknown. Results: Median pre-ART CD4 count was 37 cells/µL. Among 1805 participants, 225 (12.7%) died by week 48. Fatal/nonfatal events occurred early (median 4 weeks); rates then declined exponentially. One hundred fifty-four deaths had single and 71 had multiple causes, including tuberculosis in 4.5% of participants, cryptococcosis in 1.1%, SBI in 1.9%, other potentially azithromycin-responsive infections in 1.3%, other events in 3.6%, and unknown in 5.0%. Enhanced prophylaxis reduced deaths from cryptococcosis and unknown causes (P < .05) but not from tuberculosis, SBI, potentially azithromycin-responsive infections, or other causes (P > .3); and reduced nonfatal/fatal tuberculosis and cryptococcosis (P < .05), but not SBI, other potentially azithromycin-responsive infections, or other events (P > .2). Conclusions: Enhanced prophylaxis reduced mortality from cryptococcosis and unknown causes, and reduced nonfatal tuberculosis and cryptococcosis. The high early incidence of fatal/nonfatal events highlights the need to start enhanced prophylaxis together with ART in advanced disease. Clinical Trials Registration: ISRCTN43622374.

Fowler PW, Cole K, Gordon NC, Kearns AM, Llewelyn MJ, Peto TEA, Crook DW, Walker AS. 2018. Robust Prediction of Resistance to Trimethoprim in Staphylococcus aureus. Cell Chem Biol, 25 (3), pp. 339-349.e4.

The rise of antibiotic resistance threatens modern medicine; to combat it new diagnostic methods are required. Sequencing the whole genome of a pathogen offers the potential to accurately determine which antibiotics will be effective to treat a patient. A key limitation of this approach is that it cannot classify rare or previously unseen mutations. Here we demonstrate that alchemical free energy methods, a well-established class of methods from computational chemistry, can successfully predict whether mutations in Staphylococcus aureus dihydrofolate reductase confer resistance to trimethoprim. We also show that the method is quantitatively accurate by calculating how much the most common resistance-conferring mutation, F99Y, reduces the binding free energy of trimethoprim and comparing predicted and experimentally measured minimum inhibitory concentrations for seven different mutations. Finally, by considering up to 32 free energy calculations for each mutation, we estimate its specificity and sensitivity.

Dingle KE, Didelot X, Quan TP, Eyre DW, Stoesser N, Marwick CA, Coia J, Brown D, Buchanan S, Ijaz UZ et al. A role for tetracycline selection in the evolution of Clostridium difficile PCR-ribotype 078.

Farm animals have been identified as reservoirs of Clostridium difficile PCR-ribotype 078 (RT078). Since 2005, the incidence of human clinical cases (frequently severe) with this genotype has increased. We aimed to understand this change by studying the recent evolutionary history of RT078. Phylogenetic analysis of international genomes (isolates from 2006-2014) revealed several recent clonal expansions. A common ancestor of each expansion had independently acquired different alleles of the tetracycline resistance gene tetM. Consequently, an unusually high proportion of RT078 genomes were tetM positive (76.5%). Additional tetracycline resistance determinants were also identified, some for the first time in C. difficile (efflux pump tet40). Each tetM clonal expansion lacked geographic structure, indicating rapid international spread. Resistance determinants for C. difficile-infection-triggering antimicrobials, including fluoroquinolones and clindamycin, were comparatively rare in RT078. Tetracyclines are used intensively in agriculture; this selective pressure, plus rapid spread via the food chain, may explain the increased RT078 prevalence in humans.

Thwaites GE, Scarborough M, Szubert A, Nsutebu E, Tilley R, Greig J, Wyllie SA, Wilson P, Auckland C, Cairns J et al. 2018. Adjunctive rifampicin for Staphylococcus aureus bacteraemia (ARREST): a multicentre, randomised, double-blind, placebo-controlled trial. Lancet, 391 (10121), pp. 668-678.

BACKGROUND: Staphylococcus aureus bacteraemia is a common cause of severe community-acquired and hospital-acquired infection worldwide. We tested the hypothesis that adjunctive rifampicin would reduce bacteriologically confirmed treatment failure or disease recurrence, or death, by enhancing early S aureus killing, sterilising infected foci and blood faster, and reducing risks of dissemination and metastatic infection. METHODS: In this multicentre, randomised, double-blind, placebo-controlled trial, adults (≥18 years) with S aureus bacteraemia who had received ≤96 h of active antibiotic therapy were recruited from 29 UK hospitals. Patients were randomly assigned (1:1) via a computer-generated sequential randomisation list to receive 2 weeks of adjunctive rifampicin (600 mg or 900 mg per day according to weight, oral or intravenous) versus identical placebo, together with standard antibiotic therapy. Randomisation was stratified by centre. Patients, investigators, and those caring for the patients were masked to group allocation. The primary outcome was time to bacteriologically confirmed treatment failure or disease recurrence, or death (all-cause), from randomisation to 12 weeks, adjudicated by an independent review committee masked to the treatment. Analysis was intention to treat. This trial was registered, number ISRCTN37666216, and is closed to new participants. FINDINGS: Between Dec 10, 2012, and Oct 25, 2016, 758 eligible participants were randomly assigned: 370 to rifampicin and 388 to placebo. 485 (64%) participants had community-acquired S aureus infections, and 132 (17%) had nosocomial S aureus infections. 47 (6%) had meticillin-resistant infections. 301 (40%) participants had an initial deep infection focus. Standard antibiotics were given for 29 (IQR 18-45) days; 619 (82%) participants received flucloxacillin. 
By week 12, 62 (17%) of participants who received rifampicin versus 71 (18%) who received placebo experienced treatment failure or disease recurrence, or died (absolute risk difference -1·4%, 95% CI -7·0 to 4·3; hazard ratio 0·96, 0·68-1·35, p=0·81). From randomisation to 12 weeks, no evidence of a difference in serious (p=0·17) or grade 3-4 (p=0·36) adverse events was observed; however, 63 (17%) participants in the rifampicin group versus 39 (10%) in the placebo group had antibiotic or trial drug-modifying adverse events (p=0·004), and 24 (6%) versus six (2%) had drug interactions (p=0·0005). INTERPRETATION: Adjunctive rifampicin provided no overall benefit over standard antibiotic therapy in adults with S aureus bacteraemia. FUNDING: UK National Institute for Health Research Health Technology Assessment.
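As a statistical aside, the unadjusted absolute risk difference and its Wald 95% CI can be recomputed from the counts in the abstract. This is a sketch of mine, not the trial's analysis; the published figure (-1·4%, -7·0 to 4·3) came from a stratified analysis, so the crude values differ slightly.

```python
from math import sqrt

def risk_difference(events_a, n_a, events_b, n_b, z=1.96):
    """Unadjusted absolute risk difference with a Wald 95% CI."""
    p_a, p_b = events_a / n_a, events_b / n_b
    rd = p_a - p_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return rd, rd - z * se, rd + z * se

# Counts from the abstract: 62/370 (rifampicin) vs 71/388 (placebo)
rd, lo, hi = risk_difference(62, 370, 71, 388)
print(f"RD = {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```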

Pansa P, Hsia Y, Bielicki J, Lutsar I, Walker AS, Sharland M, Folgori L. 2018. Evaluating Safety Reporting in Paediatric Antibiotic Trials, 2000-2016: A Systematic Review and Meta-Analysis. Drugs, 78 (2), pp. 231-244.

BACKGROUND: There are very few options to treat multidrug-resistant bacterial infections in children. A major barrier is the duration and complexity of regulatory trials of new antibiotics. Extrapolation of safety data from adult trials could facilitate drug development for children. OBJECTIVE: We performed a systematic review on the safety of antibiotic clinical trials (CTs) in children (0-18 years) to evaluate the overall quality of safety trials conducted in children and to determine if age-specific adverse events (AEs) could be identified for specific antibiotic classes. DATA SOURCES: We searched the MEDLINE, Cochrane CENTRAL, and ClinicalTrials.gov electronic databases for trials conducted between 2000 and 2016. STUDY SELECTION: All trials in which safety was declared a primary or secondary endpoint were included. Exclusion criteria were (1) topical or inhalational route of administration; (2) non-infectious conditions; (3) administration for prophylaxis rather than treatment; (4) selected population (i.e. cystic fibrosis, malignancies, HIV and tuberculosis); and (5) design other than randomized controlled trials. Trials reporting data on both adults and children were included only if paediatric results were reported separately. DATA EXTRACTION AND SYNTHESIS: Two authors independently extracted the data. To assess the quality of published trials, the Extension for harms for Consolidated Standards of Reporting Trials (CONSORT) Statement 2004 was used. MAIN OUTCOME AND MEASURE: In order to quantitatively assess the rate of developing AEs by drug class, the numbers of overall and body-system-specific AEs were collected for each study arm, and then calculated per single drug class as median and interquartile range (IQR) of the proportions across CTs. The AEs most frequently reported were compared in the meta-analysis by selecting the CTs on the most represented drug classes. RESULTS: Eighty-three CTs were included, accounting for 27,693 children. 
Overall, 69.7% of CONSORT items were fully reported. The median proportion of children with any AE was 22.5%, but did not exceed 8% in any single body system. Serious drug-related AEs and drug-related discontinuations were very rare (median 0.3% and 0.9%, respectively). Limitations included the inability to stratify by age group, particularly neonates. CONCLUSIONS AND RELEVANCE: Overall, AEs in paediatric antibiotic CTs were predictable and class-specific, and no unexpected (age-specific) side effects were identified. Smaller, open-label, dose-finding, high-quality, single-arm pharmacokinetic trials seem potentially sufficient for certain common antibiotic classes, extrapolating well-established safety profiles determined from large adult efficacy trials. This approach could reduce duration and enhance subsequent registration of urgently needed new antibiotics. This will need to be combined with enhanced methods of pharmacovigilance for monitoring of emerging AEs in routine clinical practice.

Farmer RE, Kounali D, Walker AS, Savović J, Richards A, May MT, Ford D. 2018. Application of causal inference methods in the analyses of randomised controlled trials: a systematic review. Trials, 19 (1), pp. 23.

BACKGROUND: Applications of causal inference methods to randomised controlled trial (RCT) data have usually focused on adjusting for compliance with the randomised intervention rather than on using RCT data to address other, non-randomised questions. In this paper we review use of causal inference methods to assess the impact of aspects of patient management other than the randomised intervention in RCTs. METHODS: We identified papers that used causal inference methodology in RCT data from Medline, Premedline, Embase, Cochrane Library, and Web of Science from 1986 to September 2014, using a forward citation search of five seminal papers, and a keyword search. We did not include studies where inverse probability weighting was used solely to balance baseline characteristics, adjust for loss to follow-up or adjust for non-compliance to randomised treatment. Studies where the exposure could not be assigned were also excluded. RESULTS: There were 25 papers identified. Nearly half the papers (11/25) estimated the causal effect of concomitant medication on outcome. The remainder were concerned with post-randomisation treatment regimens (sequential treatments, n = 5), effects of treatment timing (n = 2) and treatment dosing or duration (n = 7). Examples were found in cardiovascular disease (n = 5), HIV (n = 7), cancer (n = 6), mental health (n = 4), paediatrics (n = 2) and transfusion medicine (n = 1). The most common method implemented was a marginal structural model with inverse probability of treatment weighting. CONCLUSIONS: Examples of studies which exploit RCT data to address non-randomised questions using causal inference methodology remain relatively limited, despite the growth in methodological development and increasing utilisation in observational studies. Further efforts may be needed to promote use of causal methods to address additional clinical questions within RCTs to maximise their value.
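The most common method the review found, a marginal structural model fitted by inverse probability of treatment weighting (IPTW), can be illustrated with a toy simulation. This is my sketch, not code from any reviewed trial; the data, confounding structure, and effect size are all invented.

```python
import random

random.seed(1)

# Toy confounded data: sicker patients (x=1) are both more likely to be
# treated and more likely to have the outcome, biasing a naive comparison.
data = []
for _ in range(40000):
    x = random.random() < 0.5                       # binary confounder
    t = random.random() < (0.8 if x else 0.2)       # treatment depends on x
    y = random.random() < 0.1 + 0.3 * x + 0.1 * t   # true effect = +0.10
    data.append((x, t, y))

# Step 1: estimate the propensity P(T=1 | X) within each stratum of x
ps = {
    x: sum(t for xx, t, _ in data if xx == x)
       / sum(1 for xx, _, _ in data if xx == x)
    for x in (False, True)
}

# Step 2: inverse-probability-of-treatment-weighted outcome means
def wmean(treated):
    num = den = 0.0
    for x, t, y in data:
        if t == treated:
            w = 1 / ps[x] if treated else 1 / (1 - ps[x])
            num += w * y
            den += w
    return num / den

ate = wmean(True) - wmean(False)  # recovers roughly the true +0.10
print(f"IPTW estimate: {ate:.3f}")
```

Weighting reconstructs a pseudo-population in which treatment is independent of the measured confounder, which is why the weighted contrast lands near the true +0.10 while the naive treated-vs-untreated difference would not.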

Young BC, Wu C-H, Gordon NC, Cole K, Price JR, Liu E, Sheppard AE, Perera S, Charlesworth J, Golubchik T et al. 2017. Severe infections emerge from commensal bacteria by adaptive evolution. Elife, 6.

Bacteria responsible for the greatest global mortality colonize the human microbiota far more frequently than they cause severe infections. Whether mutation and selection among commensal bacteria are associated with infection is unknown. We investigated de novo mutation in 1163 Staphylococcus aureus genomes from 105 infected patients with nose colonization. We report that 72% of infections emerged from the nose, with infecting and nose-colonizing bacteria showing parallel adaptive differences. We found 2.8-to-3.6-fold adaptive enrichments of protein-altering variants in genes responding to rsp, which regulates surface antigens and toxin production; agr, which regulates quorum-sensing, toxin production and abscess formation; and host-derived antimicrobial peptides. Adaptive mutations in pathogenesis-associated genes were 3.1-fold enriched in infecting but not nose-colonizing bacteria. None of these signatures were observed in healthy carriers or at the species level, suggesting infection-associated, short-term, within-host selection pressures. Our results show that signatures of spontaneous adaptive evolution are specifically associated with infection, raising new possibilities for diagnosis and treatment.

Phan HTT, Stoesser N, Maciuca IE, Toma F, Szekely E, Flonta M, Hubbard ATM, Pankhurst L, Do T, Peto TEA et al. 2018. Illumina short-read and MinION long-read WGS to characterize the molecular epidemiology of an NDM-1 Serratia marcescens outbreak in Romania. J Antimicrob Chemother, 73 (3), pp. 672-679.

Background and Objectives: Serratia marcescens is an emerging nosocomial pathogen, and the carbapenemase blaNDM has been reported in several surveys in Romania. We aimed to investigate the molecular epidemiology of S. marcescens in two Romanian hospitals over 2010-15, including a neonatal NDM-1 S. marcescens outbreak. Methods: Isolates were sequenced using Illumina technology together with carbapenem-non-susceptible NDM-1-positive and NDM-1-negative Klebsiella pneumoniae and Enterobacter cloacae to provide genomic context. A subset was sequenced with MinION to fully resolve NDM-1 plasmid structures. Resistance genes, plasmid replicons and ISs were identified in silico for all isolates; an annotated phylogeny was reconstructed for S. marcescens. Fully resolved study NDM-1 plasmid sequences were compared with the most closely related publicly available NDM-1 plasmid reference. Results: 44/45 isolates were successfully sequenced (S. marcescens, n = 33; K. pneumoniae, n = 7; E. cloacae, n = 4); 10 with MinION. The S. marcescens phylogeny demonstrated several discrete clusters of NDM-1-positive and -negative isolates. All NDM-1-positive isolates across species harboured a pKOX_NDM1-like plasmid; more detailed comparisons of the plasmid structures demonstrated a number of differences, but highlighted the largely conserved plasmid backbones across species and hospital sites. Conclusions: The molecular epidemiology is most consistent with the importation of a pKOX_NDM1-like plasmid into Romania and its dissemination amongst K. pneumoniae/E. cloacae and subsequently S. marcescens across hospitals. The data suggested multiple acquisitions of this plasmid by S. marcescens in the two hospitals studied; transmission events within centres, including a large outbreak on the Targu Mures neonatal unit; and sharing of the pKOX_NDM1-like plasmid between species within outbreaks.

Szubert AJ, Prendergast AJ, Spyer MJ, Musiime V, Musoke P, Bwakura-Dangarembizi M, Nahirya-Ntege P, Thomason MJ, Ndashimye E, Nkanya I et al. 2017. Virological response and resistance among HIV-infected children receiving long-term antiretroviral therapy without virological monitoring in Uganda and Zimbabwe: Observational analyses within the randomised ARROW trial. PLoS Med, 14 (11), pp. e1002432.

BACKGROUND: Although WHO recommends viral load (VL) monitoring for those on antiretroviral therapy (ART), availability in low-income countries remains limited. We investigated long-term VL and resistance in HIV-infected children managed without real-time VL monitoring. METHODS AND FINDINGS: In the ARROW factorial trial, 1,206 children initiating ART in Uganda and Zimbabwe between 15 March 2007 and 18 November 2008, with a median age of 6 years and a median CD4% of 12%, were randomised to monitoring with or without 12-weekly CD4 counts and to receive 2 nucleoside reverse transcriptase inhibitors (2NRTI, mainly abacavir+lamivudine) with a non-nucleoside reverse transcriptase inhibitor (NNRTI) or 3 NRTIs as long-term ART. All children had VL assayed retrospectively after a median of 4 years on ART; those with >1,000 copies/ml were genotyped. Three hundred and sixteen children had VL and genotypes assayed longitudinally (at least every 24 weeks). Overall, 67 (6%) switched to second-line ART and 54 (4%) died. In children randomised to WHO-recommended 2NRTI+NNRTI long-term ART, 308/378 (81%) monitored with CD4 counts versus 297/375 (79%) without had VL <1,000 copies/ml at 4 years (difference = +2.3% [95% CI -3.4% to +8.0%]; P = 0.43), with no evidence of differences in intermediate/high-level resistance to 11 drugs. Among children with longitudinal VLs, only 5% of child-time post-week 24 was spent with persistent low-level viraemia (80-5,000 copies/ml) and 10% with VL rebound ≥5,000 copies/ml. No child resuppressed <80 copies/ml after confirmed VL rebound ≥5,000 copies/ml. A median of 1.0 (IQR 0.0-1.5) additional NRTI mutations accumulated over 2 years' rebound. Nineteen out of 48 (40%) VLs 1,000-5,000 copies/ml were immediately followed by resuppression <1,000 copies/ml, but only 17/155 (11%) VLs ≥5,000 copies/ml resuppressed (P < 0.0001). Main study limitations are that analyses were exploratory and treatment initiation used 2006 criteria, without pre-ART genotypes.
CONCLUSIONS: In this study, children receiving first-line ART in sub-Saharan Africa without real-time VL monitoring had good virological and resistance outcomes over 4 years, regardless of CD4 monitoring strategy. Many children with detectable low-level viraemia spontaneously resuppressed, highlighting the importance of confirming virological failure before switching to second-line therapy. Children experiencing rebound ≥5,000 copies/ml were much less likely to resuppress, but NRTI resistance increased only slowly. These results are relevant to the increasing numbers of HIV-infected children receiving first-line ART in sub-Saharan Africa with limited access to virological monitoring. TRIAL REGISTRATION: ISRCTN Registry, ISRCTN24791884.
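One standard way to compare the two resuppression proportions quoted above (19/48 vs 17/155) is a Pearson chi-squared test; the abstract does not state which test was actually used, so this is an illustrative recomputation, not the study's analysis. For 1 degree of freedom the tail probability has a closed form via the complementary error function.

```python
from math import erfc, sqrt

def two_proportion_chi2(a, n1, c, n2):
    """Pearson chi-squared test (1 df, no continuity correction) for two
    proportions; the 1-df tail probability is erfc(sqrt(chi2 / 2))."""
    b, d = n1 - a, n2 - c
    n = n1 + n2
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, erfc(sqrt(chi2 / 2))

# 19/48 VLs of 1,000-5,000 copies/ml resuppressed vs 17/155 VLs >=5,000
chi2, p = two_proportion_chi2(19, 48, 17, 155)
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")  # p < 0.0001, as reported
```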

Quan TP, Bawa Z, Foster D, Walker T, Del Ojo Elias C, Rathod P, MMM Informatics Group, Iqbal Z, Bradley P, Mowbray J et al. 2018. Evaluation of Whole-Genome Sequencing for Mycobacterial Species Identification and Drug Susceptibility Testing in a Clinical Setting: a Large-Scale Prospective Assessment of Performance against Line Probe Assays and Phenotyping. J Clin Microbiol, 56 (2).

Use of whole-genome sequencing (WGS) for routine mycobacterial species identification and drug susceptibility testing (DST) is becoming a reality. We compared the performances of WGS and standard laboratory workflows prospectively, by parallel processing at a major mycobacterial reference service over the course of 1 year, for species identification, first-line Mycobacterium tuberculosis resistance prediction, and turnaround time. Among 2,039 isolates with line probe assay results for species identification, 74 (3.6%) failed sequencing or WGS species identification. Excluding these isolates, clinically important species were identified for 1,902 isolates, of which 1,825 (96.0%) were identified as the same species by WGS and the line probe assay. A total of 2,157 line probe test results for detection of resistance to the first-line drugs isoniazid and rifampin were available for 728 M. tuberculosis complex isolates. Excluding 216 (10.0%) cases where there were insufficient sequencing data for WGS to make a prediction, overall concordance was 99.3% (95% confidence interval [CI], 98.9 to 99.6%), sensitivity was 97.6% (91.7 to 99.7%), and specificity was 99.5% (99.0 to 99.7%). A total of 2,982 phenotypic DST results were available for 777 M. tuberculosis complex isolates. Of these, 356 (11.9%) had no WGS comparator due to insufficient sequencing data, and in 154 (5.2%) cases the WGS prediction was indeterminate due to discovery of novel, previously uncharacterized mutations. Excluding these data, overall concordance was 99.2% (98.7 to 99.5%), sensitivity was 94.2% (88.4 to 97.6%), and specificity was 99.4% (99.0 to 99.7%). Median processing times for the routine laboratory tests versus WGS were similar overall, i.e., 20 days (interquartile range [IQR], 15 to 31 days) and 21 days (15 to 29 days), respectively (P = 0.41). In conclusion, WGS predicts species and drug susceptibility with great accuracy, but work is needed to increase the proportion of predictions made.
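The concordance, sensitivity and specificity figures above are simple proportions from a 2x2 comparison of WGS predictions against the reference method. A minimal sketch follows, using hypothetical counts and Wald intervals; the paper itself likely used exact binomial intervals, so the CIs would differ slightly.

```python
from math import sqrt

def diagnostic_summary(tp, fp, fn, tn, z=1.96):
    """Concordance, sensitivity and specificity with Wald 95% CIs from a
    2x2 comparison of predictions against a reference method."""
    def prop_ci(k, n):
        p = k / n
        se = sqrt(p * (1 - p) / n)
        return p, max(0.0, p - z * se), min(1.0, p + z * se)
    return {
        "concordance": prop_ci(tp + tn, tp + fp + fn + tn),
        "sensitivity": prop_ci(tp, tp + fn),
        "specificity": prop_ci(tn, tn + fp),
    }

# Hypothetical counts, not the study's data
summary = diagnostic_summary(tp=82, fp=3, fn=2, tn=913)
for name, (p, lo, hi) in summary.items():
    print(f"{name}: {p:.1%} ({lo:.1%}-{hi:.1%})")
```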

Hakim JG, Thompson J, Kityo C, Hoppe A, Kambugu A, van Oosterhout JJ, Lugemwa A, Siika A, Mwebaze R, Mweemba A et al. 2018. Lopinavir plus nucleoside reverse-transcriptase inhibitors, lopinavir plus raltegravir, or lopinavir monotherapy for second-line treatment of HIV (EARNEST): 144-week follow-up results from a randomised controlled trial. Lancet Infect Dis, 18 (1), pp. 47-57.

BACKGROUND: Millions of HIV-infected people worldwide receive antiretroviral therapy (ART) in programmes using WHO-recommended standardised regimens. Recent WHO guidelines recommend a boosted protease inhibitor plus raltegravir as an alternative second-line combination. We assessed whether this treatment option offers any advantage over the standard protease inhibitor plus two nucleoside reverse-transcriptase inhibitors (NRTIs) second-line combination after 144 weeks of follow-up in typical programme settings. METHODS: We analysed the 144-week outcomes at the completion of the EARNEST trial, a randomised controlled trial done in HIV-infected adults or adolescents in 14 sites in five sub-Saharan African countries (Uganda, Zimbabwe, Malawi, Kenya, Zambia). Participants were those who were no longer responding to non-NRTI-based first-line ART, as assessed with WHO criteria, confirmed by viral-load testing. Participants were randomly assigned to receive a ritonavir-boosted protease inhibitor (lopinavir 400 mg with ritonavir 100 mg, twice per day) plus two or three clinician-selected NRTIs (protease inhibitor plus NRTI group), protease inhibitor plus raltegravir (400 mg twice per day; protease inhibitor plus raltegravir group), or protease inhibitor monotherapy (plus raltegravir induction for first 12 weeks, re-intensified to combination therapy after week 96; protease inhibitor monotherapy group). Randomisation was by computer-generated randomisation sequence, with variable block size. The primary outcome was viral load of less than 400 copies per mL at week 144, for which we assessed non-inferiority with a one-sided α of 0·025, and superiority with a two-sided α of 0·025. The EARNEST trial is registered with ISRCTN, number 37737787. FINDINGS: Between April 12, 2010, and April 29, 2011, 1837 patients were screened for eligibility, of whom 1277 patients were randomly assigned to an intervention group. 
In the primary (complete-case) analysis at 144 weeks, 317 (86%) of 367 in the protease inhibitor plus NRTI group had viral loads of less than 400 copies per mL compared with 312 (81%) of 383 in the protease inhibitor plus raltegravir group (p=0·07; lower 95% confidence limit for difference 10·2% vs specified non-inferiority margin 10%). In the protease inhibitor monotherapy group, 292 (78%) of 375 had viral loads of less than 400 copies per mL; p=0·003 versus the protease inhibitor plus NRTI group at 144 weeks. There was no difference between groups in serious adverse events, grade 3 or 4 adverse events (total or ART-related), or events that resulted in treatment modification. INTERPRETATION: Protease inhibitor plus raltegravir offered no advantage over protease inhibitor plus NRTI in virological efficacy or safety. In the primary analysis, protease inhibitor plus raltegravir did not meet non-inferiority criteria. A regimen of protease inhibitor with NRTIs remains the best standardised second-line regimen for use in programmes in resource-limited settings. FUNDING: European and Developing Countries Clinical Trials Partnership (EDCTP), UK Medical Research Council, Instituto de Salud Carlos III, Irish Aid, Swedish International Development Cooperation Agency, Instituto Superiore di Sanita, Merck, ViiV Healthcare, WHO.

Martin J, Phan HTT, Findlay J, Stoesser N, Pankhurst L, Navickaite I, De Maio N, Eyre DW, Toogood G, Orsi NM et al. 2017. Covert dissemination of carbapenemase-producing Klebsiella pneumoniae (KPC) in a successfully controlled outbreak: long- and short-read whole-genome sequencing demonstrate multiple genetic modes of transmission. J Antimicrob Chemother, 72 (11), pp. 3025-3034.

Background: Carbapenemase-producing Enterobacteriaceae (CPE), including KPC-producing Klebsiella pneumoniae (KPC-Kpn), are an increasing threat to patient safety. Objectives: To use WGS to investigate the extent and complexity of carbapenemase gene dissemination in a controlled KPC outbreak. Materials and methods: Enterobacteriaceae with reduced ertapenem susceptibility recovered from rectal screening swabs/clinical samples, during a 3 month KPC outbreak (2013-14), were investigated for carbapenemase production, antimicrobial susceptibility, variable-number-tandem-repeat profile and WGS [short-read (Illumina), long-read (MinION)]. Short-read sequences were used for MLST and plasmid/Tn4401 fingerprinting, and long-read sequence assemblies for plasmid identification. Phylogenetic analysis used IQTree followed by ClonalFrameML, and outbreak transmission dynamics were inferred using SCOTTI. Results: Twenty patients harboured KPC-positive isolates (6 infected, 14 colonized), and 23 distinct KPC-producing Enterobacteriaceae were identified. Four distinct KPC plasmids were characterized but of 20 KPC-Kpn (from six STs), 17 isolates shared a single pKpQIL-D2 KPC plasmid. All isolates had an identical transposon (Tn4401a), except one KPC-Kpn (ST661) with a single nucleotide variant. A sporadic case of KPC-Kpn (ST491) with Tn4401a-carrying pKpQIL-D2 plasmid was identified 10 months before the outbreak. This plasmid was later seen in two other species and other KPC-Kpn (ST14, ST661) including clonal spread of KPC-Kpn (ST661) from a symptomatic case to nine ward contacts. Conclusions: WGS of outbreak KPC isolates demonstrated blaKPC dissemination via horizontal transposition (Tn4401a), plasmid spread (pKpQIL-D2) and clonal spread (K. pneumoniae ST661). Despite rapid outbreak control, considerable dissemination of blaKPC still occurred among K. pneumoniae and other Enterobacteriaceae, emphasizing its high transmission potential and the need for enhanced control efforts.

Walker AS, Watkinson P, Llewelyn M, Stoesser N, Peto T. 2017. Severity of illness and the weekend effect - Authors' reply. Lancet, 390 (10104), pp. 1735.

Zhong L-L, Phan HTT, Shen C, Vihta K-D, Sheppard AE, Huang X, Zeng K-J, Li H-Y, Zhang X-F, Patil S et al. 2018. High Rates of Human Fecal Carriage of mcr-1-Positive Multidrug-Resistant Enterobacteriaceae Emerge in China in Association With Successful Plasmid Families. Clin Infect Dis, 66 (5), pp. 676-685.

Background: mcr-1-mediated colistin resistance in Enterobacteriaceae is concerning, as colistin is used in treating multidrug-resistant Enterobacteriaceae infections. We identified trends in human fecal mcr-1-positivity rates and colonization with mcr-1-positive, third-generation cephalosporin-resistant (3GC-R) Enterobacteriaceae in Guangzhou, China, and investigated the genetic contexts of mcr-1 in mcr-1-positive 3GC-R strains. Methods: Fecal samples were collected from in-/out-patients submitting specimens to 3 hospitals (2011-2016). mcr-1 carriage trends were assessed using iterative sequential regression. A subset of mcr-1-positive isolates was sequenced (whole-genome sequencing [WGS], Illumina), and genetic contexts (flanking regions, plasmids) of mcr-1 were characterized. Results: Of 8022 fecal samples collected, 497 (6.2%) were mcr-1 positive, and 182 (2.3%) harbored mcr-1-positive 3GC-R Enterobacteriaceae. We observed marked increases in mcr-1 (0% [April 2011] to 31% [March 2016]) and more recent (since January 2014; 0% [April 2011] to 15% [March 2016]) increases in human colonization with mcr-1-positive 3GC-R Enterobacteriaceae (P < .001). mcr-1-positive 3GC-R isolates were commonly multidrug resistant. WGS of mcr-1-positive 3GC-R isolates (70 Escherichia coli, 3 Klebsiella pneumoniae) demonstrated bacterial strain diversity; mcr-1 in association with common plasmid backbones (IncI, IncHI2/HI2A, IncX4) and sometimes in multiple plasmids; frequent mcr-1 chromosomal integration; and high mobility of the mcr-1-associated insertion sequence ISApl1. Sequence data were consistent with plasmid spread among animal/human reservoirs. Conclusions: The high prevalence of mcr-1 in multidrug-resistant E. coli colonizing humans is a clinical threat; diverse genetic mechanisms (strains/plasmids/insertion sequences) have contributed to the dissemination of mcr-1, and will facilitate its persistence.

Lewis J, Payne H, Walker AS, Otwombe K, Gibb DM, Babiker AG, Panchia R, Cotton MF, Violari A, Klein N, Callard RE. 2017. Thymic Output and CD4 T-Cell Reconstitution in HIV-Infected Children on Early and Interrupted Antiretroviral Treatment: Evidence from the Children with HIV Early Antiretroviral Therapy Trial. Front Immunol, 8 (SEP), pp. 1162.

OBJECTIVES: Early treatment of HIV-infected children and adults is important for optimal immune reconstitution. Infants' immune systems are more plastic and dynamic than older children's or adults', and deserve particular attention. This study aimed to understand the response of the HIV-infected infant immune system to early antiretroviral therapy (ART) and planned ART interruption and restart. METHODS: Data from HIV-infected children enrolled in the CHER trial, starting ART aged between 6 and 12 weeks, were used to explore the effect of ART on immune reconstitution. We used linear and non-linear regression and mixed-effects models to describe children's CD4 trajectories and to identify predictors of CD4 count during early and interrupted ART. RESULTS: Early treatment arrested the decline in CD4 count but did not fully restore it to the levels observed in HIV-uninfected children. Treatment interruption at 40 or 96 weeks resulted in a rapid decline in CD4 T-cells, which on retreatment returned to levels observed before interruption. Naïve CD4 T-cell count was an important determinant of overall CD4 levels. A strong correlation was observed between thymic output and the stable CD4 count both before and after treatment interruption. CONCLUSION: Early identification and treatment of HIV-infected infants is important to stabilize CD4 counts at the highest levels possible. Once stabilized, children's CD4 counts appear resilient, with good potential for recovery following treatment interruption. The naïve T-cell pool and thymic production of naive cells are key determinants of children's CD4 levels.

Stoesser N, Eyre DW, Quan TP, Godwin H, Pill G, Mbuvi E, Vaughan A, Griffiths D, Martin J, Fawley W et al. 2017. Epidemiology of Clostridium difficile in infants in Oxfordshire, UK: Risk factors for colonization and carriage, and genetic overlap with regional C. difficile infection strains. PLoS One, 12 (8), pp. e0182307.

BACKGROUND: Approximately 30-40% of children <1 year of age are Clostridium difficile colonized, and may represent a reservoir for adult C. difficile infections (CDI). Risk factors for colonization with toxigenic versus non-toxigenic C. difficile strains and longitudinal acquisition dynamics in infants remain incompletely characterized. METHODS: Predominantly healthy infants (≤2 years) were recruited in Oxfordshire, UK, and provided ≥1 fecal samples. Independent risk factors for toxigenic/non-toxigenic C. difficile colonization and acquisition were identified using multivariable regression. Infant C. difficile isolates were whole-genome sequenced to assay genetic diversity and prevalence of toxin-associated genes, and compared with sequenced strains from Oxfordshire CDI cases. RESULTS: 338/365 enrolled infants provided 1332 fecal samples, representing 158 C. difficile colonization or carriage episodes (107 [68%] toxigenic). Initial colonization was associated with age, and reduced with breastfeeding but increased with pet dogs. Acquisition was associated with older age, Caesarean delivery, and diarrhea. Breastfeeding and pre-existing C. difficile colonization reduced acquisition risk. Overall 13% of CDI C. difficile strains were genetically related to infant strains. 29 (18%) infant C. difficile sequences were consistent with recent direct/indirect transmission to/from Oxfordshire CDI cases (≤2 single nucleotide variants [SNVs]); 79 (50%) shared a common origin with an Oxfordshire CDI case within the last ~5 years (0-10 SNVs). The hypervirulent, epidemic ST1/ribotype 027 remained notably absent in infants in this large study, as did other lineages such as STs 10/44 (ribotype 015); the most common strain in infants was ST2 (ribotype 020/014) (22%). CONCLUSIONS: In predominantly healthy infants without significant healthcare exposure, C. difficile colonization and acquisition reflect environmental exposures, with pet dogs identified as a novel risk factor.
Genetic overlap between some infant strains and those isolated from CDI cases suggests common community reservoirs of these C. difficile lineages, in contrast to lineages found only in CDI cases, which are more consistent with healthcare-associated spread.

Mramba L, Ngari M, Mwangome M, Muchai L, Bauni E, Walker AS, Gibb DM, Fegan G, Berkley JA. 2017. A growth reference for mid upper arm circumference for age among school age children and adolescents, and validation for mortality: growth curve construction and longitudinal cohort study. BMJ, 358, pp. j3423.

Objectives: To construct growth curves for mid-upper-arm circumference (MUAC)-for-age z score for 5-19 year olds that accord with the World Health Organization growth standards, and to evaluate their discriminatory performance for subsequent mortality. Design: Growth curve construction and longitudinal cohort study. Setting: United States and international growth data, and cohorts in Kenya, Uganda, and Zimbabwe. Participants: The Health Examination Survey (HES)/National Health and Nutrition Examination Survey (NHANES) US population datasets (age 5-25 years), which were used to construct the 2007 WHO growth reference for body mass index in this age group, were merged with an imputed dataset matching the distribution of the WHO 2006 growth standards age 2-6 years. Validation data were from 685 HIV-infected children aged 5-17 years participating in the Antiretroviral Research for Watoto (ARROW) trial in Uganda and Zimbabwe, and 1741 children aged 5-13 years discharged from a rural Kenyan hospital (3.8% HIV infected). Both cohorts were followed up for survival during one year. Main outcome measures: Concordance with WHO 2006 growth standards at age 60 months, and survival during one year according to MUAC-for-age and body mass index-for-age z scores. Results: The new growth curves transitioned smoothly with the WHO growth standards at age 5 years. MUAC-for-age z scores of -2 to -3 and less than -3, compared with -2 or more, were associated with hazard ratios for death within one year of 3.63 (95% confidence interval 0.90 to 14.7; P=0.07) and 11.1 (3.40 to 36.0; P<0.001), respectively, among ARROW trial participants; and 2.22 (1.01 to 4.9; P=0.04) and 5.15 (2.49 to 10.7; P<0.001), respectively, among Kenyan children after discharge from hospital.
The AUCs for MUAC-for-age and body mass index-for-age z scores for discriminating subsequent mortality were 0.81 (95% confidence interval 0.70 to 0.92) and 0.75 (0.63 to 0.86), respectively, in the ARROW trial (absolute difference 0.06, 95% confidence interval -0.032 to 0.16; P=0.2), and 0.73 (0.65 to 0.80) and 0.58 (0.49 to 0.67), respectively, in Kenya (absolute difference in AUC 0.15, 0.07 to 0.23; P=0.0002). Conclusions: The MUAC-for-age z score is at least as effective as the body mass index-for-age z score for assessing mortality risks associated with undernutrition among African school-aged children and adolescents. MUAC can provide simplified screening and diagnosis within nutrition and HIV programmes, and in research.
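The AUCs compared above have a direct probabilistic reading: the chance that a randomly chosen child who died had a higher risk score than a randomly chosen survivor. A minimal Mann-Whitney implementation of that reading, using illustrative scores only (not study data):

```python
def auc(cases, controls):
    """Mann-Whitney estimate of the AUC: the probability that a randomly
    chosen case (death) has a higher risk score than a randomly chosen
    control (survivor), counting ties as half."""
    wins = sum(
        1.0 if c > k else 0.5 if c == k else 0.0
        for c in cases for k in controls
    )
    return wins / (len(cases) * len(controls))

# Illustrative risk scores only (e.g. negated MUAC-for-age z scores,
# so higher = higher risk); not data from the study.
died     = [2.9, 3.4, 1.8, 3.1]
survived = [0.5, 1.2, 2.0, 0.9, 1.5]
print(f"AUC = {auc(died, survived):.2f}")  # 0.95 for these toy scores
```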

Donker T, Smieszek T, Henderson KL, Johnson AP, Walker AS, Robotham JV. 2017. Measuring distance through dense weighted networks: The case of hospital-associated pathogens. PLoS Comput Biol, 13 (8), pp. e1005622.

Hospital networks, formed by patients visiting multiple hospitals, affect the spread of hospital-associated infections, resulting in differences in risks for hospitals depending on their network position. These networks are increasingly used to inform strategies to prevent and control the spread of hospital-associated pathogens. However, many studies only consider patients that are received directly from the initial hospital, without considering the effect of indirect trajectories through the network. We determine the optimal way to measure the distance between hospitals within the network, by reconstructing the English hospital network based on shared patients in 2014-2015, and simulating the spread of a hospital-associated pathogen between hospitals, taking into consideration that each intermediate hospital conveys a delay in the further spread of the pathogen. While the risk of transferring a hospital-associated pathogen between directly neighbouring hospitals is a direct reflection of the number of shared patients, the distance between two hospitals far-away in the network is determined largely by the number of intermediate hospitals in the network. Because the network is dense, most long distance transmission chains in fact involve only few intermediate steps, spreading along the many weak links. The dense connectivity of hospital networks, together with a strong regional structure, causes hospital-associated pathogens to spread from the initial outbreak in a two-step process: first, the directly surrounding hospitals are affected through the strong connections, second all other hospitals receive introductions through the multitude of weaker links. Although the strong connections matter for local spread, weak links in the network can offer ideal routes for hospital-associated pathogens to travel further faster. 
This holds important implications for infection prevention and control efforts: if a local outbreak is not controlled in time, colonised patients will appear in other regions, irrespective of the distance to the initial outbreak, making import screening ever more difficult.
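The distance measure described above can be sketched as a shortest-path computation in which every edge contributes both a per-step delay and a term reflecting how strongly two hospitals are connected by shared patients. Below is a minimal sketch over a hypothetical four-hospital network; the cost function (a fixed delay minus the log of the shared-patient fraction) is an assumed stand-in, not the paper's exact formulation:

```python
import heapq
import math

def effective_distance(graph, source, delay=1.0):
    """Dijkstra over edge costs c = delay - log(p), where p is the
    (hypothetical) fraction of patients shared along an edge. Each
    intermediate hospital therefore adds a fixed delay to the path."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, p in graph.get(u, {}).items():
            nd = d + delay - math.log(p)
            if nd < dist.get(v, math.inf):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical network: one strong local link (A-B) plus weak links
hospitals = {
    "A": {"B": 0.5, "C": 0.01},
    "B": {"A": 0.5, "C": 0.02},
    "C": {"A": 0.01, "B": 0.02, "D": 0.3},
    "D": {"C": 0.3},
}
d = effective_distance(hospitals, "A")
print(d)
```

With these illustrative weights, the strongly connected neighbour B is closest, while far-away hospitals are reached through short chains that include weak links, because each extra intermediate hospital adds a fixed delay, mirroring the two-step spread the abstract describes.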

Eyre DW, Fawley WN, Rajgopal A, Settle C, Mortimer K, Goldenberg SD, Dawson S, Crook DW, Peto TEA, Walker AS, Wilcox MH. 2017. Comparison of Control of Clostridium difficile Infection in Six English Hospitals Using Whole-Genome Sequencing. Clin Infect Dis, 65 (3), pp. 433-441.

Background: Variation in Clostridium difficile infection (CDI) rates between healthcare institutions suggests overall incidence could be reduced if the lowest rates could be achieved more widely. Methods: We used whole-genome sequencing (WGS) of consecutive C. difficile isolates from 6 English hospitals over 1 year (2013-14) to compare infection control performance. Fecal samples with a positive initial screen for C. difficile were sequenced. Within each hospital, we estimated the proportion of cases plausibly acquired from previous cases. Results: Overall, 851/971 (87.6%) sequenced samples contained toxin genes, and 451 (46.4%) were fecal-toxin-positive. Of 652 potentially toxigenic isolates >90-days after the study started, 128 (20%, 95% confidence interval [CI] 17-23%) were genetically linked (within ≤2 single nucleotide polymorphisms) to a prior patient's isolate from the previous 90 days. Hospital 2 had the fewest linked isolates, 7/105 (7%, 3-13%), hospital 1, 9/70 (13%, 6-23%), and hospitals 3-6 had similar proportions of linked isolates (22-26%) (P ≤ .002 comparing hospital-2 vs 3-6). Results were similar adjusting for locally circulating ribotypes. Adjusting for hospital, ribotype-027 had the highest proportion of linked isolates (57%, 95% CI 29-81%). Fecal-toxin-positive and toxin-negative patients were similarly likely to be a potential transmission donor, OR = 1.01 (0.68-1.49). There was no association between the estimated proportion of linked cases and testing rates. Conclusions: WGS can be used as a novel surveillance tool to identify varying rates of C. difficile transmission between institutions and therefore to allow targeted efforts to reduce CDI incidence.
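The "genetically linked" definition used above (an isolate within ≤2 single nucleotide polymorphisms of any prior patient's isolate from the previous 90 days) amounts to a pairwise comparison against earlier cases. A minimal sketch, using a hypothetical toy encoding of variant sites as strings:

```python
def linked_cases(cases, snp_cutoff=2, window=90):
    """cases: list of (day, genome) tuples sorted by sampling day; the
    genome is a hypothetical string of variant sites. Returns indices of
    cases plausibly acquired from a prior case: within `snp_cutoff` SNPs
    of an isolate collected in the preceding `window` days."""
    def snp_dist(a, b):
        # count mismatched variant sites between two profiles
        return sum(x != y for x, y in zip(a, b))

    linked = []
    for i, (day_i, g_i) in enumerate(cases):
        for day_j, g_j in cases[:i]:
            if 0 < day_i - day_j <= window and snp_dist(g_i, g_j) <= snp_cutoff:
                linked.append(i)
                break  # one plausible donor is enough
    return linked

# Hypothetical toy data: sampling day and a 6-site variant profile
cases = [(0, "AAAAAA"), (30, "AAAATA"), (200, "AAAATA"), (210, "CCCCCC")]
print(linked_cases(cases))  # → [1]: case 1 links to case 0; the rest do not
```

Case 2 carries an identical genome to case 1 but falls outside the 90-day window, so it is not counted as linked, illustrating why the time window matters as much as the SNP cut-off.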

Llewelyn MJ, Fitzpatrick JM, Darwin E, Tonkin-Crine S, Gorton C, Paul J, Peto TEA, Yardley L, Hopkins S, Walker AS. 2017. The antibiotic course has had its day. BMJ, 358 pp. j3418.

Hakim J, Musiime V, Szubert AJ, Mallewa J, Siika A, Agutu C, Walker S, Pett SL, Bwakura-Dangarembizi M, Lugemwa A et al. 2017. Enhanced Prophylaxis plus Antiretroviral Therapy for Advanced HIV Infection in Africa. N Engl J Med, 377 (3), pp. 233-245.

BACKGROUND: In sub-Saharan Africa, among patients with advanced human immunodeficiency virus (HIV) infection, the rate of death from infection (including tuberculosis and cryptococcus) shortly after the initiation of antiretroviral therapy (ART) is approximately 10%. METHODS: In this factorial open-label trial conducted in Uganda, Zimbabwe, Malawi, and Kenya, we enrolled HIV-infected adults and children 5 years of age or older who had not received previous ART and were starting ART with a CD4+ count of fewer than 100 cells per cubic millimeter. They underwent simultaneous randomization to receive enhanced antimicrobial prophylaxis or standard prophylaxis, adjunctive raltegravir or no raltegravir, and supplementary food or no supplementary food. Here, we report on the effects of enhanced antimicrobial prophylaxis, which consisted of continuous trimethoprim-sulfamethoxazole plus at least 12 weeks of isoniazid-pyridoxine (coformulated with trimethoprim-sulfamethoxazole in a single fixed-dose combination tablet), 12 weeks of fluconazole, 5 days of azithromycin, and a single dose of albendazole, as compared with standard prophylaxis (trimethoprim-sulfamethoxazole alone). The primary end point was 24-week mortality. RESULTS: A total of 1805 patients (1733 adults and 72 children or adolescents) underwent randomization to receive either enhanced prophylaxis (906 patients) or standard prophylaxis (899 patients) and were followed for 48 weeks (loss to follow-up, 3.1%). The median baseline CD4+ count was 37 cells per cubic millimeter, but 854 patients (47.3%) were asymptomatic or mildly symptomatic. In the Kaplan-Meier analysis at 24 weeks, the rate of death with enhanced prophylaxis was lower than that with standard prophylaxis (80 patients [8.9%] vs. 108 [12.2%]; hazard ratio, 0.73; 95% confidence interval [CI], 0.55 to 0.98; P=0.03); 98 patients (11.0%) and 127 (14.4%), respectively, had died by 48 weeks (hazard ratio, 0.76; 95% CI, 0.58 to 0.99; P=0.04).
Patients in the enhanced-prophylaxis group had significantly lower rates of tuberculosis (P=0.02), cryptococcal infection (P=0.01), oral or esophageal candidiasis (P=0.02), death of unknown cause (P=0.03), and new hospitalization (P=0.03). However, there was no significant between-group difference in the rate of severe bacterial infection (P=0.32). There were nonsignificantly lower rates of serious adverse events and grade 4 adverse events in the enhanced-prophylaxis group (P=0.08 and P=0.09, respectively). Rates of HIV viral suppression and adherence to ART were similar in the two groups. CONCLUSIONS: Among HIV-infected patients with advanced immunosuppression, enhanced antimicrobial prophylaxis combined with ART resulted in reduced rates of death at both 24 weeks and 48 weeks without compromising viral suppression or increasing toxic effects. (Funded by the Medical Research Council and others; REALITY Current Controlled Trials number, ISRCTN43622374.)

Stoesser N, Sheppard AE, Peirano G, Anson LW, Pankhurst L, Sebra R, Phan HTT, Kasarskis A, Mathers AJ, Peto TEA et al. 2017. Genomic epidemiology of global Klebsiella pneumoniae carbapenemase (KPC)-producing Escherichia coli. Sci Rep, 7 (1), pp. 5917.

The dissemination of carbapenem resistance in Escherichia coli has major implications for the management of common infections. bla KPC, encoding a transmissible carbapenemase (KPC), has historically largely been associated with Klebsiella pneumoniae, a predominant plasmid (pKpQIL), and a specific transposable element (Tn4401, ~10 kb). Here we characterize the genetic features of bla KPC emergence in global E. coli, 2008-2013, using both long- and short-read whole-genome sequencing. Amongst 43/45 successfully sequenced bla KPC-E. coli strains, we identified substantial strain diversity (n = 21 sequence types, 18% of annotated genes in the core genome); substantial plasmid diversity (≥9 replicon types); and substantial bla KPC-associated, mobile genetic element (MGE) diversity (50% not within complete Tn4401 elements). We also found evidence of inter-species, regional and international plasmid spread. In several cases bla KPC was found on high copy number, small Col-like plasmids, previously associated with horizontal transmission of resistance genes in the absence of antimicrobial selection pressures. E. coli is a common human pathogen, but also a commensal in multiple environmental and animal reservoirs, and easily transmissible. The association of bla KPC with a range of MGEs previously linked to the successful spread of widely endemic resistance mechanisms (e.g. bla TEM, bla CTX-M) suggests that it may become similarly prevalent.

Young BC, Votintseva AA, Foster D, Godwin H, Miller RR, Anson LW, Walker AS, Peto TEA, Crook DW, Knox K. 2017. Multi-site and nasal swabbing for carriage of Staphylococcus aureus: what does a single nose swab predict? J Hosp Infect, 96 (3), pp. 232-237.

BACKGROUND: Carriage of Staphylococcus aureus is a risk for infections. Targeted decolonization reduces postoperative infections but depends on accurate screening. AIM: To compare detection of S. aureus carriage in healthy individuals between anatomical sites and nurse- versus self-swabbing; also to determine whether a single nasal swab predicted carriage over four weeks. METHODS: Healthy individuals were recruited via general practices. After consent, nurses performed multi-site swabbing (nose, throat, and axilla). Participants performed nasal swabbing twice-weekly for four weeks. Swabs were returned by mail and cultured for S. aureus. All S. aureus isolates underwent spa typing. Persistent carriage in individuals returning more than three self-swabs was defined as culture of S. aureus from all or all but one self-swabs. FINDINGS: In all, 102 individuals underwent multi-site swabbing; S. aureus carriage was detected from at least one site from 40 individuals (39%). There was no difference between nose (29/102, 28%) and throat (28/102, 27%) isolation rates: the combination increased total detection rate by 10%. Ninety-nine patients returned any self-swab, and 96 returned more than three. Nasal carriage detection was not significantly different on nurse or self-swab [28/99 (74%) vs 26/99 (72%); χ2: P=0.75]. Twenty-two out of 25 participants with first self-swab positive were persistent carriers and 69/71 with first self-swab negative were not, giving high positive predictive value (88%), and very high negative predictive value (97%). CONCLUSION: Nasal swabs detected the majority of carriage; throat swabs increased detection by 10%. Self-taken nasal swabs were equivalent to nurse-taken swabs and predicted persistent nasal carriage over four weeks.
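The predictive values reported above follow directly from the counts in the abstract; as a quick arithmetic check (a sketch, not study code):

```python
# Predictive values of the first self-swab for persistent nasal carriage,
# recomputed from the counts reported in the abstract.
tp, fp = 22, 25 - 22   # first swab positive: 22/25 were persistent carriers
tn, fn = 69, 71 - 69   # first swab negative: 69/71 were not

ppv = tp / (tp + fp)   # positive predictive value, 22/25
npv = tn / (tn + fn)   # negative predictive value, 69/71
print(f"PPV {ppv:.0%}, NPV {npv:.0%}")  # PPV 88%, NPV 97%
```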

Walker AS, Mason A, Quan TP, Fawcett NJ, Watkinson P, Llewelyn M, Stoesser N, Finney J, Davies J, Wyllie DH et al. 2017. Mortality risks associated with emergency admissions during weekends and public holidays: an analysis of electronic health records. Lancet, 390 (10089), pp. 62-72.

BACKGROUND: Weekend hospital admission is associated with increased mortality, but the contributions of varying illness severity and admission time to this weekend effect remain unexplored. METHODS: We analysed unselected emergency admissions to four Oxford University National Health Service hospitals in the UK from Jan 1, 2006, to Dec 31, 2014. The primary outcome was death within 30 days of admission (in or out of hospital), analysed using Cox models measuring time from admission. The primary exposure was day of the week of admission. We adjusted for multiple confounders including demographics, comorbidities, and admission characteristics, incorporating non-linearity and interactions. Models then considered the effect of adjusting for 15 common haematology and biochemistry test results or proxies for hospital workload. FINDINGS: 257 596 individuals underwent 503 938 emergency admissions. 18 313 (4·7%) patients admitted as weekday emergency admissions and 6070 (5·1%) patients admitted as weekend emergency admissions died within 30 days (p<0·0001). 9347 individuals underwent 9707 emergency admissions on public holidays. 559 (5·8%) died within 30 days (p<0·0001 vs weekday). 15 routine haematology and biochemistry test results were highly prognostic for mortality. In 271 465 (53·9%) admissions with complete data, adjustment for test results explained 33% (95% CI 21 to 70) of the excess mortality associated with emergency admission on Saturdays compared with Wednesdays, 52% (lower 95% CI 34) on Sundays, and 87% (lower 95% CI 45) on public holidays after adjustment for standard patient characteristics. Excess mortality was predominantly restricted to admissions between 1100 h and 1500 h (p interaction=0·04). No hospital workload measure was independently associated with mortality (all p values >0·06). INTERPRETATION: Adjustment for routine test results substantially reduced excess mortality associated with emergency admission at weekends and public holidays.
Adjustment for patient-level factors not available in our study might further reduce the residual excess mortality, particularly as this clustered around midday at weekends. Hospital workload was not associated with mortality. Together, these findings suggest that the weekend effect arises from patient-level differences at admission rather than reduced hospital staffing or services. FUNDING: NIHR Oxford Biomedical Research Centre.

Price JR, Crook DW, Walker AS, Peto TEA, Llewelyn MJ, Paul J. 2017. Staphylococcus aureus in critical care - Authors' reply. Lancet Infect Dis, 17 (6), pp. 580-581.

Kekitiinwa A, Szubert AJ, Spyer M, Katuramu R, Musiime V, Mhute T, Bakeera-Kitaka S, Senfuma O, Walker AS, Gibb DM, ARROW Trial Team. 2017. Virologic Response to First-line Efavirenz- or Nevirapine-based Antiretroviral Therapy in HIV-infected African Children. Pediatr Infect Dis J, 36 (6), pp. 588-594.

BACKGROUND: Poorer virologic response to nevirapine- versus efavirenz-based antiretroviral therapy (ART) has been reported in adult systematic reviews and pediatric studies. METHODS: We compared drug discontinuation and viral load (VL) response in ART-naïve Ugandan/Zimbabwean children ≥3 years of age initiating ART with clinician-chosen nevirapine versus efavirenz in the ARROW trial. Predictors of suppression <80, <400 and <1000 copies/mL at 36, 48 and 144 weeks were identified using multivariable logistic regression with backwards elimination (P = 0.1). RESULTS: A total of 445 (53%) children received efavirenz and 391 (47%) nevirapine. Children receiving efavirenz were older (median age, 8.6 vs. 7.5 years nevirapine, P < 0.001) and had higher CD4% (12% vs. 10%, P = 0.05), but similar pre-ART VL (P = 0.17). The initial non-nucleoside-reverse-transcriptase-inhibitor (NNRTI) was permanently discontinued for adverse events in 7 of 445 (2%) children initiating efavirenz versus 9 of 391 (2%) initiating nevirapine (P = 0.46); at switch to second line in 17 versus 23, for tuberculosis in 0 versus 26, for pregnancy in 6 versus 0 and for other reasons in 15 versus 5. Early (36-48 weeks) virologic suppression <80 copies/mL was superior with efavirenz, particularly in children with higher pre-ART VL (P = 0.0004); longer-term suppression was superior with nevirapine in older children (P = 0.05). Early suppression was poorer in the youngest and oldest children, regardless of NNRTI (P = 0.02); longer-term suppression was poorer in those with higher pre-ART VL regardless of NNRTI (P = 0.05). Results were broadly similar for <400 and <1000 copies/mL. CONCLUSION: Short-term VL suppression favored efavirenz, but long-term relative performance was age dependent, with better suppression in older children with nevirapine, supporting World Health Organization recommendation that nevirapine remains an alternative NNRTI.

Paton NI, Kityo C, Thompson J, Nankya I, Bagenda L, Hoppe A, Hakim J, Kambugu A, van Oosterhout JJ, Kiconco M et al. 2017. Nucleoside reverse-transcriptase inhibitor cross-resistance and outcomes from second-line antiretroviral therapy in the public health approach: an observational analysis within the randomised, open-label, EARNEST trial. Lancet HIV, 4 (8), pp. e341-e348.

BACKGROUND: Cross-resistance after first-line antiretroviral therapy (ART) failure is expected to impair activity of nucleoside reverse-transcriptase inhibitors (NRTIs) in second-line therapy for patients with HIV, but evidence for the effect of cross-resistance on virological outcomes is limited. We aimed to assess the association between the activity, predicted by resistance testing, of the NRTIs used in second-line therapy and treatment outcomes for patients infected with HIV. METHODS: We did an observational analysis of additional data from a published open-label, randomised trial of second-line ART (EARNEST) in sub-Saharan Africa. 1277 adults or adolescents infected with HIV in whom first-line ART had failed (assessed by WHO criteria with virological confirmation) were randomly assigned to a boosted protease inhibitor (standardised to ritonavir-boosted lopinavir) with two to three NRTIs (clinician-selected, without resistance testing); or with raltegravir; or alone as protease inhibitor monotherapy (discontinued after week 96). We tested genotypic resistance on stored baseline samples in patients in the protease inhibitor and NRTI group and calculated the predicted activity of prescribed second-line NRTIs. We measured viral load in stored samples for all patients obtained every 12-16 weeks. This trial is registered with Controlled-Trials.com (number ISRCTN 37737787) and ClinicalTrials.gov (number NCT00988039). FINDINGS: Baseline genotypes were available in 391 (92%) of 426 patients in the protease inhibitor and NRTI group. 176 (89%) of 198 patients prescribed a protease inhibitor with no predicted-active NRTIs had viral suppression (viral load <400 copies per mL) at week 144, compared with 312 (81%) of 383 patients in the protease inhibitor and raltegravir group at week 144 (p=0·02) and 233 (61%) of 280 patients in the protease inhibitor monotherapy group at week 96 (p<0·0001). 
Compared with results with no active NRTIs, 95 (85%) of 112 patients with one predicted-active NRTI had viral suppression (p=0·3) and 20 (77%) of 26 patients with two or three active NRTIs had viral suppression (p=0·08). Over all follow-up, greater predicted NRTI activity was associated with worse viral load suppression (global p=0·0004). INTERPRETATION: Genotypic resistance testing might not accurately predict NRTI activity in protease inhibitor-based second-line ART. Our results do not support the introduction of routine resistance testing in ART programmes in low-income settings for the purpose of selecting second-line NRTIs. FUNDING: European and Developing Countries Clinical Trials Partnership, UK Medical Research Council, Instituto de Salud Carlos III, Irish Aid, Swedish International Development Cooperation Agency, Istituto Superiore di Sanità, WHO, Merck.

Gordon NC, Pichon B, Golubchik T, Wilson DJ, Paul J, Blanc DS, Cole K, Collins J, Cortes N, Cubbon M et al. 2017. Whole-Genome Sequencing Reveals the Contribution of Long-Term Carriers in Staphylococcus aureus Outbreak Investigation. J Clin Microbiol, 55 (7), pp. 2188-2197.

Whole-genome sequencing (WGS) makes it possible to determine the relatedness of bacterial isolates at a high resolution, thereby helping to characterize outbreaks. However, for Staphylococcus aureus, the accumulation of within-host diversity during carriage might limit the interpretation of sequencing data. In this study, we hypothesized the converse, namely, that within-host diversity can in fact be exploited to reveal the involvement of long-term carriers (LTCs) in outbreaks. We analyzed WGS data from 20 historical outbreaks and applied phylogenetic methods to assess genetic relatedness and to estimate the time to most recent common ancestor (TMRCA). The findings were compared with the routine investigation results and epidemiological evidence. Outbreaks with epidemiological evidence for an LTC source had a mean estimated TMRCA (adjusted for outbreak duration) of 243 days (95% highest posterior density interval [HPD], 143 to 343 days) compared with 55 days (95% HPD, 28 to 81 days) for outbreaks lacking epidemiological evidence for an LTC (P = 0.004). A threshold of 156 days predicted LTC involvement with a sensitivity of 0.875 and a specificity of 1. We also found 6/20 outbreaks included isolates with differing antimicrobial susceptibility profiles; however, these had only modestly increased pairwise diversity (mean 17.5 single nucleotide variants [SNVs] [95% confidence interval {CI}, 17.3 to 17.8]) compared with isolates with identical antibiograms (12.7 SNVs [95% CI, 12.5 to 12.8]) (P < 0.0001). Additionally, for 2 outbreaks, WGS identified 1 or more isolates that were genetically distinct despite having the outbreak pulsed-field gel electrophoresis (PFGE) pulsotype. The duration-adjusted TMRCA allowed the involvement of LTCs in outbreaks to be identified and could be used to decide whether screening for long-term carriage (e.g., in health care workers) is warranted. 
Requiring identical antibiograms to trigger investigation could miss important contributors to outbreaks.
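Classifying outbreaks by comparing the duration-adjusted TMRCA against the 156-day threshold is a simple cut-off rule. A sketch with hypothetical TMRCA values (in days), chosen here to reproduce the reported sensitivity of 0.875 and specificity of 1:

```python
def sens_spec(tmrcas_ltc, tmrcas_no_ltc, threshold=156):
    """Sensitivity/specificity of a TMRCA cut-off for flagging long-term
    carrier (LTC) involvement. Outbreaks at or above the threshold are
    called LTC-involved. TMRCA values below are hypothetical examples."""
    sens = sum(t >= threshold for t in tmrcas_ltc) / len(tmrcas_ltc)
    spec = sum(t < threshold for t in tmrcas_no_ltc) / len(tmrcas_no_ltc)
    return sens, spec

ltc = [243, 300, 180, 150, 200, 220, 260, 190]   # outbreaks with LTC evidence
no_ltc = [55, 40, 80, 30, 60, 70]                # outbreaks without
print(sens_spec(ltc, no_ltc))
```

In this toy example one LTC outbreak (TMRCA 150 days) falls below the cut-off, giving 7/8 sensitivity, while every non-LTC outbreak is correctly classified.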

Mawer DPC, Eyre DW, Griffiths D, Fawley WN, Martin JSH, Quan TP, Peto TEA, Crook DW, Walker AS, Wilcox MH. 2017. Contribution to Clostridium difficile Transmission of Symptomatic Patients With Toxigenic Strains Who Are Fecal Toxin Negative. Clin Infect Dis, 64 (9), pp. 1163-1170.

Background: The role of symptomatic patients who are toxigenic strain positive (TS+) but fecal toxin negative (FT-) in transmission of Clostridium difficile is currently unknown. Methods: We investigated the contribution of symptomatic TS+/FT- and TS+/FT+ patients in C. difficile transmission in 2 UK regions. From 2-step testing, all glutamate dehydrogenase (GDH)-positive specimens, regardless of fecal toxin result, from Oxford (April 2012 through April 2013) and Leeds (July 2012 through April 2013) microbiology laboratories underwent culture and whole-genome sequencing (WGS), using WGS to identify toxigenic strains. Plausible sources for each TS+/FT+ case, including TS+/FT- and TS+/FT+ patients, were determined using WGS, with and without hospital admission data. Results: A total of 1447 of 12772 (11%) fecal samples were GDH positive, 866 of 1447 (60%) contained toxigenic C. difficile, and fecal toxin was detected in 511 of 866 (59%), representing 235 Leeds and 191 Oxford TS+/FT+ cases. TS+/FT+ cases were 3 times more likely to be plausibly acquired from a previous TS+/FT+ case than a TS+/FT- patient. Fifty-one of 265 (19%) TS+/FT+ cases diagnosed >3 months into the study were genetically related (≤2 single-nucleotide polymorphisms) to ≥1 previous TS+/FT+ case or TS+/FT- patient: 27 (10%) to only TS+/FT+ cases, 9 (3%) to only TS+/FT- patients, and 15 (6%) to both. Only 10 of 265 (4%) were genetically related to a previous TS+/FT+ or TS+/FT- patient and shared the same ward simultaneously or within 28 days. Conclusions: Symptomatic TS+/FT- patients were a source of C. difficile transmission, although they accounted for less onward transmission than TS+/FT+ cases. Although transmission from symptomatic patients with either fecal toxin status accounted for a low overall proportion of new cases, both groups should be infection control targets.

Orlek A, Phan H, Sheppard AE, Doumith M, Ellington M, Peto T, Crook D, Walker AS, Woodford N, Anjum MF, Stoesser N. 2017. Ordering the mob: Insights into replicon and MOB typing schemes from analysis of a curated dataset of publicly available plasmids. Plasmid, 91 pp. 42-52.

Plasmid typing can provide insights into the epidemiology and transmission of plasmid-mediated antibiotic resistance. The principal plasmid typing schemes are replicon typing and MOB typing, which utilize variation in replication loci and relaxase proteins respectively. Previous studies investigating the proportion of plasmids assigned a type by these schemes ('typeability') have yielded conflicting results; moreover, thousands of plasmid sequences have been added to NCBI in recent years, without consistent annotation to indicate which sequences represent complete plasmids. Here, a curated dataset of complete Enterobacteriaceae plasmids from NCBI was compiled, and used to assess the typeability and concordance of in silico replicon and MOB typing schemes. Concordance was assessed at hierarchical replicon type resolutions, from replicon family-level to plasmid multilocus sequence type (pMLST)-level, where available. We found that 85% and 65% of the curated plasmids could be replicon and MOB typed, respectively. Overall, plasmid size and the number of resistance genes were significant independent predictors of replicon and MOB typing success. We found some degree of non-concordance between replicon families and MOB types, which was only partly resolved when partitioning plasmids into finer-resolution groups (replicon and pMLST types). In some cases, non-concordance was attributed to ambiguous boundaries between MOBP and MOBQ types; in other cases, backbone mosaicism was considered a more plausible explanation. β-lactamase resistance genes tended not to show fidelity to a particular plasmid type, though some previously reported associations were supported. Overall, replicon and MOB typing schemes are likely to continue playing an important role in plasmid analysis, but their performance is constrained by the diverse and dynamic nature of plasmid genomes.

Eyre DW, Dingle KE, Didelot X, Quan TP, Peto TEA, Wilcox MH, Walker AS, Crook DW. 2017. Clostridium difficile in England: can we stop washing our hands? - Authors' reply. Lancet Infect Dis, 17 (5), pp. 478-479.

Donker T, Henderson KL, Hopkins KL, Dodgson AR, Thomas S, Crook DW, Peto TEA, Johnson AP, Woodford N, Walker AS, Robotham JV. 2017. The relative importance of large problems far away versus small problems closer to home: insights into limiting the spread of antimicrobial resistance in England. BMC Med, 15 (1), pp. 86.

BACKGROUND: To combat the spread of antimicrobial resistance (AMR), hospitals are advised to screen high-risk patients for carriage of antibiotic-resistant bacteria on admission. This often includes patients previously admitted to hospitals with a high AMR prevalence. However, the ability of such a strategy to identify introductions (and hence prevent onward transmission) is unclear, as it depends on AMR prevalence in each hospital, the number of patients moving between hospitals, and the number of hospitals considered 'high risk'. METHODS: We tracked patient movements using data from the National Health Service of England Hospital Episode Statistics and estimated differences in regional AMR prevalences using, as an exemplar, data collected through the national reference laboratory service of Public Health England on carbapenemase-producing Enterobacteriaceae (CPE) from 2008 to 2014. Combining these datasets, we calculated expected CPE introductions into hospitals from across the hospital network to assess the effectiveness of admission screening based on defining high-prevalence hospitals as high risk. RESULTS: Based on numbers of exchanged patients, the English hospital network can be divided into 14 referral regions. England saw a sharp increase in numbers of CPE isolates referred to the national reference laboratory over 7 years, from 26 isolates in 2008 to 1649 in 2014. Large regional differences in numbers of confirmed CPE isolates overlapped with regional structuring of patient movements between hospitals. However, despite these large differences in prevalence between regions, we estimated that hospitals received only a small proportion (1.8%) of CPE-colonised patients from hospitals outside their own region, which decreased over time. 
CONCLUSIONS: In contrast to the focus on import screening based on assigning a few hospitals as 'high risk', patient transfers between hospitals with small AMR problems in the same region often pose a larger absolute threat than patient transfers from hospitals in other regions with large problems, even if the prevalence in other regions is orders of magnitude higher. Because the difference in numbers of exchanged patients, between and within regions, was mostly larger than the difference in CPE prevalence, it would be more effective for hospitals to focus on their own populations or region to inform control efforts rather than focussing on problems elsewhere.

Grady C, Touloumi G, Walker AS, Smolskis M, Sharma S, Babiker AG, Pantazis N, Tavel J, Florence E, Sanchez A et al. 2017. A randomized trial comparing concise and standard consent forms in the START trial. PLoS One, 12 (4), pp. e0172607.

BACKGROUND: Improving the effectiveness and efficiency of research informed consent is a high priority. Some express concern about longer, more complex, written consent forms creating barriers to participant understanding. A recent meta-analysis concluded that randomized comparisons were needed. METHODS: We conducted a cluster-randomized non-inferiority comparison of a standard versus concise consent form within a multinational trial studying the timing of starting antiretroviral therapy in HIV+ adults (START). Interested sites were randomized to standard or concise consent forms for all individuals signing START consent. Participants completed a survey measuring comprehension of study information and satisfaction with the consent process. Site personnel reported usual site consent practices. The primary outcome was comprehension of the purpose of randomization (pre-specified 7.5% non-inferiority margin). RESULTS: 77 sites (2429 participants) were randomly allocated to use standard consent and 77 sites (2000 participants) concise consent, for an evaluable cohort of 4229. Site and participant characteristics were similar for the two groups. The concise consent was non-inferior to the standard consent on comprehension of randomization (80.2% versus 82%, site adjusted difference: 0.75% (95% CI -3.8%, +5.2%)); and the two groups did not differ significantly on total comprehension score, satisfaction, or voluntariness (p>0.1). Certain independent factors, such as education, influenced comprehension and satisfaction but not differences between consent groups. CONCLUSIONS: An easier-to-read, more concise consent form neither hindered nor improved comprehension of study information or satisfaction with the consent process among a large number of participants. This supports continued efforts to make consent forms more efficient.
TRIAL REGISTRATION: Informed consent substudy was registered as part of START study in clinicaltrials.gov #NCT00867048, and EudraCT # 2008-006439-12.
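The non-inferiority conclusion rests on comparing the confidence interval for the difference in comprehension against the pre-specified 7.5% margin. A simplified, unadjusted sketch of that logic (the trial itself used a site-adjusted estimate; group sizes here are taken approximately from the abstract):

```python
import math

def noninferiority(p_std, n_std, p_new, n_new, margin, z=1.96):
    """Wald confidence interval for the difference in proportions
    (standard minus concise); the new arm is non-inferior if the upper
    bound of the CI stays below the margin. Unadjusted sketch only."""
    diff = p_std - p_new
    se = math.sqrt(p_std * (1 - p_std) / n_std + p_new * (1 - p_new) / n_new)
    lo, hi = diff - z * se, diff + z * se
    return diff, lo, hi, hi < margin

# Comprehension of randomization: ~82% of 2429 standard vs ~80.2% of 2000 concise
print(noninferiority(0.82, 2429, 0.802, 2000, margin=0.075))
```

Even this crude calculation gives an upper confidence bound well under 7.5%, consistent with the trial's adjusted finding of non-inferiority.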

Bienczak A, Denti P, Cook A, Wiesner L, Mulenga V, Kityo C, Kekitiinwa A, Gibb DM, Burger D, Walker AS, McIlleron H. 2017. Determinants of virological outcome and adverse events in African children treated with paediatric nevirapine fixed-dose-combination tablets. AIDS, 31 (7), pp. 905-915.

BACKGROUND: Nevirapine is the only nonnucleoside reverse transcriptase inhibitor currently available as a paediatric fixed-dose-combination tablet and is widely used in African children. Nonetheless, the number of investigations into pharmacokinetic determinants of virological suppression in African children is limited, and the predictive power of the current therapeutic range was never evaluated in this population, thereby limiting treatment optimization. METHODS: We analysed data from 322 African children (aged 0.3-13 years) treated with nevirapine, lamivudine, and either abacavir, stavudine, or zidovudine, and followed up to 144 weeks. Nevirapine trough concentration (Cmin) and other factors were tested for associations with viral load more than 100 copies/ml and transaminase increases more than grade 1 using proportional hazard and logistic models in 219 initially antiretroviral treatment (ART)-naive children. RESULTS: Pre-ART viral load, adherence, and nevirapine Cmin were associated with viral load nonsuppression [hazard ratio = 2.08 (95% confidence interval (CI): 1.50-2.90, P < 0.001) for 10-fold higher pre-ART viral load, hazard ratio = 0.78 (95% CI: 0.68-0.90, P < 0.001) for 10% improvement in adherence, and hazard ratio = 0.94 (95% CI: 0.90-0.99, P = 0.014) for a 1 mg/l increase in nevirapine Cmin]. There were additional effects of pre-ART CD4 cell percentage and clinical site. The risk of virological nonsuppression decreased with increasing nevirapine Cmin, and there was no clear Cmin threshold predictive of virological nonsuppression. Transient transaminase elevations more than grade 1 were associated with high Cmin (>12.4 mg/l), hazard ratio = 5.18 (95% CI 1.95-13.80, P < 0.001). 
CONCLUSION: Treatment initiation at lower pre-ART viral load and higher pre-ART CD4 cell percentage, increased adherence, and maintaining average Cmin higher than current target could improve virological suppression of African children treated with nevirapine without increasing toxicity.

Orlek A, Phan H, Sheppard AE, Doumith M, Ellington M, Peto T, Crook D, Walker AS, Woodford N, Anjum MF, Stoesser N. 2017. A curated dataset of complete Enterobacteriaceae plasmids compiled from the NCBI nucleotide database. Data Brief, 12, pp. 423-426.

Thousands of plasmid sequences are now publicly available in the NCBI nucleotide database, but they are not reliably annotated to distinguish complete plasmids from plasmid fragments, such as gene or contig sequences; therefore, retrieving complete plasmids for downstream analyses is challenging. Here we present a curated dataset of complete bacterial plasmids from the clinically relevant Enterobacteriaceae family. The dataset was compiled from the NCBI nucleotide database using curation steps designed to exclude incomplete plasmid sequences, and chromosomal sequences misannotated as plasmids. Over 2000 complete plasmid sequences are included in the curated plasmid dataset. Protein sequences produced from translating each complete plasmid nucleotide sequence in all 6 frames are also provided. Further analysis and discussion of the dataset are presented in an accompanying research article: "Ordering the mob: insights into replicon and MOB typing…" (Orlek et al., 2017) [1]. The curated plasmid sequences are publicly available in the Figshare repository.

Eyre DW, De Silva D, Cole K, Peters J, Cole MJ, Grad YH, Demczuk W, Martin I, Mulvey MR, Crook DW et al. 2017. WGS to predict antibiotic MICs for Neisseria gonorrhoeae. J Antimicrob Chemother, 72 (7), pp. 1937-1947.

Background: Tracking the spread of antimicrobial-resistant Neisseria gonorrhoeae is a major priority for national surveillance programmes. Objectives: We investigate whether WGS and simultaneous analysis of multiple resistance determinants can be used to predict antimicrobial susceptibilities to the level of MICs in N. gonorrhoeae. Methods: WGS was used to identify previously reported potential resistance determinants in 681 N. gonorrhoeae isolates, from England, the USA and Canada, with phenotypes for cefixime, penicillin, azithromycin, ciprofloxacin and tetracycline determined as part of national surveillance programmes. Multivariate linear regression models were used to identify genetic predictors of MIC. Model performance was assessed using leave-one-out cross-validation. Results: Overall 1785/3380 (53%) MIC values were predicted to the nearest doubling dilution and 3147 (93%) within ±1 doubling dilution and 3314 (98%) within ±2 doubling dilutions. MIC prediction performance was similar across the five antimicrobials tested. Prediction models included the majority of previously reported resistance determinants. Applying EUCAST breakpoints to MIC predictions, the overall very major error (VME; phenotypically resistant, WGS-prediction susceptible) rate was 21/1577 (1.3%, 95% CI 0.8%-2.0%) and the major error (ME; phenotypically susceptible, WGS-prediction resistant) rate was 20/1186 (1.7%, 1.0%-2.6%). VME rates met regulatory thresholds for all antimicrobials except cefixime and ME rates for all antimicrobials except tetracycline. Country of testing was a strongly significant predictor of MIC for all five antimicrobials. Conclusions: We demonstrate a WGS-based MIC prediction approach that allows reliable MIC prediction for five gonorrhoea antimicrobials. Our approach should allow reasonably precise prediction of MICs for a range of bacterial species.
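The MIC-prediction approach described above, multivariate linear regression on the log2 (doubling-dilution) scale assessed by leave-one-out cross-validation, can be sketched as follows. The isolates, resistance determinants, and MIC values below are invented toy data, not those of the study:

```python
import numpy as np

# Toy data: rows are isolates, columns indicate presence (1) or absence (0)
# of two hypothetical resistance determinants (not the study's determinants).
X = np.array([[0, 0], [1, 0], [0, 1], [1, 1],
              [0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
# Observed MICs (mg/L), modelled on the log2 scale so that each determinant
# shifts the prediction by a number of doubling dilutions.
mic = np.array([0.03, 0.125, 0.25, 1.0, 0.03, 0.125, 0.25, 2.0])
y = np.log2(mic)

def loo_predictions(X, y):
    """Leave-one-out cross-validation of an ordinary least-squares model."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        A = np.column_stack([np.ones(keep.sum()), X[keep]])
        beta, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        preds[i] = np.concatenate([[1.0], X[i]]) @ beta
    return preds

pred = loo_predictions(X, y)
# Fraction of isolates predicted within +/-1 doubling dilution,
# the accuracy measure used in the abstract.
within_one = float(np.mean(np.abs(pred - y) <= 1))
```

The study's models included many determinants per antimicrobial; this sketch only shows the log2-scale regression and cross-validation machinery.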

Votintseva AA, Bradley P, Pankhurst L, Del Ojo Elias C, Loose M, Nilgiriwala K, Chatterjee A, Smith EG, Sanderson N, Walker TM et al. 2017. Same-Day Diagnostic and Surveillance Data for Tuberculosis via Whole-Genome Sequencing of Direct Respiratory Samples. J Clin Microbiol, 55 (5), pp. 1285-1298.

Routine full characterization of Mycobacterium tuberculosis is culture based, taking many weeks. Whole-genome sequencing (WGS) can generate antibiotic susceptibility profiles to inform treatment, augmented with strain information for global surveillance; such data could be transformative if provided at or near the point of care. We demonstrate a low-cost method of DNA extraction directly from patient samples for M. tuberculosis WGS. We initially evaluated the method by using the Illumina MiSeq sequencer (40 smear-positive respiratory samples obtained after routine clinical testing and 27 matched liquid cultures). M. tuberculosis was identified in all 39 samples from which DNA was successfully extracted. Sufficient data for antibiotic susceptibility prediction were obtained from 24 (62%) samples; all results were concordant with reference laboratory phenotypes. Phylogenetic placement was concordant between direct and cultured samples. With Illumina MiSeq/MiniSeq, the workflow from patient sample to results can be completed in 44/16 h at a reagent cost of £96/£198 per sample. We then employed a nonspecific PCR-based library preparation method for sequencing on an Oxford Nanopore Technologies MinION sequencer. We applied this to cultured Mycobacterium bovis strain BCG DNA and to combined culture-negative sputum DNA and BCG DNA. For flow cell version R9.4, the estimated turnaround time from patient to identification of BCG, detection of pyrazinamide resistance, and phylogenetic placement was 7.5 h, with full susceptibility results 5 h later. Antibiotic susceptibility predictions were fully concordant. A critical advantage of MinION is the ability to continue sequencing until sufficient coverage is obtained, providing a potential solution to the problem of variable amounts of M. tuberculosis DNA in direct samples.

Kizny Gordon AE, Mathers AJ, Cheong EYL, Gottlieb T, Kotay S, Walker AS, Peto TEA, Crook DW, Stoesser N. 2017. The Hospital Water Environment as a Reservoir for Carbapenem-Resistant Organisms Causing Hospital-Acquired Infections-A Systematic Review of the Literature. Clin Infect Dis, 64 (10), pp. 1435-1444.

Over the last 20 years there have been 32 reports of carbapenem-resistant organisms in the hospital water environment, with half of these occurring since 2010. The majority of these reports have described associated clinical outbreaks in the intensive care setting, affecting the critically ill and the immunocompromised. Drains, sinks, and faucets were most frequently colonized, and Pseudomonas aeruginosa was the predominant organism. Imipenemase (IMP), Klebsiella pneumoniae carbapenemase (KPC), and Verona integron-encoded metallo-β-lactamase (VIM) were the most common carbapenemases found. Molecular typing was performed in almost all studies, with pulsed-field gel electrophoresis being most commonly used. Seventy-two percent of studies reported controlling outbreaks, of which just more than one-third eliminated the organism from the water environment. A combination of interventions seems to be most successful, including reinforcement of general infection control measures, alongside chemical disinfection. The most appropriate disinfection method remains unclear, however, and it is likely that replacement of colonized water reservoirs may be required for long-term clearance.

Orlek A, Stoesser N, Anjum MF, Doumith M, Ellington MJ, Peto T, Crook D, Woodford N, Walker AS, Phan H, Sheppard AE. 2017. Plasmid Classification in an Era of Whole-Genome Sequencing: Application in Studies of Antibiotic Resistance Epidemiology. Front Microbiol, 8 (FEB), pp. 182.

Plasmids are extra-chromosomal genetic elements ubiquitous in bacteria, and commonly transmissible between host cells. Their genomes include variable repertoires of 'accessory genes,' such as antibiotic resistance genes, as well as 'backbone' loci which are largely conserved within plasmid families, and often involved in key plasmid-specific functions (e.g., replication, stable inheritance, mobility). Classifying plasmids into different types according to their phylogenetic relatedness provides insight into the epidemiology of plasmid-mediated antibiotic resistance. Current typing schemes exploit backbone loci associated with replication (replicon typing), or plasmid mobility (MOB typing). Conventional PCR-based methods for plasmid typing remain widely used. With the emergence of whole-genome sequencing (WGS), large datasets can be analyzed using in silico plasmid typing methods. However, short reads from popular high-throughput sequencers can be challenging to assemble, so complete plasmid sequences may not be accurately reconstructed. Therefore, localizing resistance genes to specific plasmids may be difficult, limiting epidemiological insight. Long-read sequencing will become increasingly popular as costs decline, especially when resolving accurate plasmid structures is the primary goal. This review discusses the application of plasmid classification in WGS-based studies of antibiotic resistance epidemiology; novel in silico plasmid analysis tools are highlighted. Due to the diverse and plastic nature of plasmid genomes, current typing schemes do not classify all plasmids, and identifying conserved, phylogenetically concordant genes for subtyping and phylogenetics is challenging. Analyzing plasmids as nodes in a network that represents gene-sharing relationships between plasmids provides a complementary way to assess plasmid diversity, and allows inferences about horizontal gene transfer to be made.
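The gene-sharing network described in the final sentence can be sketched briefly: plasmids become nodes, and an edge links any pair sharing at least a minimum number of genes. The plasmid names and gene repertoires below are hypothetical, invented purely to illustrate the construction:

```python
# Hypothetical gene repertoires for four plasmids (illustrative only).
plasmids = {
    "pA": {"repA", "blaCTX-M-15", "traU"},
    "pB": {"repA", "blaCTX-M-15", "aac(3)-II"},
    "pC": {"repF", "tetA"},
    "pD": {"repF", "tetA", "traU"},
}

def gene_sharing_edges(plasmids, min_shared=2):
    """Edges of a network whose nodes are plasmids and whose edges link
    pairs sharing at least `min_shared` genes; edge weight = genes shared."""
    names = sorted(plasmids)
    edges = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = plasmids[a] & plasmids[b]
            if len(shared) >= min_shared:
                edges.append((a, b, len(shared)))
    return edges

edges = gene_sharing_edges(plasmids)
```

Clusters in such a network can then be read as groups of plasmids with shared gene content, complementing replicon/MOB typing.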

Price JR, Cole K, Bexley A, Kostiou V, Eyre DW, Golubchik T, Wilson DJ, Crook DW, Walker AS, Peto TEA et al. 2017. Transmission of Staphylococcus aureus between health-care workers, the environment, and patients in an intensive care unit: a longitudinal cohort study based on whole-genome sequencing. Lancet Infect Dis, 17 (2), pp. 207-214.

BACKGROUND: Health-care workers have been implicated in nosocomial outbreaks of Staphylococcus aureus, but the dearth of evidence from non-outbreak situations means that routine health-care worker screening and S aureus eradication are controversial. We aimed to determine how often S aureus is transmitted from health-care workers or the environment to patients in an intensive care unit (ICU) and a high-dependency unit (HDU) where standard infection control measures were in place. METHODS: In this longitudinal cohort study, we systematically sampled health-care workers, the environment, and patients over 14 months at the ICU and HDU of the Royal Sussex County Hospital, Brighton, England. Nasal swabs were taken from health-care workers every 4 weeks, bed spaces were sampled monthly, and screening swabs were obtained from patients at admission to the ICU or HDU, weekly thereafter, and at discharge. Isolates were cultured and their whole genome sequenced, and we used the threshold of 40 single-nucleotide variants (SNVs) or fewer to define subtypes and infer recent transmission. FINDINGS: Between Oct 31, 2011, and Dec 23, 2012, we sampled 198 health-care workers, 40 environmental locations, and 1854 patients; 1819 isolates were sequenced. Median nasal carriage rate of S aureus in health-care workers at 4-weekly timepoints was 36·9% (IQR 35·7-37·3), and 115 (58%) health-care workers had S aureus detected at least once during the study. S aureus was identified in 8-50% of environmental samples. 605 genetically distinct subtypes were identified (median SNV difference 273, IQR 162-399) at a rate of 38 (IQR 34-42) per 4-weekly cycle. Only 25 instances of transmission to patients (seven from health-care workers, two from the environment, and 16 from other patients) were detected. INTERPRETATION: In the presence of standard infection control measures, health-care workers were infrequently sources of transmission to patients. 
S aureus epidemiology in the ICU and HDU is characterised by continuous ingress of distinct subtypes rather than transmission of genetically related strains. FUNDING: UK Medical Research Council, Wellcome Trust, Biotechnology and Biological Sciences Research Council, UK National Institute for Health Research, and Public Health England.
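The transmission-inference rule used in the study, a threshold of 40 or fewer single-nucleotide variants (SNVs) between isolates, can be sketched directly. The isolate pairs and SNV distances below are invented for illustration:

```python
# Illustrative pairwise SNV distances between sequenced isolates;
# the labels and values are made up, not taken from the study.
snv_distance = {
    ("patient1", "hcw3"): 12,
    ("patient1", "patient2"): 480,
    ("patient2", "environment_bed4"): 31,
    ("hcw3", "environment_bed4"): 350,
}

SNV_THRESHOLD = 40  # threshold used in the study to infer recent transmission

def inferred_transmissions(distances, threshold=SNV_THRESHOLD):
    """Pairs whose isolates differ by no more than `threshold` SNVs,
    consistent with recent transmission."""
    return sorted(pair for pair, d in distances.items() if d <= threshold)

links = inferred_transmissions(snv_distance)
```

In the study this rule also defined the 605 genetically distinct subtypes (median pairwise difference 273 SNVs, far above the threshold).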

Kityo C, Thompson J, Nankya I, Hoppe A, Ndashimye E, Warambwa C, Mambule I, van Oosterhout JJ, Wools-Kaloustian K, Bertagnolio S et al. 2017. HIV Drug Resistance Mutations in Non-B Subtypes After Prolonged Virological Failure on NNRTI-Based First-Line Regimens in Sub-Saharan Africa. J Acquir Immune Defic Syndr, 75 (2), pp. e45-e54.

OBJECTIVE: To determine drug resistance mutation (DRM) patterns in a large cohort of patients failing nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line antiretroviral therapy regimens in programs without routine viral load (VL) monitoring and to examine intersubtype differences in DRMs. DESIGN: Sequences from 787 adults/adolescents who failed an NNRTI-based first-line regimen in 13 clinics in Uganda, Kenya, Zimbabwe, and Malawi were analyzed. Multivariable logistic regression was used to determine the association between specific DRMs and Stanford intermediate-/high-level resistance and factors including REGA subtype, first-line antiretroviral therapy drugs, CD4, and VL at failure. RESULTS: The median first-line treatment duration was 4 years (interquartile range 30-43 months); 42% of participants had VL ≥100,000 copies/mL and 63% of participants had CD4 <100 cells/mm³. Viral subtype distribution was A1 (40%; Uganda and Kenya), C (31%; Zimbabwe and Malawi), and D (25%; Uganda and Kenya), and recombinant/unclassified (5%). In general, DRMs were more common in subtype-C than in subtype-A and/or subtype-D (nucleoside reverse transcriptase inhibitor mutations K65R and Q151M; NNRTI mutations E138A, V106M, Y181C, K101E, and H221Y). The presence of tenofovir resistance was similar between subtypes [P (adjusted) = 0.32], but resistance to zidovudine, abacavir, etravirine, or rilpivirine was more common in subtype-C than in subtype-D/subtype-A [P (adjusted) < 0.02]. CONCLUSIONS: Non-B subtypes differ in DRMs at first-line failure, which impacts on residual nucleoside reverse transcriptase inhibitor and NNRTI susceptibility. In particular, higher rates of etravirine and rilpivirine resistance in subtype-C may limit their potential utility in salvage regimens.

Dingle KE, Didelot X, Quan TP, Eyre DW, Stoesser N, Golubchik T, Harding RM, Wilson DJ, Griffiths D, Vaughan A et al. 2017. Effects of control interventions on Clostridium difficile infection in England: an observational study. Lancet Infect Dis, 17 (4), pp. 411-421.

BACKGROUND: The control of Clostridium difficile infections is an international clinical challenge. The incidence of C difficile in England declined by roughly 80% after 2006, following the implementation of national control policies; we tested two hypotheses to investigate their role in this decline. First, if C difficile infection declines in England were driven by reductions in use of particular antibiotics, then incidence of C difficile infections caused by resistant isolates should decline faster than that caused by susceptible isolates across multiple genotypes. Second, if C difficile infection declines were driven by improvements in hospital infection control, then transmitted (secondary) cases should decline regardless of susceptibility. METHODS: Regional (Oxfordshire and Leeds, UK) and national data for the incidence of C difficile infections and antimicrobial prescribing data (1998-2014) were combined with whole genome sequences from 4045 national and international C difficile isolates. Genotype (multilocus sequence type) and fluoroquinolone susceptibility were determined from whole genome sequences. The incidence of C difficile infections caused by fluoroquinolone-resistant and fluoroquinolone-susceptible isolates was estimated with negative-binomial regression, overall and per genotype. Selection and transmission were investigated with phylogenetic analyses. FINDINGS: National fluoroquinolone and cephalosporin prescribing correlated highly with incidence of C difficile infections (cross-correlations >0·88), by contrast with total antibiotic prescribing (cross-correlations <0·59). Regionally, C difficile decline was driven by elimination of fluoroquinolone-resistant isolates (approximately 67% of Oxfordshire infections in September, 2006, falling to approximately 3% in February, 2013; annual incidence rate ratio 0·52, 95% CI 0·48-0·56 vs fluoroquinolone-susceptible isolates: 1·02, 0·97-1·08). 
C difficile infections caused by fluoroquinolone-resistant isolates declined in four distinct genotypes (p<0·01). The regions of phylogenies containing fluoroquinolone-resistant isolates were short-branched and geographically structured, consistent with selection and rapid transmission. The importance of fluoroquinolone restriction over infection control was shown by significant declines in inferred secondary (transmitted) cases caused by fluoroquinolone-resistant isolates with or without hospital contact (p<0·0001) versus no change in either group of cases caused by fluoroquinolone-susceptible isolates (p>0·2). INTERPRETATION: Restricting fluoroquinolone prescribing appears to explain the decline in incidence of C difficile infections, above other measures, in Oxfordshire and Leeds, England. Antimicrobial stewardship should be a central component of C difficile infection control programmes. FUNDING: UK Clinical Research Collaboration (Medical Research Council, Wellcome Trust, National Institute for Health Research); NIHR Oxford Biomedical Research Centre; NIHR Health Protection Research Unit on Healthcare Associated Infection and Antimicrobial Resistance (Oxford University in partnership with Public Health England [PHE]), and on Modelling Methodology (Imperial College, London in partnership with PHE); and the Health Innovation Challenge Fund.
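The headline comparison in the findings, strong cross-correlation between fluoroquinolone/cephalosporin prescribing and C difficile incidence versus weak correlation with total prescribing, can be sketched with Pearson correlation at lag 0. The yearly series below are made up to mimic the qualitative pattern and are not the study's data:

```python
import numpy as np

# Invented yearly series: fluoroquinolone prescribing falls and C. difficile
# incidence falls with it, while total antibiotic prescribing stays flat.
fluoroquinolone_ddd = np.array([100, 95, 70, 40, 25, 20, 18], dtype=float)
total_antibiotic_ddd = np.array([500, 498, 505, 500, 497, 503, 499], dtype=float)
cdiff_incidence = np.array([120, 110, 80, 45, 30, 22, 20], dtype=float)

def cross_correlation(x, y):
    """Pearson correlation between two equal-length series (lag 0)."""
    return float(np.corrcoef(x, y)[0, 1])

r_fq = cross_correlation(fluoroquinolone_ddd, cdiff_incidence)
r_total = cross_correlation(total_antibiotic_ddd, cdiff_incidence)
```

With this toy data, r_fq is close to 1 while r_total is near 0, the same contrast (>0·88 versus <0·59) the study reports.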

Bienczak A, Cook A, Wiesner L, Mulenga V, Kityo C, Kekitiinwa A, Walker AS, Owen A, Gibb DM, Burger D et al. 2017. Effect of diurnal variation, CYP2B6 genotype and age on the pharmacokinetics of nevirapine in African children. J Antimicrob Chemother, 72 (1), pp. 190-199. | Show Abstract | Read more

OBJECTIVES: To characterize the effects of CYP2B6 polymorphisms, diurnal variation and demographic factors on nevirapine pharmacokinetics in African children. METHODS: Non-linear mixed-effects modelling conducted in NONMEM 7.3 described nevirapine plasma concentration-time data from 414 children aged 0.3-15 years. RESULTS: Nevirapine pharmacokinetics was best described using a one-compartment disposition model with elimination through a well-stirred liver model accounting for a first-pass effect and transit-compartment absorption. Intrinsic clearance was affected by diurnal variation (characterized using a cosine function with peak amplitude 29% at 12 noon) and CYP2B6 metabolizer status [extensive metabolizer (EM) 516GG|983TT, reference; intermediate metabolizer (IM) 516GT|983TT or 516GG|983TC, 17% lower; slow metabolizer (SM) 516TT|983TT or 516GT|983TC, 50% lower; ultra-slow metabolizer (USM) 516GG|983CC, 68% lower]. Age was found to affect pre-hepatic bioavailability: 31.7% lower at birth and increasing exponentially. Median (90% CI) evening Cmin values in the different metabolizer groups were 5.01 (3.01-7.47), 6.55 (3.65-13.32), 11.59 (5.44-22.71) and 12.32 (12.32-27.25) mg/L, respectively. Evening Cmin values were <3 mg/L in 43% of EM weighing <6 kg and 26% of IM weighing <6 kg, while 73% of SM and 88% of USM in all weight-bands had evening Cmin values >8 mg/L. Cmin was not markedly affected by administration time, but was altered by unequal splitting of the daily dose. CONCLUSIONS: Diurnal variation does not greatly affect nevirapine exposure. However, when daily doses cannot be split equally, the larger dose should be given in the morning. To achieve homogeneous exposures, nevirapine doses for SM and USM should be reduced by 50%, and children weighing <6 kg with EM or IM metabolizer status should receive the same dose as children weighing 6-10 kg.
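The diurnal effect on intrinsic clearance, characterized in the model as a cosine with 29% peak amplitude at 12 noon, can be written out directly. This is a sketch of that one model component; the exact parameterization inside the NONMEM model may differ:

```python
import math

def relative_intrinsic_clearance(hour, amplitude=0.29, peak_hour=12):
    """Diurnal multiplier on intrinsic clearance: a cosine with a 24 h
    period, peaking at `peak_hour` (12 noon) with 29% peak amplitude,
    as described in the abstract."""
    return 1 + amplitude * math.cos(2 * math.pi * (hour - peak_hour) / 24)

noon = relative_intrinsic_clearance(12)     # peak: 29% above average
midnight = relative_intrinsic_clearance(0)  # trough: 29% below average
```

The peak-to-trough ratio (1.29/0.71) shows why evening Cmin values run higher than morning ones under equal split dosing.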

Mathers AJ, Stoesser N, Chai W, Carroll J, Barry K, Cherunvanky A, Sebra R, Kasarskis A, Peto TE, Walker AS et al. 2017. Chromosomal Integration of the Klebsiella pneumoniae Carbapenemase Gene, blaKPC, in Klebsiella Species Is Elusive but Not Rare. Antimicrob Agents Chemother, 61 (3).

Carbapenemase genes in Enterobacteriaceae are mostly described as being plasmid associated. However, the genetic context of carbapenemase genes is not always confirmed in epidemiological surveys, and the frequency of their chromosomal integration therefore is unknown. A previously sequenced collection of blaKPC-positive Enterobacteriaceae from a single U.S. institution (2007 to 2012; n = 281 isolates from 182 patients) was analyzed to identify chromosomal insertions of Tn4401, the transposon most frequently harboring blaKPC. Using a combination of short- and long-read sequencing, we confirmed five independent chromosomal integration events from 6/182 (3%) patients, corresponding to 15/281 (5%) isolates. Three patients had isolates identified by perirectal screening, and three had infections which were all successfully treated. When a single copy of blaKPC was in the chromosome, one or both of the phenotypic carbapenemase tests were negative. All chromosomally integrated blaKPC genes were from Klebsiella spp., predominantly K. pneumoniae clonal group 258 (CG258), even though these represented only a small proportion of the isolates. Integration occurred via IS15-ΔI-mediated transposition of a larger, composite region encompassing Tn4401 at one locus of chromosomal integration, seen in the same strain (K. pneumoniae ST340) in two patients. In summary, we identified five independent chromosomal integrations of blaKPC in a large outbreak, demonstrating that this is not a rare event. blaKPC was more frequently integrated into the chromosome of epidemic CG258 K. pneumoniae lineages (ST11, ST258, and ST340) and was more difficult to detect by routine phenotypic methods in this context. The presence of chromosomally integrated blaKPC within successful, globally disseminated K. pneumoniae strains therefore is likely underestimated.

Prendergast AJ, Bwakura-Dangarembizi M, Mugyenyi P, Lutaakome J, Kekitiinwa A, Thomason MJ, Gibb DM, Walker AS, ARROW Trial Team. 2016. Reduced bacterial skin infections in HIV-infected African children randomized to long-term cotrimoxazole prophylaxis. AIDS, 30 (18), pp. 2823-2829.

OBJECTIVE: To evaluate whether cotrimoxazole prophylaxis prevents common skin conditions in HIV-infected children. DESIGN: Open-label randomized controlled trial of continuing versus stopping daily cotrimoxazole (post-hoc analysis). SETTING: Three sites in Uganda and one in Zimbabwe. PARTICIPANTS: A total of 758 children aged more than 3 years receiving antiretroviral therapy (ART) for more than 96 weeks in the ARROW trial were randomized to stop (n = 382) or continue (n = 376) cotrimoxazole after median (interquartile range) 2.1 (1.8, 2.2) years on ART. INTERVENTION: Continuing versus stopping daily cotrimoxazole. MAIN OUTCOME MEASURES: Nurses screened for signs/symptoms at 6-week visits. This was a secondary analysis of ARROW trial data, with skin complaints categorized blind to randomization as bacterial, fungal, or viral infections; dermatitis; pruritic papular eruptions (PPEs); or others (blisters, desquamation, ulcers, and urticaria). Proportions ever reporting each skin complaint were compared across randomized groups using logistic regression. RESULTS: At randomization, median (interquartile range) age was 7 (4, 11) years and CD4 was 33% (26, 39); 73% had WHO stage 3/4 disease. Fewer children continuing cotrimoxazole reported bacterial skin infections over median 2 years follow-up (15 versus 33%, respectively; P < 0.001), with similar trends for PPE (P = 0.06) and other skin complaints (P = 0.11), but not for fungal (P = 0.45) or viral (P = 0.23) infections or dermatitis (P = 1.0). Bacterial skin infections were also reported at significantly fewer clinic visits (1.2 versus 3.0%, P < 0.001). Independent of cotrimoxazole, bacterial skin infections were more common in children aged 6-8 years, with current CD4 cell count less than 500 cells/μl, WHO stage 3/4, less time on ART, and lower socio-economic status.
CONCLUSION: Long-term cotrimoxazole prophylaxis reduces common skin complaints, highlighting an additional benefit for long-term prophylaxis in sub-Saharan Africa.

Abongomera G, Cook A, Musiime V, Chabala C, Lamorde M, Abach J, Thomason M, Mulenga V, Kekitiinwa A, Colebunders R et al. 2017. Improved Adherence to Antiretroviral Therapy Observed Among HIV-Infected Children Whose Caregivers had Positive Beliefs in Medicine in Sub-Saharan Africa. AIDS Behav, 21 (2), pp. 441-449.

A high level of adherence to antiretroviral treatment is essential for optimal clinical outcomes in HIV infection, but measuring adherence is difficult. We investigated whether responses to a questionnaire eliciting caregiver beliefs in medicines were associated with adherence of their child (median age 2.8 years), and whether this in turn was associated with viral suppression. We used the validated beliefs in medicine questionnaire (BMQ) to measure caregiver beliefs, and medication event monitoring system caps to measure adherence. We found significant associations between BMQ scores and adherence, and between adherence and viral suppression. Among children initiating antiretroviral therapy (ART), we also found significant associations between BMQ 'necessity' scores, and BMQ 'necessity-concerns' scores, and later viral suppression. This suggests that the BMQ may be a valuable tool when used alongside other adherence measures, and that it remains important to keep caregivers well informed about the long-term necessity of their child's ART.

Bienczak A, Denti P, Cook A, Wiesner L, Mulenga V, Kityo C, Kekitiinwa A, Gibb DM, Burger D, Walker AS, McIlleron H. 2016. Plasma Efavirenz Exposure, Sex, and Age Predict Virological Response in HIV-Infected African Children. J Acquir Immune Defic Syndr, 73 (2), pp. 161-168.

BACKGROUND: Owing to insufficient evidence in children, target plasma concentrations of efavirenz are based on studies in adults. Our analysis aimed to evaluate the pediatric therapeutic thresholds and characterize the determinants of virological suppression in African children. METHODS: We analyzed data from 128 African children (aged 1.7-13.5 years) treated with efavirenz, lamivudine, and one among abacavir, stavudine, or zidovudine, and followed up to 36 months. Individual pharmacokinetic (PK) measures [plasma concentration 12 hours after dose (C12h), plasma concentration 24 hours after dose (C24h), and area under the curve (AUC0-24)] were estimated using population PK modeling. Cox multiple failure regression and multivariable fractional polynomials were used to investigate the risks of unsuppressed viral load associated with efavirenz exposure and other factors among 106 initially treatment-naive children, and likelihood profiling was used to identify the most predictive PK thresholds. RESULTS: The risk of viral load >100 copies per milliliter decreased by 42% for every 2-fold increase in efavirenz mid-dose concentration [95% confidence interval (CI): 23% to 57%; P < 0.001]. The most predictive PK thresholds for increased risk of unsuppressed viral load were C12h 1.12 mg/L [hazard ratio (HR): 6.14; 95% CI: 2.64 to 14.27], C24h 0.65 mg/L (HR: 6.57; 95% CI: 2.86 to 15.10), and AUC0-24 28 mg·h/L (HR: 5.77; 95% CI: 2.28 to 14.58). Children older than 8 years had a more than 10-fold increased risk of virological nonsuppression (P = 0.005); among children younger than 8 years, boys had a 5.31 times higher risk than girls (P = 0.007). Central nervous system adverse events were infrequently reported. CONCLUSIONS: Our analysis suggests that the minimum target C24h and AUC0-24 could be lowered in children. Our findings should be confirmed in a prospective pediatric trial.
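Likelihood profiling over candidate pharmacokinetic thresholds, as used above to identify the most predictive cut-points, can be sketched as a grid search for the threshold that best separates risk groups. The study profiled within Cox regression on time-to-event data; this sketch substitutes a simple binomial two-group likelihood, and the concentrations and outcomes are invented:

```python
import math

# Invented data: (C12h in mg/L, 1 if viral load was unsuppressed).
children = [(0.4, 1), (0.6, 1), (0.9, 1), (1.0, 0), (1.3, 0),
            (1.8, 0), (2.2, 1), (2.6, 0), (3.1, 0), (4.0, 0)]

def log_likelihood(threshold, data):
    """Binomial log-likelihood of a two-group model: children below the
    threshold share one risk of non-suppression, those above another."""
    ll = 0.0
    for below in (True, False):
        obs = [y for c, y in data if (c < threshold) == below]
        if not obs:
            continue
        p = min(max(sum(obs) / len(obs), 1e-9), 1 - 1e-9)
        ll += sum(y * math.log(p) + (1 - y) * math.log(1 - p) for y in obs)
    return ll

# Profile the likelihood over candidate thresholds and keep the best.
candidates = [0.5, 0.75, 1.0, 1.25, 1.5, 2.0]
best = max(candidates, key=lambda t: log_likelihood(t, children))
```

The chosen threshold is the one whose split makes the observed outcomes most probable, the same profiling logic applied in the study, though against a hazard model rather than this simplified likelihood.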

Shaw L, Harjunmaa U, Doyle R, Mulewa S, Charlie D, Maleta K, Callard R, Walker AS, Balloux F, Ashorn P, Klein N. 2016. Distinguishing the Signals of Gingivitis and Periodontitis in Supragingival Plaque: a Cross-Sectional Cohort Study in Malawi. Appl Environ Microbiol, 82 (19), pp. 6057-6067.

UNLABELLED: Periodontal disease ranges from gingival inflammation (gingivitis) to the inflammation and loss of tooth-supporting tissues (periodontitis). Previous research has focused mainly on subgingival plaque, but supragingival plaque composition is also known to be associated with disease. Quantitative modeling of bacterial abundances across the natural range of periodontal severities can distinguish which features of disease are associated with particular changes in composition. We assessed a cross-sectional cohort of 962 Malawian women for periodontal disease and used 16S rRNA gene amplicon sequencing (V5 to V7 region) to characterize the bacterial compositions of supragingival plaque samples. Associations between bacterial relative abundances and gingivitis/periodontitis were investigated by using negative binomial models, adjusting for epidemiological factors. We also examined bacterial cooccurrence networks to assess community structure. The main differences in supragingival plaque compositions were associated more with gingivitis than periodontitis, including higher bacterial diversity and a greater abundance of particular species. However, even after controlling for gingivitis, the presence of subgingival periodontitis was associated with an altered supragingival plaque. A small number of species were associated with periodontitis but not gingivitis, including members of Prevotella, Treponema, and Selenomonas, supporting a more complex disease model than a linear progression following gingivitis. Cooccurrence networks of periodontitis-associated taxa clustered according to periodontitis across all gingivitis severities. Species including Filifactor alocis and Fusobacterium nucleatum were central to this network, which supports their role in the coaggregation of periodontal biofilms during disease progression. 
Our findings confirm that periodontitis cannot be considered simply an advanced stage of gingivitis even when only considering supragingival plaque. IMPORTANCE: Periodontal disease is a major public health problem associated with oral bacteria. While earlier studies focused on a small number of periodontal pathogens, it is now accepted that the whole bacterial community may be important. However, previous high-throughput marker gene sequencing studies of supragingival plaque have largely focused on high-income populations with good oral hygiene without including a range of periodontal disease severities. Our study includes a large number of low-income participants with poor oral hygiene and a wide range of severities, and we were therefore able to quantitatively model bacterial abundances as functions of both gingivitis and periodontitis. A signal associated with periodontitis remains after controlling for gingivitis severity, which supports the concept that, even when only considering supragingival plaque, periodontitis is not simply an advanced stage of gingivitis. This suggests the future possibility of diagnosing periodontitis based on bacterial occurrences in supragingival plaque.

Olson AD, Walker AS, Suthar AB, Sabin C, Bucher HC, Jarrin I, Moreno S, Perez-Hoyos S, Porter K, Ford D, CASCADE Collaboration in EuroCoord. 2016. Limiting Cumulative HIV Viremia Copy-Years by Early Treatment Reduces Risk of AIDS and Death. J Acquir Immune Defic Syndr, 73 (1), pp. 100-108.

BACKGROUND: Viremia copy-years (VCY), a time-updated measure of cumulative HIV exposure, predicts AIDS/death, although its utility in deciding when to start combination antiretroviral therapy (cART) remains unclear. We aimed to assess the impact of initiating versus deferring cART on risk of AIDS/death by levels of VCY both independent of and within CD4 cell count strata ≥500 cells per cubic millimeter. METHODS: Using Concerted Action on Seroconversion to AIDS and Death in Europe (CASCADE) data, we created a series of nested "trials" corresponding to consecutive months for individuals ≥16 years at seroconversion after 1995 who were cART-naive and AIDS-free. Pooling across all trials, time to AIDS/death by CD4, and VCY strata was compared in those initiating vs. deferring cART using Cox models adjusted for: country, sex, risk group, seroconversion year, age, time since last HIV-RNA, and current CD4, VCY, HIV-RNA, and mean number of previous CD4/HIV-RNA measurements/year. RESULTS: Of 9353 individuals, 5312 (57%) initiated cART and 486 (5%) acquired AIDS/died. Pooling CD4 strata, risk of AIDS/death associated with initiating vs. deferring cART reduced as VCY increased. In patients with high CD4 cell counts, ≥500 cells per cubic millimeter, there was a trend for a greater reduction for those initiating vs. deferring with increasing VCY (P = 0.09), with the largest benefit in the VCY ≥100,000 copy-years/mL group [hazard ratio (95% CI) = 0.41 (0.19 to 0.87)]. CONCLUSIONS: For individuals with CD4 ≥500 cells per cubic millimeter, limiting the cumulative HIV burden to <100,000 copy-years/mL through cART may reduce the risk of AIDS/death.
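Viremia copy-years, the time-updated cumulative exposure measure used above, is the area under an individual's viral-load curve over follow-up. A minimal sketch using the trapezoidal rule on invented measurements:

```python
# Invented follow-up data: (years since seroconversion, HIV-RNA copies/mL).
measurements = [(0.0, 80000), (0.5, 60000), (1.0, 40000), (2.0, 1000)]

def viremia_copy_years(measurements):
    """Cumulative viral-load exposure in copy-years/mL: the area under
    the viral-load curve, approximated here by the trapezoidal rule."""
    total = 0.0
    for (t0, v0), (t1, v1) in zip(measurements, measurements[1:]):
        total += (t1 - t0) * (v0 + v1) / 2
    return total

vcy = viremia_copy_years(measurements)
```

In the analysis this quantity is updated at each visit, so a person's VCY stratum can change over the nested "trials".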

Fawcett NJ, Jones N, Quan TP, Mistry V, Crook D, Peto T, Walker AS. 2016. Antibiotic use and clinical outcomes in the acute setting under management by an infectious diseases acute physician versus other clinical teams: a cohort study. BMJ Open, 6 (8), pp. e010969. | Show Abstract | Read more

OBJECTIVES: To assess the magnitude of difference in antibiotic use between clinical teams in the acute setting and assess evidence for any adverse consequences to patient safety or healthcare delivery. DESIGN: Prospective cohort study (1 week) and analysis of linked electronic health records (3 years). SETTING: UK tertiary care centre. PARTICIPANTS: All patients admitted sequentially to the acute medical service under an infectious diseases acute physician (IDP) and other medical teams during 1 week in 2013 (n=297), and 3 years 2012-2014 (n=47 585). PRIMARY OUTCOME MEASURE: Antibiotic use in days of therapy (DOT): raw group metrics and regression analysis adjusted for case mix. SECONDARY OUTCOME MEASURES: 30-day all-cause mortality, treatment failure and length of stay. RESULTS: Antibiotic use was 173 vs 282 DOT/100 admissions in the IDP versus non-IDP group. Using case mix-adjusted zero-inflated Poisson regression, IDP patients were significantly less likely to receive an antibiotic (adjusted OR=0.25 (95% CI 0.07 to 0.84), p=0.03) and received shorter courses (adjusted rate ratio (RR)=0.71 (95% CI 0.54 to 0.93), p=0.01). Clinically stable IDP patients of uncertain diagnosis were more likely to have antibiotics held (87% vs 55%; p=0.02). There was no significant difference in treatment failure or mortality (adjusted p>0.5; also in the 3-year data set), but IDP patients were more likely to be admitted overnight (adjusted OR=3.53 (95% CI 1.24 to 10.03), p=0.03) and have longer length of stay (adjusted RR=1.19 (95% CI 1.05 to 1.36), p=0.007). CONCLUSIONS: The IDP-led group used 30% less antibiotic therapy with no adverse clinical outcome, suggesting antibiotic use can be reduced safely in the acute setting. This may be achieved in part by holding antibiotics and admitting the patient for observation rather than prescribing, which has implications for costs and hospital occupancy. More information is needed to indicate whether any such longer admission will increase or decrease risk of antibiotic-resistant infections.

De Silva D, Peters J, Cole K, Cole MJ, Cresswell F, Dean G, Dave J, Thomas DR, Foster K, Waldram A et al. 2016. Whole-genome sequencing to determine transmission of Neisseria gonorrhoeae: an observational study. Lancet Infect Dis, 16 (11), pp. 1295-1303. | Show Abstract | Read more

BACKGROUND: New approaches are urgently required to address increasing rates of gonorrhoea and the emergence and global spread of antibiotic-resistant Neisseria gonorrhoeae. We used whole-genome sequencing to study transmission and track resistance in N gonorrhoeae isolates. METHODS: We did whole-genome sequencing of isolates obtained from samples collected from patients attending sexual health services in Brighton, UK, between Jan 1, 2011, and March 9, 2015. We also included isolates from other UK locations, historical isolates from Brighton, and previous data from a US study. Samples from symptomatic patients and asymptomatic sexual health screening underwent nucleic acid amplification testing; positive samples and all samples from symptomatic patients were cultured for N gonorrhoeae, and resulting isolates were whole-genome sequenced. Cefixime susceptibility testing was done in selected isolates by agar incorporation, and we used sequence data to determine multi-antigen sequence types and penA genotypes. We derived a transmission nomogram to determine the plausibility of direct or indirect transmission between any two cases depending on the time between samples: estimated mutation rates, plus diversity noted within patients across anatomical sites and probable transmission pairs, were used to fit a coalescent model to determine the number of single nucleotide polymorphisms expected. FINDINGS: 1407 (98%) of 1437 Brighton isolates between Jan 1, 2011, and March 9, 2015 were successfully sequenced. We identified 1061 infections from 907 patients. 281 (26%) of these infections were indistinguishable (ie, differed by zero single nucleotide polymorphisms) from one or more previous cases, and 786 (74%) had evidence of a sampled direct or indirect Brighton source. We observed multiple related samples across geographical locations. Of 1273 infections in Brighton (including historical data), 225 (18%) were linked to another case elsewhere in the UK, and 115 (9%) to a case in the USA. Four lineages initially identified in Brighton could be linked to 70 USA sequences, including 61 from a lineage carrying the mosaic penA XXXIV allele, which is associated with reduced cefixime susceptibility. INTERPRETATION: We present a whole-genome-sequencing-based tool for genomic contact tracing of N gonorrhoeae and demonstrate local, national, and international transmission. Whole-genome sequencing can be applied across geographical boundaries to investigate gonorrhoea transmission and to track antimicrobial resistance. FUNDING: Oxford National Institute for Health Research Health Protection Research Unit and Biomedical Research Centre.

Bienczak A, Cook A, Wiesner L, Olagunju A, Mulenga V, Kityo C, Kekitiinwa A, Owen A, Walker AS, Gibb DM et al. 2016. The impact of genetic polymorphisms on the pharmacokinetics of efavirenz in African children. Br J Clin Pharmacol, 82 (1), pp. 185-198. | Show Abstract | Read more

AIMS: Using a model-based approach, we characterized the steady-state pharmacokinetics of efavirenz in African children, quantifying demographic and genotypic effects on the drug's disposition, and conducted simulations to predict optimized doses of efavirenz in this population. METHODS: We modelled the steady-state population pharmacokinetics of efavirenz in Ugandan and Zambian children using nonlinear mixed-effects modelling. Individual mid-dose efavirenz concentrations were derived and simulations explored genotype-based dose optimization strategies. RESULTS: A two-compartment model with absorption through transit compartments well described 2086 concentration-time points in 169 children. The combined effect of single nucleotide polymorphisms (SNPs) 516G>T and 983T>C explained 44.5% and 14.7% of the variability in efavirenz clearance and bioavailability, respectively. The detected frequencies of composite CYP2B6 genotype were 0.33 for 516GG|983TT, 0.35 for 516GT|983TT, 0.06 for 516GG|983TC, 0.18 for 516TT|983TT, 0.07 for 516GT|983TC and 0.01 for 516GG|983CC. The corresponding estimated clearance rates were 6.94, 4.90, 3.93, 1.92, 1.36, and 0.74 l h(-1) for a 15.4 kg child, and median (95% CI) observed mid-dose concentrations were 1.55 (0.51-2.94), 2.20 (0.97-4.40), 2.03 (1.19-4.53), 7.55 (2.40-14.74), 7.79 (3.66-24.59) and 18.22 (11.84-22.76) mg l(-1), respectively. Simulations showed that wild-type individuals had exposures at the bottom of the therapeutic range, while slower metabolizers were overexposed. CONCLUSIONS: Dosage guidelines for African children should take into consideration the combined effect of SNPs CYP2B6 516G>T and 983T>C.

Musiime V, Kasirye P, Naidoo-James B, Nahirya-Ntege P, Mhute T, Cook A, Mugarura L, Munjoma M, Thoofer NK, Ndashimye E et al. 2016. Once vs twice-daily abacavir and lamivudine in African children. AIDS, 30 (11), pp. 1761-1770. | Show Abstract | Read more

BACKGROUND: Antiretroviral therapy (ART) adherence is critical for successful HIV treatment outcomes. Once-daily dosing could improve adherence. Plasma concentrations of once-daily vs twice-daily abacavir + lamivudine are bioequivalent in children, but no randomized trial has compared virological outcomes. METHODS: Children taking abacavir + lamivudine-containing first-line regimens twice daily for more than 36 weeks in the ARROW trial (NCT02028676, ISRCTN24791884) were randomized to continue twice-daily vs move to once-daily abacavir + lamivudine (open-label). Co-primary outcomes were viral load suppression at week 48 (12% noninferiority margin, measured retrospectively) and lamivudine or abacavir-related grade 3/4 adverse events. RESULTS: Six hundred and sixty-nine children (median 5 years, range 1-16) were randomized to twice daily (n = 333) vs once daily (n = 336) after median 1.8 years on twice-daily abacavir + lamivudine-containing first-line ART. Children were followed for median 114 weeks. At week 48, 242/331 (73%) twice daily vs 236/330 (72%) once daily had viral load less than 80 copies/ml [difference -1.6% (95% confidence interval -8.4,+5.2%) P = 0.65]; 79% twice daily vs 78% once daily had viral load less than 400 copies/ml (P = 0.76) (week 96 results similar). One grade 3/4 adverse event was judged uncertainly related to abacavir + lamivudine (hepatitis; once daily). At week 48, 9% twice daily vs 10% once daily reported missing one or more ART pills in the last 4 weeks (P = 0.74) and 8 vs 8% at week 96 (P = 0.90). Carers strongly preferred once-daily dosing. There was no difference between randomized groups in postbaseline drug-resistance mutations or drug-susceptibility; WHO 3/4 events; ART-modifying, grade 3/4 or serious adverse events; CD4% or weight-for-age/height-for-age (all P > 0.15). CONCLUSION: Once-daily abacavir + lamivudine was noninferior to twice daily in viral load suppression, with similar resistance, adherence, clinical, immunological and safety outcomes. Abacavir + lamivudine provides the first once-daily nucleoside backbone across childhood that can be used to simplify ART.

Prendergast AJ, Szubert AJ, Berejena C, Pimundu G, Pala P, Shonhai A, Musiime V, Bwakura-Dangarembizi M, Poulsom H, Hunter P et al. 2016. Baseline Inflammatory Biomarkers Identify Subgroups of HIV-Infected African Children With Differing Responses to Antiretroviral Therapy. J Infect Dis, 214 (2), pp. 226-236. | Show Abstract | Read more

BACKGROUND: Identifying determinants of morbidity and mortality may help target future interventions for human immunodeficiency virus (HIV)-infected children. METHODS: CD4(+) T-cell count, HIV viral load, and levels of biomarkers (C-reactive protein, tumor necrosis factor α [TNF-α], interleukin 6 [IL-6], and soluble CD14) and interleukin 7 were measured at antiretroviral therapy (ART) initiation in the ARROW trial (case-cohort design). Cases were individuals who died, had new or recurrent World Health Organization clinical stage 4 events, or had poor immunological response to ART. RESULTS: There were 115 cases (54 died, 45 had World Health Organization clinical stage 4 events, and 49 had poor immunological response) and 485 controls. Before ART initiation, the median ages of cases and controls were 8.2 years (interquartile range [IQR], 4.4-11.4 years) and 5.8 years (IQR, 2.3-9.3 years), respectively, and the median percentages of lymphocytes expressing CD4 were 4% (IQR, 1%-9%) and 13% (IQR, 8%-18%), respectively. In multivariable logistic regression, cases had lower age-associated CD4(+) T-cell count ratio (calculated as the ratio of the subject's CD4(+) T-cell count to the count expected in healthy individuals of the same age; P < .0001) and higher IL-6 level (P = .002) than controls. Clustering biomarkers and age-associated CD4(+) and CD8(+) T-cell count ratios identified 4 groups of children. Group 1 had the highest frequency of cases (41% cases; 16% died) and profound immunosuppression; group 2 had similar mortality (23% cases; 15% died), but children were younger, with less profound immunosuppression and high levels of inflammatory biomarkers and malnutrition; group 3 comprised young children with moderate immunosuppression, high TNF-α levels, and high age-associated CD8(+) T-cell count ratios but lower frequencies of events (12% cases; 7% died); and group 4 comprised older children with low inflammatory biomarker levels, lower HIV viral loads, and good clinical outcomes (11% cases; 5% died). CONCLUSIONS: While immunosuppression is the major determinant of poor outcomes during ART, baseline inflammation is an additional important factor, identifying a subgroup of young children with similar mortality. Anti-inflammatory interventions may help improve outcomes.

Quan TP, Fawcett NJ, Wrightson JM, Finney J, Wyllie D, Jeffery K, Jones N, Shine B, Clarke L, Crook D et al. 2016. Increasing burden of community-acquired pneumonia leading to hospitalisation, 1998-2014. Thorax, 71 (6), pp. 535-542. | Show Abstract | Read more

BACKGROUND: Community-acquired pneumonia (CAP) is a major cause of mortality and morbidity in many countries but few recent large-scale studies have examined trends in its incidence. METHODS: Incidence of CAP leading to hospitalisation in one UK region (Oxfordshire) was calculated over calendar time using routinely collected diagnostic codes, and modelled using piecewise-linear Poisson regression. Further models considered other related diagnoses, typical administrative outcomes, and blood and microbiology test results at admission to determine whether CAP trends could be explained by changes in case-mix, coding practices or admission procedures. RESULTS: CAP increased by 4.2%/year (95% CI 3.6 to 4.8) from 1998 to 2008, and subsequently much faster at 8.8%/year (95% CI 7.8 to 9.7) from 2009 to 2014. Pneumonia-related conditions also increased significantly over this period. Length of stay and 30-day mortality decreased slightly in later years, but the proportions with abnormal neutrophils, urea and C reactive protein (CRP) did not change (p>0.2). The proportion with severely abnormal CRP (>100 mg/L) decreased slightly in later years. Trends were similar in all age groups. Streptococcus pneumoniae was the most common causative organism found; however other organisms, particularly Enterobacteriaceae, increased in incidence over the study period (p<0.001). CONCLUSIONS: Hospitalisations for CAP have been increasing rapidly in Oxfordshire, particularly since 2008. There is little evidence that this is due only to changes in pneumonia coding, an ageing population or patients with substantially less severe disease being admitted more frequently. Healthcare planning to address potential further increases in admissions and consequent antibiotic prescribing should be a priority.

Sheppard AE, Stoesser N, Wilson DJ, Sebra R, Kasarskis A, Anson LW, Giess A, Pankhurst LJ, Vaughan A, Grim CJ et al. 2016. Nested Russian Doll-Like Genetic Mobility Drives Rapid Dissemination of the Carbapenem Resistance Gene blaKPC. Antimicrob Agents Chemother, 60 (6), pp. 3767-3778. | Show Abstract | Read more

The recent widespread emergence of carbapenem resistance in Enterobacteriaceae is a major public health concern, as carbapenems are a therapy of last resort against this family of common bacterial pathogens. Resistance genes can mobilize via various mechanisms, including conjugation and transposition; however, the importance of this mobility in short-term evolution, such as within nosocomial outbreaks, is unknown. Using a combination of short- and long-read whole-genome sequencing of 281 blaKPC-positive Enterobacteriaceae isolates from a single hospital over 5 years, we demonstrate rapid dissemination of this carbapenem resistance gene to multiple species, strains, and plasmids. Mobility of blaKPC occurs at multiple nested genetic levels, with transmission of blaKPC strains between individuals, frequent transfer of blaKPC plasmids between strains/species, and frequent transposition of blaKPC transposon Tn4401 between plasmids. We also identify a common insertion site for Tn4401 within various Tn2-like elements, suggesting that homologous recombination between Tn2-like elements has enhanced the spread of Tn4401 between different plasmid vectors. Furthermore, while short-read sequencing has known limitations for plasmid assembly, various studies have attempted to overcome this by the use of reference-based methods. We also demonstrate that, as a consequence of the genetic mobility observed in this study, plasmid structures can be extremely dynamic, and therefore these reference-based methods, as well as traditional partial typing methods, can produce very misleading conclusions. Overall, our findings demonstrate that nonclonal resistance gene dissemination can be extremely rapid, presenting significant challenges for public health surveillance and achieving effective control of antibiotic resistance.

Seale AC, Koech AC, Sheppard AE, Barsosio HC, Langat J, Anyango E, Mwakio S, Mwarumba S, Morpeth SC, Anampiu K et al. 2016. Maternal colonisation with Streptococcus agalactiae, and associated stillbirth and neonatal disease in coastal Kenya. Nat Microbiol, 1 (7), pp. 16067. | Show Abstract | Read more

Streptococcus agalactiae (Group B Streptococcus, GBS) causes neonatal disease and stillbirth, but its burden in sub-Saharan Africa is uncertain. We assessed maternal recto-vaginal GBS colonisation (7967 women), stillbirth and neonatal disease. Whole genome sequencing was used to determine serotypes, sequence types (ST), and phylogeny. We found low maternal GBS colonisation prevalence (934/7967, 12%), but comparatively high incidence of GBS-associated stillbirth and early onset neonatal disease (EOD) in hospital (0.91(0.25-2.3)/1000 births; 0.76(0.25-1.77)/1000 live-births respectively). However, using a population denominator, EOD incidence was considerably reduced (0.13(0.07-0.21)/1000 live-births). Treated cases of EOD had very high case fatality (17/36, 47%), especially within 24 hours of birth, making under-ascertainment of community-born cases highly likely, both here and in similar facility-based studies. Maternal GBS colonisation was less common in women with low socio-economic status, HIV infection and undernutrition, but when GBS-colonised, they were more likely colonised by the most virulent clone, CC17. CC17 accounted for 267/915(29%) of maternal colonising (265/267(99%) serotype III, 2/267(0.7%) serotype IV), and 51/73(70%) of neonatal disease cases (all serotype III). Trivalent (Ia/II/III) and pentavalent (Ia/Ib/II/III/V) vaccines would cover 71/73(97%) and 72/73(99%) of disease-causing serotypes respectively. Serotype IV should be considered for inclusion, with evidence of capsular switching in CC17 strains.

Bradley P, Gordon NC, Walker TM, Dunn L, Heys S, Huang B, Earle S, Pankhurst LJ, Anson L, de Cesare M et al. 2016. Corrigendum: Rapid antibiotic-resistance predictions from genome sequence data for Staphylococcus aureus and Mycobacterium tuberculosis. Nat Commun, 7 (1), pp. 11465. | Read more

Earle SG, Wu C-H, Charlesworth J, Stoesser N, Gordon NC, Walker TM, Spencer CCA, Iqbal Z, Clifton DA, Hopkins KL et al. 2016. Identifying lineage effects when controlling for population structure improves power in bacterial association studies. Nat Microbiol, 1 (5), pp. 16041. | Show Abstract | Read more

Bacteria pose unique challenges for genome-wide association studies because of strong structuring into distinct strains and substantial linkage disequilibrium across the genome(1,2). Although methods developed for human studies can correct for strain structure(3,4), this risks considerable loss of power because genetic differences between strains often contribute substantial phenotypic variability(5). Here, we propose a new method that captures lineage-level associations even when locus-specific associations cannot be fine-mapped. We demonstrate its ability to detect genes and genetic variants underlying resistance to 17 antimicrobials in 3,144 isolates from four taxonomically diverse clonal and recombining bacteria: Mycobacterium tuberculosis, Staphylococcus aureus, Escherichia coli and Klebsiella pneumoniae. Strong selection, recombination and penetrance confer high power to recover known antimicrobial resistance mechanisms and reveal a candidate association between the outer membrane porin nmpC and cefazolin resistance in E. coli. Hence, our method pinpoints locus-specific effects where possible and boosts power by detecting lineage-level differences when fine-mapping is intractable.

Mead S, Burnell M, Lowe J, Thompson A, Lukic A, Porter M-C, Carswell C, Kaski D, Kenny J, Mok TH et al. 2016. Clinical Trial Simulations Based on Genetic Stratification and the Natural History of a Functional Outcome Measure in Creutzfeldt-Jakob Disease. JAMA Neurol, 73 (4), pp. 447-455. | Show Abstract | Read more

IMPORTANCE: A major challenge for drug development in neurodegenerative diseases is that adequately powered efficacy studies with meaningful end points typically require several hundred participants and long durations. Prion diseases represent the archetype of brain diseases caused by protein misfolding, the most common subtype being sporadic Creutzfeldt-Jakob disease (sCJD), a rapidly progressive dementia. There is no well-established trial method in prion disease. OBJECTIVE: To establish a more powerful and meaningful clinical trial method in sCJD. DESIGN, SETTING, AND PARTICIPANTS: A stratified medicine and simulation approach based on a prospective interval-cohort study conducted from October 2008 to June 2014. This study involved 598 participants with probable or definite sCJD followed up over 470 patient-years at a specialist national referral service in the United Kingdom with domiciliary, care home, and hospital patient visits. We fitted linear mixed models to the outcome measurements, and simulated clinical trials involving 10 to 120 patients (no dropouts) with early to moderately advanced prion disease using model parameters to compare the power of various designs. MAIN OUTCOMES AND MEASURES: A total of 2681 assessments were done using a functionally orientated composite end point (Medical Research Council Scale) and associated with clinical investigations (brain magnetic resonance imaging, electroencephalography, and cerebrospinal fluid analysis) and molecular data (prion protein [PrP] gene sequencing, PrPSc type). RESULTS: Of the 598 participants, 273 were men. The PrP gene sequence was significantly associated with decline relative to any other demographic or investigation factors. Patients with sCJD and polymorphic codon 129 genotypes MM, VV, and MV lost 10% of their function in 5.3 (95% CI, 4.2-6.9), 13.2 (95% CI, 10.9-16.6), and 27.8 (95% CI, 21.9-37.8) days, respectively (P < .001). Simulations indicate that an adequately powered (80%; 2-sided α = .05) open-label randomized trial using 50% reduction in Medical Research Council Scale decline as the primary outcome could be conducted with only 120 participants assessed every 10 days and only 90 participants assessed daily, providing considerably more power than using survival as the primary outcome. Restricting to VV or MV codon 129 genotypes increased power even further. Alternatively, single-arm intervention studies (half the total sample size) could provide similar power in comparison to the natural history cohort. CONCLUSIONS AND RELEVANCE: Functional end points in neurodegeneration need not require long and very large clinical studies to be adequately powered for efficacy. Patients with sCJD may be an efficient and cost-effective group for testing disease-modifying therapeutics. Stratified medicine and natural history cohort approaches may transform the feasibility of clinical trials in orphan diseases.
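The general simulation strategy described here (fit a decline model to cohort data, then simulate trials of various sizes from the fitted parameters to estimate power) can be illustrated with a deliberately simplified sketch. The decline rates, variances, assessment schedule, and trial sizes below are arbitrary placeholders, not the fitted CJD model parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_trial(n_per_arm=60, n_sims=500, slope=-1.0, effect=0.5,
                   sd_slope=0.4, sd_noise=2.0):
    """Crude power estimate for a two-arm trial comparing rates of decline.

    Each patient's functional score declines linearly with a patient-specific
    slope; treatment halves the mean decline (effect=0.5). All values are
    illustrative, not those of the CJD study."""
    times = np.arange(0, 100, 10)  # assessments every 10 days
    hits = 0
    for _ in range(n_sims):
        # Patient-specific true slopes in control and treatment arms
        slopes_c = slope + sd_slope * rng.standard_normal(n_per_arm)
        slopes_t = slope * effect + sd_slope * rng.standard_normal(n_per_arm)
        # Per-patient least-squares slope from noisy repeated assessments
        est_c = [np.polyfit(times, s * times + sd_noise * rng.standard_normal(times.size), 1)[0]
                 for s in slopes_c]
        est_t = [np.polyfit(times, s * times + sd_noise * rng.standard_normal(times.size), 1)[0]
                 for s in slopes_t]
        # Two-sample z-test on the estimated slopes
        diff = np.mean(est_t) - np.mean(est_c)
        se = np.sqrt(np.var(est_c, ddof=1) / n_per_arm + np.var(est_t, ddof=1) / n_per_arm)
        if abs(diff / se) > 1.96:
            hits += 1
    return hits / n_sims

power = simulate_trial()
print(power)
```

Varying `n_per_arm` and the assessment interval in such a simulation is what lets a study report statements like "120 participants assessed every 10 days" versus "90 participants assessed daily" for a given target power.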

Kekitiinwa A, Musiime V, Thomason MJ, Mirembe G, Lallemant M, Nakalanzi S, Baptiste D, Walker AS, Gibb DM, Judd A. 2016. Acceptability of lopinavir/r pellets (minitabs), tablets and syrups in HIV-infected children. Antivir Ther, 21 (7), pp. 579-585. | Show Abstract | Read more

BACKGROUND: Lopinavir/ritonavir 'pellets' were recently tentatively approved for licensing. We describe their acceptability in infants and children over 48 weeks of follow-up. METHODS: CHAPAS-2 was a randomized, two-period crossover trial comparing syrup and pellets in HIV-infected infants (n=19, group A, aged 3-<12 months) and children (n=26, group B, 1-<4 years) and tablets and pellets in older children (n=32, group C, 4-<13 years) from two clinics ('JCRC', 'PIDC') in Uganda. At week 8, all groups chose which formulation to continue. Acceptability data were collected at weeks 0, 4, 8, 12 and 48. RESULTS: For groups A and B overall, the proportion preferring pellets increased between week 0 and week 12 and decreased at week 48 (group A 37%, 72%, 44%; group B 12%, 64% and 36%, respectively), although there were marked differences between clinics. For group C, pellets were progressively less preferred than tablets over time: 41%, 19% and 13% at weeks 0, 12 and 48, respectively. During follow-up unpleasant taste was similarly reported among young children taking pellets and syrups (37%/43% group A; 29%/26% group B), whereas among older children, pellets tasted worse than tablets (40%/2%). No participants reported problems with storage/transportation for pellets (0%/0%) unlike syrups (23%/13%). CONCLUSIONS: For children <4 years, pellets were more acceptable at week 12 but not week 48. Clinic differences could reflect bias among health-care workers for different formulations. Pellets taste similar to syrup, are easier to store and transport than syrup and represent an alternative formulation for young children unable to swallow tablets; improvements in taste and support for health-care workers may help sustain acceptability.

Kambugu A, Thompson J, Hakim J, Tumukunde D, van Oosterhout JJ, Mwebaze R, Hoppe A, Abach J, Kwobah C, Arenas-Pinto A et al. 2016. Neurocognitive Function at the First-Line Failure and on the Second-Line Antiretroviral Therapy in Africa: Analyses From the EARNEST Trial. J Acquir Immune Defic Syndr, 71 (5), pp. 506-513. | Show Abstract | Read more

OBJECTIVE: To assess neurocognitive function at first-line antiretroviral therapy failure and change on second-line therapy. DESIGN: Randomized controlled trial conducted in 5 sub-Saharan African countries. METHODS: Patients failing the first-line therapy according to WHO criteria after >12 months on non-nucleoside reverse transcriptase inhibitors-based regimens were randomized to the second-line therapy (open-label) with lopinavir/ritonavir (400 mg/100 mg twice daily) plus either 2-3 clinician-selected nucleoside reverse transcriptase inhibitors, raltegravir, or as monotherapy after 12-week induction with raltegravir. Neurocognitive function was tested at baseline, weeks 48 and 96 using color trails tests 1 and 2, and the Grooved Pegboard test. Test results were converted to an average of the 3 individual test z-scores. RESULTS: A total of 1036 patients (90% of those >18 years enrolled at 13 evaluable sites) had valid baseline tests (58% women, median: 38 years, viral load: 65,000 copies per milliliter, CD4 count: 73 cells per cubic millimeter). Mean (SD) baseline z-score was -2.96 (1.74); lower baseline z-scores were independently associated with older age, lower body weight, higher viral load, lower hemoglobin, less education, fewer weekly working hours, previous central nervous system disease, and taking fluconazole (P < 0.05 in multivariable model). Z-score was increased by mean (SE) of +1.23 (0.04) after 96 weeks on the second-line therapy (P < 0.001; n = 915 evaluable), with no evidence of difference between the treatment arms (P = 0.35). CONCLUSIONS: Patients in sub-Saharan Africa failing the first-line therapy had low neurocognitive function test scores, but performance improved on the second-line therapy. Regimens with more central nervous system-penetrating drugs did not enhance neurocognitive recovery, indicating this need not be a primary consideration in choosing a second-line regimen.

Crook AM, Turkova A, Musiime V, Bwakura-Dangarembizi M, Bakeera-Kitaka S, Nahirya-Ntege P, Thomason M, Mugyenyi P, Musoke P, Kekitiinwa A et al. 2016. Tuberculosis incidence is high in HIV-infected African children but is reduced by co-trimoxazole and time on antiretroviral therapy. BMC Med, 14 (1), pp. 50. | Show Abstract | Read more

BACKGROUND: There are few data on tuberculosis (TB) incidence in HIV-infected children on antiretroviral therapy (ART). Observational studies suggest co-trimoxazole prophylaxis may prevent TB, but there are no randomized data supporting this. The ARROW trial, which enrolled HIV-infected children initiating ART in Uganda and Zimbabwe and included randomized cessation of co-trimoxazole prophylaxis, provided an opportunity to estimate the incidence of TB over time, to explore potential risk factors for TB, and to evaluate the effect of stopping co-trimoxazole prophylaxis. METHODS: Of 1,206 children enrolled in ARROW, there were 969 children with no previous TB history. After 96 weeks on ART, children older than 3 years were randomized to stop or continue co-trimoxazole prophylaxis; 622 were eligible and included in the co-trimoxazole analysis. Endpoints, including TB, were adjudicated blind to randomization by an independent endpoint review committee (ERC). Crude incidence rates of TB were estimated and potential risk factors, including age, sex, center, CD4, weight, height, and initial ART strategy, were explored in multivariable Cox proportional hazards models. RESULTS: After a median of 4 years follow-up (3,632 child-years), 69 children had an ERC-confirmed TB diagnosis. The overall TB incidence was 1.9/100 child-years (95% CI, 1.5-2.4), and was highest in the first 12 weeks following ART initiation (8.8/100 child-years (5.2-13.4) versus 1.2/100 child-years (0.8-1.6) after 52 weeks). A higher TB risk was independently associated with younger age (<3 years), female sex, lower pre-ART weight-for-age Z-score, and current CD4 percent; fewer TB diagnoses were observed in children on maintenance triple nucleoside reverse transcriptase inhibitor (NRTI) ART compared to standard non-NRTI + 2NRTI. Over the median 2 years of follow-up, there were 20 ERC-adjudicated TB cases among 622 children in the co-trimoxazole analysis: 5 in the continue arm and 15 in the stop arm (hazard ratio (stop: continue) = 3.0 (95% CI, 1.1-8.3), P = 0.028). TB risk was also independently associated with lower current CD4 percent (P <0.001). CONCLUSIONS: TB incidence varies over time following ART initiation, and is particularly high during the first 3 months post-ART, reinforcing the importance of TB screening prior to starting ART and use of isoniazid preventive therapy once active TB is excluded. HIV-infected children continuing co-trimoxazole prophylaxis after 96 weeks of ART were diagnosed with TB less frequently, highlighting a potentially important role of co-trimoxazole in preventing TB.

Stoesser N, Sheppard AE, Pankhurst L, De Maio N, Moore CE, Sebra R, Turner P, Anson LW, Kasarskis A, Batty EM et al. 2016. Evolutionary History of the Global Emergence of the Escherichia coli Epidemic Clone ST131. mBio, 7 (2), pp. e02162.

UNLABELLED: Escherichia coli sequence type 131 (ST131) has emerged globally as the most predominant extraintestinal pathogenic lineage within this clinically important species, and its association with fluoroquinolone and extended-spectrum cephalosporin resistance impacts significantly on treatment. The evolutionary histories of this lineage, and of important antimicrobial resistance elements within it, remain unclearly defined. This study of the largest worldwide collection (n = 215) of sequenced ST131 E. coli isolates to date demonstrates that the clonal expansion of two previously recognized antimicrobial-resistant clades, C1/H30R and C2/H30Rx, started around 25 years ago, consistent with the widespread introduction of fluoroquinolones and extended-spectrum cephalosporins in clinical medicine. These two clades appear to have emerged in the United States, with the expansion of the C2/H30Rx clade driven by the acquisition of a blaCTX-M-15-containing IncFII-like plasmid that has subsequently undergone extensive rearrangement. Several other evolutionary processes influencing the trajectory of this drug-resistant lineage are described, including sporadic acquisitions of CTX-M resistance plasmids and chromosomal integration of blaCTX-M within subclusters followed by vertical evolution. These processes are also occurring for another family of CTX-M gene variants more recently observed among ST131, the blaCTX-M-14/14-like group. The complexity of the evolutionary history of ST131 has important implications for antimicrobial resistance surveillance, epidemiological analysis, and control of emerging clinical lineages of E. coli. These data also highlight the global imperative to reduce specific antibiotic selection pressures and demonstrate the important and varied roles played by plasmids and other mobile genetic elements in the perpetuation of antimicrobial resistance within lineages.
IMPORTANCE: Escherichia coli, perennially a major bacterial pathogen, is becoming increasingly difficult to manage due to emerging resistance to all preferred antimicrobials. Resistance is concentrated within specific E. coli lineages, such as sequence type 131 (ST131). Clarification of the genetic basis for clonally associated resistance is key to devising intervention strategies. We used high-resolution genomic analysis of a large global collection of ST131 isolates to define the evolutionary history of extended-spectrum beta-lactamase production in ST131. We documented diverse contributory genetic processes, including stable chromosomal integrations of resistance genes, persistence and evolution of mobile resistance elements within sublineages, and sporadic acquisition of different resistance elements. Both global distribution and regional segregation were evident. The diversity of resistance element acquisition and propagation within ST131 indicates a need for control and surveillance strategies that target both bacterial strains and mobile genetic elements.

Phillips PPJ, Morris TP, Walker AS. 2016. DOOR/RADAR: A Gateway Into the Unknown? Clin Infect Dis, 62 (6), pp. 814-815.

Sheppard AE, Vaughan A, Jones N, Turner P, Turner C, Efstratiou A, Patel D, Modernising Medical Microbiology Informatics Group, Walker AS, Berkley JA et al. 2016. Capsular Typing Method for Streptococcus agalactiae Using Whole-Genome Sequence Data. J Clin Microbiol, 54 (5), pp. 1388-1390.

Group B streptococcus (GBS) capsular serotypes are major determinants of virulence and affect potential vaccine coverage. Here we report a whole-genome-sequencing-based method for GBS serotype assignment. This method shows strong agreement (kappa of 0.92) with conventional methods and increased serotype assignment (100%) to all 10 capsular types.
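The agreement statistic quoted (kappa of 0.92) is Cohen's chance-corrected kappa between the sequencing-based and conventional serotype calls. A minimal sketch of the calculation, using hypothetical paired calls rather than the study's data:

```python
from collections import Counter

def cohens_kappa(pairs):
    """Cohen's kappa for two methods' categorical calls on the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(pairs)
    po = sum(a == b for a, b in pairs) / n                  # observed agreement
    a_freq = Counter(a for a, _ in pairs)
    b_freq = Counter(b for _, b in pairs)
    pe = sum(a_freq[c] * b_freq[c] for c in a_freq) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical (WGS-based, conventional) serotype calls per isolate
pairs = [("Ia", "Ia"), ("Ia", "Ia"), ("III", "III"), ("III", "V"), ("V", "V")]
print(round(cohens_kappa(pairs), 2))
```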

Sheppard AE, Stoesser N, Sebra R, Kasarskis A, Deikus G, Anson L, Walker AS, Peto TE, Crook DW, Mathers AJ. 2016. Complete Genome Sequence of KPC-Producing Klebsiella pneumoniae Strain CAV1193. Genome Announc, 4 (1).

Carbapenem resistance in Klebsiella pneumoniae, frequently conferred by the blaKPC gene, is a major public health threat. We sequenced a blaKPC-containing strain of K. pneumoniae belonging to the emergent lineage ST941, in order to better understand the evolution of blaKPC within this species.

Didelot X, Walker AS, Peto TE, Crook DW, Wilson DJ. 2016. Within-host evolution of bacterial pathogens. Nat Rev Microbiol, 14 (3), pp. 150-162.

Whole-genome sequencing has opened the way for investigating the dynamics and genomic evolution of bacterial pathogens during the colonization and infection of humans. The application of this technology to the longitudinal study of adaptation in an infected host--in particular, the evolution of drug resistance and host adaptation in patients who are chronically infected with opportunistic pathogens--has revealed remarkable patterns of convergent evolution, suggestive of an inherent repeatability of evolution. In this Review, we describe how these studies have advanced our understanding of the mechanisms and principles of within-host genome evolution, and we consider the consequences of findings such as a potent adaptive potential for pathogenicity. Finally, we discuss the possibility that genomics may be used in the future to predict the clinical progression of bacterial infections and to suggest the best option for treatment.

Mpoya A, Kiguli S, Olupot-Olupot P, Opoka RO, Engoru C, Mallewa M, Chimalizeni Y, Kennedy N, Kyeyune D, Wabwire B et al. 2015. Transfusion and Treatment of severe anaemia in African children (TRACT): a study protocol for a randomised controlled trial. Trials, 16 (1), pp. 593.

BACKGROUND: In sub-Saharan Africa, where infectious diseases and nutritional deficiencies are common, severe anaemia is a common cause of paediatric hospital admission, yet the evidence to support current treatment recommendations is limited. To avert overuse of blood products, the World Health Organisation advocates a conservative transfusion policy and recommends iron, folate and anti-helminthics at discharge. Outcomes are unsatisfactory with high rates of in-hospital mortality (9-10%), 6-month mortality and relapse (6%). A definitive trial to establish best transfusion and treatment strategies to prevent both early and delayed mortality and relapse is warranted. METHODS/DESIGN: TRACT is a multicentre randomised controlled trial of 3954 children aged 2 months to 12 years admitted to hospital with severe anaemia (haemoglobin < 6 g/dl). Children will be enrolled over 2 years in 4 centres in Uganda and Malawi and followed for 6 months. The trial will simultaneously evaluate (in a factorial trial with a 3 x 2 x 2 design) 3 ways to reduce short-term and longer-term mortality and morbidity following admission to hospital with severe anaemia in African children. The trial will compare: (i) R1: liberal transfusion (30 ml/kg whole blood) versus conservative transfusion (20 ml/kg) versus no transfusion (control). The control is only for children with uncomplicated severe anaemia (haemoglobin 4-6 g/dl); (ii) R2: post-discharge multi-vitamin multi-mineral supplementation (including folate and iron) versus routine care (folate and iron) for 3 months; (iii) R3: post-discharge cotrimoxazole prophylaxis for 3 months versus no prophylaxis. All randomisations are open. Enrolment to the trial started September 2014 and is currently ongoing. Primary outcome is cumulative mortality to 4 weeks for the transfusion strategy comparisons, and to 6 months for the nutritional support/antibiotic prophylaxis comparisons. 
Secondary outcomes include mortality, morbidity (haematological correction, nutritional and infectious), safety and cost-effectiveness. DISCUSSION: If confirmed by the trial, a cheap and widely available 'bundle' of effective interventions, directed at the immediate and downstream consequences of severe anaemia, could, if widely implemented, lead to substantial reductions in mortality among the many African children hospitalised with severe anaemia every year. TRIAL REGISTRATION: Current Controlled Trials ISRCTN84086586, Approved 11 February 2013.

Bradley P, Gordon NC, Walker TM, Dunn L, Heys S, Huang B, Earle S, Pankhurst LJ, Anson L, de Cesare M et al. 2015. Rapid antibiotic-resistance predictions from genome sequence data for Staphylococcus aureus and Mycobacterium tuberculosis. Nat Commun, 6 (1), pp. 10063.

The rise of antibiotic-resistant bacteria has led to an urgent need for rapid detection of drug resistance in clinical samples, and improvements in global surveillance. Here we show how de Bruijn graph representation of bacterial diversity can be used to identify species and resistance profiles of clinical isolates. We implement this method for Staphylococcus aureus and Mycobacterium tuberculosis in a software package ('Mykrobe predictor') that takes raw sequence data as input, and generates a clinician-friendly report within 3 minutes on a laptop. For S. aureus, the error rates of our method are comparable to gold-standard phenotypic methods, with sensitivity/specificity of 99.1%/99.6% across 12 antibiotics (using an independent validation set, n=470). For M. tuberculosis, our method predicts resistance with sensitivity/specificity of 82.6%/98.5% (independent validation set, n=1,609); sensitivity is lower here, probably because of limited understanding of the underlying genetic mechanisms. We give evidence that minor alleles improve detection of extremely drug-resistant strains, and demonstrate feasibility of the use of emerging single-molecule nanopore sequencing techniques for these purposes.
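The error rates quoted are per-drug sensitivity and specificity of genotypic predictions judged against phenotypic testing as the gold standard. A minimal sketch of that calculation on illustrative resistance calls (not the paper's validation data):

```python
def sens_spec(predicted, truth):
    """Sensitivity and specificity of binary resistance predictions
    against phenotypic (gold-standard) results; True = resistant."""
    tp = sum(p and t for p, t in zip(predicted, truth))
    tn = sum(not p and not t for p, t in zip(predicted, truth))
    fn = sum(not p and t for p, t in zip(predicted, truth))
    fp = sum(p and not t for p, t in zip(predicted, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative calls for one drug across eight isolates
pred  = [True, True, False, False, True, False, False, False]
pheno = [True, True, True,  False, True, False, False, False]
sens, spec = sens_spec(pred, pheno)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```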

Li HK, Scarborough M, Zambellas R, Cooper C, Rombach I, Walker AS, Lipsky BA, Briggs A, Seaton A, Atkins B et al. 2015. Oral versus intravenous antibiotic treatment for bone and joint infections (OVIVA): study protocol for a randomised controlled trial. Trials, 16 (1), pp. 583.

BACKGROUND: Bone and joint infection in adults arises most commonly as a complication of joint replacement surgery, fracture fixation and diabetic foot infection. The associated morbidity can be devastating to patients and costs the National Health Service an estimated £20,000 to £40,000 per patient. Current standard of care in most UK centres includes a prolonged course (4-6 weeks) of intravenous antibiotics supported, if available, by an outpatient parenteral antibiotic therapy service. Intravenous therapy carries with it substantial risks and inconvenience to patients, and the antibiotic-related costs are approximately ten times that of oral therapy. Despite this, there is no evidence to suggest that oral therapy results in inferior outcomes. We hypothesise that, by selecting oral agents with high bioavailability, good tissue penetration and activity against the known or likely pathogens, key outcomes in patients managed primarily with oral therapy are non-inferior to those in patients treated by intravenous therapy. METHODS: The OVIVA trial is a parallel group, randomised (1:1), un-blinded, non-inferiority trial conducted in thirty hospitals across the UK. Eligible participants are adults (>18 years) with a clinical syndrome consistent with a bone, joint or metalware-associated infection who have received ≤7 days of intravenous antibiotic therapy from the date of definitive surgery (or the start of planned curative therapy in patients treated without surgical intervention). Participants are randomised to receive either oral or intravenous antibiotics, selected by a specialist infection physician, for the first 6 weeks of therapy. The primary outcome measure is definite treatment failure within one year of randomisation, as assessed by a blinded endpoint committee, according to pre-defined microbiological, histological and clinical criteria. 
Enrolling 1,050 subjects will provide 90% power to demonstrate non-inferiority, defined as less than 7.5% absolute increase in treatment failure rate in patients randomised to oral therapy as compared to intravenous therapy (one-sided alpha of 0.05). DISCUSSION: If our results demonstrate non-inferiority of orally administered antibiotic therapy, this trial is likely to facilitate a dramatically improved patient experience and alleviate a substantial financial burden on healthcare services. TRIAL REGISTRATION: ISRCTN91566927 - 14/02/2013.

Mulenga V, Musiime V, Kekitiinwa A, Cook AD, Abongomera G, Kenny J, Chabala C, Mirembe G, Asiimwe A, Owen-Powell E et al. 2016. Abacavir, zidovudine, or stavudine as paediatric tablets for African HIV-infected children (CHAPAS-3): an open-label, parallel-group, randomised controlled trial. Lancet Infect Dis, 16 (2), pp. 169-179.

BACKGROUND: WHO 2013 guidelines recommend universal treatment for HIV-infected children younger than 5 years. No paediatric trials have compared nucleoside reverse-transcriptase inhibitors (NRTIs) in first-line antiretroviral therapy (ART) in Africa, where most HIV-infected children live. We aimed to compare stavudine, zidovudine, or abacavir as dual or triple fixed-dose-combination paediatric tablets with lamivudine and nevirapine or efavirenz. METHODS: In this open-label, parallel-group, randomised trial (CHAPAS-3), we enrolled children from one centre in Zambia and three in Uganda who were previously untreated (ART naive) or on stavudine for more than 2 years with viral load less than 50 copies per mL (ART experienced). Computer-generated randomisation tables were incorporated securely within the database. The primary endpoint was grade 2-4 clinical or grade 3/4 laboratory adverse events. Analysis was intention to treat. This trial is registered with the ISRCTN Registry, number 69078957. FINDINGS: Between Nov 8, 2010, and Dec 28, 2011, 480 children were randomised: 156 to stavudine, 159 to zidovudine, and 165 to abacavir. After two were excluded due to randomisation error, 156 children were analysed in the stavudine group, 158 in the zidovudine group, and 164 in the abacavir group, and followed for median 2·3 years (5% lost to follow-up). 365 (76%) were ART naive (median age 2·6 years vs 6·2 years in ART experienced). 917 grade 2-4 clinical or grade 3/4 laboratory adverse events (835 clinical [634 grade 2]; 40 laboratory) occurred in 104 (67%) children on stavudine, 103 (65%) on zidovudine, and 105 (64%) on abacavir (p=0·63; zidovudine vs stavudine: hazard ratio [HR] 0·99 [95% CI 0·75-1·29]; abacavir vs stavudine: HR 0·88 [0·67-1·15]).
At 48 weeks, 98 (85%), 81 (80%) and 95 (81%) ART-naive children in the stavudine, zidovudine, and abacavir groups, respectively, had viral load less than 400 copies per mL (p=0·58); most ART-experienced children maintained suppression (p=1·00). INTERPRETATION: All NRTIs had low toxicity and good clinical, immunological, and virological responses. Clinical and subclinical lipodystrophy was not noted in those younger than 5 years and anaemia was no more frequent with zidovudine than with the other drugs. Absence of hypersensitivity reactions, superior resistance profile and once-daily dosing favours abacavir for African children, supporting WHO 2013 guidelines. FUNDING: European Developing Countries Clinical Trials Partnership.

Walker TM, Kohl TA, Omar SV, Hedge J, Del Ojo Elias C, Bradley P, Iqbal Z, Feuerriegel S, Niehaus KE, Wilson DJ et al. 2015. Whole-genome sequencing for prediction of Mycobacterium tuberculosis drug susceptibility and resistance: a retrospective cohort study. Lancet Infect Dis, 15 (10), pp. 1193-1202.

BACKGROUND: Diagnosing drug resistance remains an obstacle to the elimination of tuberculosis. Phenotypic drug-susceptibility testing is slow and expensive, and commercial genotypic assays screen only common resistance-determining mutations. We used whole-genome sequencing to characterise common and rare mutations predicting drug resistance, or consistency with susceptibility, for all first-line and second-line drugs for tuberculosis. METHODS: Between Sept 1, 2010, and Dec 1, 2013, we sequenced a training set of 2099 Mycobacterium tuberculosis genomes. For 23 candidate genes identified from the drug-resistance scientific literature, we algorithmically characterised genetic mutations as not conferring resistance (benign), resistance determinants, or uncharacterised. We then assessed the ability of these characterisations to predict phenotypic drug-susceptibility testing for an independent validation set of 1552 genomes. We sought mutations under similar selection pressure to those characterised as resistance determinants outside candidate genes to account for residual phenotypic resistance. FINDINGS: We characterised 120 training-set mutations as resistance determining, and 772 as benign. With these mutations, we could predict 89·2% of the validation-set phenotypes with a mean 92·3% sensitivity (95% CI 90·7-93·7) and 98·4% specificity (98·1-98·7). 10·8% of validation-set phenotypes could not be predicted because uncharacterised mutations were present. In an in-silico comparison, characterised resistance determinants had higher sensitivity than the mutations from three line-probe assays (85·1% vs 81·6%). No additional resistance determinants were identified among mutations under selection pressure in non-candidate genes. INTERPRETATION: A broad catalogue of genetic mutations enables data from whole-genome sequencing to be used clinically to predict drug resistance, drug susceptibility, or to identify drug phenotypes that cannot yet be genetically predicted.
This approach could be integrated into routine diagnostic workflows, phasing out phenotypic drug-susceptibility testing while reporting drug resistance early. FUNDING: Wellcome Trust, National Institute of Health Research, Medical Research Council, and the European Union.

Ford D, Robins JM, Petersen ML, Gibb DM, Gilks CF, Mugyenyi P, Grosskurth H, Hakim J, Katabira E, Babiker AG et al. 2015. The Impact of Different CD4 Cell-Count Monitoring and Switching Strategies on Mortality in HIV-Infected African Adults on Antiretroviral Therapy: An Application of Dynamic Marginal Structural Models. Am J Epidemiol, 182 (7), pp. 633-643.

In Africa, antiretroviral therapy (ART) is delivered with limited laboratory monitoring, often none. In 2003-2004, investigators in the Development of Antiretroviral Therapy in Africa (DART) Trial randomized persons initiating ART in Uganda and Zimbabwe to either laboratory and clinical monitoring (LCM) or clinically driven monitoring (CDM). CD4 cell counts were measured every 12 weeks in both groups but were only returned to treating clinicians for management in the LCM group. Follow-up continued through 2008. In observational analyses, dynamic marginal structural models on pooled randomized groups were used to estimate survival under different monitoring-frequency and clinical/immunological switching strategies. Assumptions included no direct effect of randomized group on mortality or confounders and no unmeasured confounders which influenced treatment switch and mortality or treatment switch and time-dependent covariates. After 48 weeks of first-line ART, 2,946 individuals contributed 11,351 person-years of follow-up, 625 switches, and 179 deaths. The estimated survival probability after a further 240 weeks for post-48-week switch at the first CD4 cell count less than 100 cells/mm(3) or non-Candida World Health Organization stage 4 event (with CD4 count <250) was 0.96 (95% confidence interval (CI): 0.94, 0.97) with 12-weekly CD4 testing, 0.96 (95% CI: 0.95, 0.97) with 24-weekly CD4 testing, 0.95 (95% CI: 0.93, 0.96) with a single CD4 test at 48 weeks (baseline), and 0.92 (95% CI: 0.91, 0.94) with no CD4 testing. Comparing randomized groups by 48-week CD4 count, the mortality risk associated with CDM versus LCM was greater in persons with CD4 counts of <100 (hazard ratio = 2.4, 95% CI: 1.3, 4.3) than in those with CD4 counts of ≥100 (hazard ratio = 1.1, 95% CI: 0.8, 1.7; interaction P = 0.04). These findings support a benefit from identifying patients immunologically failing first-line ART at 48 weeks.

Jaunmuktane Z, Mead S, Ellis M, Wadsworth JDF, Nicoll AJ, Kenny J, Launchbury F, Linehan J, Richard-Loendt A, Walker AS et al. 2015. Evidence for human transmission of amyloid-β pathology and cerebral amyloid angiopathy. Nature, 525 (7568), pp. 247-250.

More than two hundred individuals developed Creutzfeldt-Jakob disease (CJD) worldwide as a result of treatment, typically in childhood, with human cadaveric pituitary-derived growth hormone contaminated with prions. Although such treatment ceased in 1985, iatrogenic CJD (iCJD) continues to emerge because of the prolonged incubation periods seen in human prion infections. Unexpectedly, in an autopsy study of eight individuals with iCJD, aged 36-51 years, in four we found moderate to severe grey matter and vascular amyloid-β (Aβ) pathology. The Aβ deposition in the grey matter was typical of that seen in Alzheimer's disease and Aβ in the blood vessel walls was characteristic of cerebral amyloid angiopathy and did not co-localize with prion protein deposition. None of these patients had pathogenic mutations, APOE ε4 or other high-risk alleles associated with early-onset Alzheimer's disease. Examination of a series of 116 patients with other prion diseases from a prospective observational cohort study showed minimal or no Aβ pathology in cases of similar age range, or a decade older, without APOE ε4 risk alleles. We also analysed pituitary glands from individuals with Aβ pathology and found marked Aβ deposition in multiple cases. Experimental seeding of Aβ pathology has been previously demonstrated in primates and transgenic mice by central nervous system or peripheral inoculation with Alzheimer's disease brain homogenate. The marked deposition of parenchymal and vascular Aβ in these relatively young patients with iCJD, in contrast with other prion disease patients and population controls, is consistent with iatrogenic transmission of Aβ pathology in addition to CJD and suggests that healthy exposed individuals may also be at risk of iatrogenic Alzheimer's disease and cerebral amyloid angiopathy. 
These findings should also prompt investigation of whether other known iatrogenic routes of prion transmission may also be relevant to Aβ and other proteopathic seeds associated with neurodegenerative and other human diseases.

Arenas-Pinto A, Thompson J, Musoro G, Musana H, Lugemwa A, Kambugu A, Mweemba A, Atwongyeire D, Thomason MJ, Walker AS et al. 2016. Peripheral neuropathy in HIV patients in sub-Saharan Africa failing first-line therapy and the response to second-line ART in the EARNEST trial. J Neurovirol, 22 (1), pp. 104-113.

Sensory peripheral neuropathy (PN) remains a common complication in HIV-positive patients despite effective combination anti-retroviral therapy (ART). Data on PN on second-line ART is scarce. We assessed PN using a standard tool in patients failing first-line ART and for 96 weeks following a switch to PI-based second-line ART in a large Randomised Clinical Trial in Sub-Saharan Africa. Factors associated with PN were investigated using logistic regression. Symptomatic PN (SPN) prevalence was 22% at entry (N = 1,251) and was associated (p < 0.05) with older age (OR = 1.04 per year), female gender (OR = 1.64), Tuberculosis (TB; OR = 1.86), smoking (OR = 1.60), higher plasma creatinine (OR = 1.09 per 0.1 mg/dl increase), CD4 count (OR = 0.83 per doubling) and not consuming alcohol (OR = 0.55). SPN prevalence decreased to 17% by week 96 (p = 0.0002) following similar trends in all study groups (p = 0.30). Asymptomatic PN (APN) increased over the same period from 21 to 29% (p = 0.0002). Signs suggestive of PN (regardless of symptoms) returned to baseline levels by week 96. At weeks 48 and 96, after adjusting for time-updated associations above and baseline CD4 count and viral load, SPN was strongly associated with TB (p < 0.0001). In summary, SPN prevalence was significantly reduced with PI-based second-line therapy across all treatment groups, but we did not find any advantage to the NRTI-free regimens. The increase of APN and stability of PN-signs regardless of symptoms suggest an underlying trend of neuropathy progression that may be masked by reduction of symptoms accompanying general health improvement induced by second-line ART. SPN was strongly associated with isoniazid given for TB treatment.

George EC, Walker AS, Kiguli S, Olupot-Olupot P, Opoka RO, Engoru C, Akech SO, Nyeko R, Mtove G, Reyburn H et al. 2015. Predicting mortality in sick African children: the FEAST Paediatric Emergency Triage (PET) Score. BMC Med, 13 (1), pp. 174.

BACKGROUND: Mortality in paediatric emergency care units in Africa often occurs within the first 24 h of admission and remains high. Alongside effective triage systems, a practical clinical bedside risk score to identify those at greatest risk could contribute to reducing mortality. METHODS: Data collected during the Fluid Expansion As Supportive Therapy (FEAST) trial, a multi-centre trial involving 3,170 severely ill African children, were analysed to identify clinical and laboratory prognostic factors for mortality. Multivariable Cox regression was used to build a model in this derivation dataset based on clinical parameters that could be quickly and easily assessed at the bedside. A score developed from the model coefficients was externally validated in two admissions datasets from Kilifi District Hospital, Kenya, and compared to published risk scores using the Area Under the Receiver Operating Characteristic curve (AUROC) and Hosmer-Lemeshow tests. The Net Reclassification Index (NRI) was used to identify additional laboratory prognostic factors. RESULTS: A risk score using 8 clinical variables (temperature, heart rate, capillary refill time, conscious level, severe pallor, respiratory distress, lung crepitations, and weak pulse volume) was developed. The score ranged from 0-10 and had an AUROC of 0.82 (95% CI, 0.77-0.87) in the FEAST trial derivation set. In the independent validation datasets, the score had an AUROC of 0.77 (95% CI, 0.72-0.82) amongst admissions to a paediatric high dependency ward and 0.86 (95% CI, 0.82-0.89) amongst general paediatric admissions. This discriminative ability was similar to, or better than, other risk scores in the validation datasets. NRI identified lactate, blood urea nitrogen, and pH to be important prognostic laboratory variables that could add information to the clinical score.
CONCLUSIONS: Eight clinical prognostic factors that could be rapidly assessed by healthcare staff for triage were combined to create the FEAST Paediatric Emergency Triage (PET) score and externally validated. The score discriminated those at highest risk of fatal outcome at the point of hospital admission and compared well to other published risk scores. Further laboratory tests were also identified as prognostic factors which could be added if resources were available or as indices of severity for comparison between centres in future research studies.
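The AUROC used to validate the score can be read as the probability that a randomly selected child who died scored higher than one who survived, with ties counted as half. A rank-based (Mann-Whitney) sketch on illustrative scores, not trial data:

```python
def auroc(scores, outcomes):
    """AUROC via the rank (Mann-Whitney) formulation: the probability that
    a randomly chosen case with the outcome scores higher than one without,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, outcomes) if y]
    neg = [s for s, y in zip(scores, outcomes) if not y]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative PET-style scores (0-10); outcome True = died
scores   = [9, 7, 6, 5, 5, 3, 2, 1]
outcomes = [True, True, False, True, False, False, False, False]
print(auroc(scores, outcomes))
```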

Fitzpatrick JM, Biswas JS, Edgeworth JD, Islam J, Jenkins N, Judge R, Lavery AJ, Melzer M, Morris-Jones S, Nsutebu EF et al. 2016. Gram-negative bacteraemia; a multi-centre prospective evaluation of empiric antibiotic therapy and outcome in English acute hospitals. Clin Microbiol Infect, 22 (3), pp. 244-251.

Increasing antibiotic resistance makes choosing antibiotics for suspected Gram-negative infection challenging. This study set out to identify key determinants of mortality among patients with Gram-negative bacteraemia, focusing particularly on the importance of appropriate empiric antibiotic treatment. We conducted a prospective observational study of 679 unselected adults with Gram-negative bacteraemia at ten acute English hospitals between October 2013 and March 2014. Appropriate empiric antibiotic treatment was defined as intravenous treatment on the day of blood culture collection with an antibiotic to which the cultured organism was sensitive in vitro. Mortality analyses were adjusted for patient demographics, co-morbidities and illness severity. The majority of bacteraemias were community-onset (70%); most were caused by Escherichia coli (65%), Klebsiella spp. (15%) or Pseudomonas spp. (7%). Main foci of infection were urinary tract (51%), abdomen/biliary tract (20%) and lower respiratory tract (14%). The main antibiotics used were co-amoxiclav (32%) and piperacillin-tazobactam (30%), with 34% receiving combination therapy (predominantly aminoglycosides). Empiric treatment was inappropriate in 34%. All-cause mortality was 8% at 7 days and 15% at 30 days. Independent predictors of mortality (p <0.05) included older age, greater burden of co-morbid disease, severity of illness at presentation and inflammatory response. Inappropriate empiric antibiotic therapy was not associated with mortality at either time-point (adjusted OR 0.82; 95% CI 0.35-1.94 and adjusted OR 0.92; 95% CI 0.50-1.66, respectively). Although our study does not exclude an impact of empiric antibiotic choice on survival in Gram-negative bacteraemia, outcome is determined primarily by patient and disease factors.

Musoke P, Szubert AJ, Musiime V, Nathoo K, Nahirya-Ntege P, Mutasa K, Williams DE, Prendergast AJ, Spyer M, Walker AS et al. 2015. Single-dose nevirapine exposure does not affect response to antiretroviral therapy in HIV-infected African children aged below 3 years. AIDS, 29 (13), pp. 1623-1632.

OBJECTIVES: To assess the impact of exposure to single-dose nevirapine (sdNVP) on virological response in young Ugandan/Zimbabwean children (<3 years) initiating antiretroviral therapy (ART), and to investigate other predictors of response. DESIGN: Observational analysis within the ARROW randomized trial. METHODS: sdNVP exposure was ascertained by the caregiver's self-report when the child initiated non-nucleoside reverse transcriptase inhibitor (NNRTI)-based ART. Viral load was assayed retrospectively over a median 4.1 years of follow-up. Multivariable logistic regression models were used to identify independent predictors of viral load below 80 copies/ml, 48 and 144 weeks after ART initiation (backwards elimination, exit P = 0.1). RESULTS: Median (IQR) age at ART initiation was 17 (10-23) months in 78 sdNVP-exposed children vs. 21 (14-27) months in 289 non-exposed children (36 vs. 20% <12 months). At week 48, 49 of 73 (67%) sdNVP-exposed and 154 of 272 (57%) non-exposed children had viral load below 80 copies/ml [adjusted odds ratio (aOR) 2.34 (1.26-4.34), P = 0.007]; 79 and 77% had viral load below 400 copies/ml. Suppression was significantly lower in males (P = 0.009), those with higher pre-ART viral load (P = 0.001), taking syrups (P = 0.05) and with lower self-reported adherence (P = 0.04). At week 144, 55 of 73 (75%) exposed and 188 of 272 (69%) non-exposed children had less than 80 copies/ml [aOR 1.75 (0.93-3.29), P = 0.08]. There was no difference between children with and without previous sdNVP exposure in intermediate/high-level resistance to NRTIs (P > 0.3) or NNRTIs (P > 0.1) (n = 88) at week 144. CONCLUSION: Given the limited global availability of lopinavir/ritonavir, its significant formulation challenges in young children, and the significant paediatric treatment gap, tablet fixed-dose-combination NVP-based ART remains a good alternative to syrup lopinavir-based ART for children, particularly those over 1 year and even if exposed to sdNVP.

Vermund SH, Walker AS. 2016. Use of Pharmacokinetic Data in Novel Analyses to Determine the Effect of Topical Microbicides as Preexposure Prophylaxis Against HIV Infection. J Infect Dis, 213 (3), pp. 329-331.

Tonkin-Crine S, Walker AS, Butler CC. 2015. Contribution of behavioural science to antibiotic stewardship. BMJ, 350 (jun25 8), pp. h3413.

Planche T, Wilcox M, Walker AS. 2015. Fecal-Free Toxin Detection Remains the Best Way to Detect Clostridium difficile Infection. Clin Infect Dis, 61 (7), pp. 1210-1211.

Stoesser N, Sheppard AE, Moore CE, Golubchik T, Parry CM, Nget P, Saroeun M, Day NPJ, Giess A, Johnson JR et al. 2015. Extensive Within-Host Diversity in Fecally Carried Extended-Spectrum-Beta-Lactamase-Producing Escherichia coli Isolates: Implications for Transmission Analyses. J Clin Microbiol, 53 (7), pp. 2122-2131.

Studies of the transmission epidemiology of antimicrobial-resistant Escherichia coli, such as strains harboring extended-spectrum beta-lactamase (ESBL) genes, frequently use selective culture of rectal surveillance swabs to identify isolates for molecular epidemiological investigation. Typically, only single colonies are evaluated, which risks underestimating species diversity and transmission events. We sequenced the genomes of 16 E. coli colonies from each of eight fecal samples (n = 127 genomes; one failure), taken from different individuals in Cambodia, a region of high ESBL-producing E. coli prevalence. Sequence data were used to characterize both the core chromosomal diversity of E. coli isolates and their resistance/virulence gene content as a proxy measure of accessory genome diversity. The 127 E. coli genomes represented 31 distinct sequence types (STs). Seven (88%) of eight subjects carried ESBL-positive isolates, all containing blaCTX-M variants. Diversity was substantial, with a median of four STs/individual (range, 1 to 10) and wide genetic divergence at the nucleotide level within some STs. In 2/8 (25%) individuals, the same blaCTX-M variant occurred in different clones, and/or different blaCTX-M variants occurred in the same clone. Patterns of other resistance genes and common virulence factors, representing differences in the accessory genome, were also diverse within and between clones. The substantial diversity among intestinally carried ESBL-positive E. coli bacteria suggests that fecal surveillance, particularly if based on single-colony subcultures, will likely underestimate transmission events, especially in high-prevalence settings.

Denning E, Sharma S, Smolskis M, Touloumi G, Walker S, Babiker A, Clewett M, Emanuel E, Florence E, Papadopoulos A et al. 2015. Reported consent processes and demographics: a substudy of the INSIGHT Strategic Timing of AntiRetroviral Treatment (START) trial. HIV Med, 16 Suppl 1 (S1), pp. 24-29.

OBJECTIVES: Efforts are needed to improve informed consent of participants in research. The Strategic Timing of AntiRetroviral Therapy (START) study provides a unique opportunity to study the effect of length and complexity of informed consent documents on understanding and satisfaction among geographically diverse participants. METHODS: Interested START sites were randomized to use either the standard consent form or the concise consent form for all of the site's participants. RESULTS: A total of 4473 HIV-positive participants at 154 sites world-wide took part in the Informed Consent Substudy, with consent given in 11 primary languages. Most sites sent written information to potential participants in advance of clinic visits, usually including the consent form. At about half the sites, staff reported spending less than an hour per participant in the consent process. The vast majority of sites assessed participant understanding using informal nonspecific questions or clinical judgment. CONCLUSIONS: These data reflect the interest of START research staff in evaluating the consent process and improving informed consent. The START Informed Consent Substudy is by far the largest study of informed consent intervention ever conducted. Its results have the potential to impact how consent forms are written around the world.

Szubert AJ, Musiime V, Bwakura-Dangarembizi M, Nahirya-Ntege P, Kekitiinwa A, Gibb DM, Nathoo K, Prendergast AJ, Walker AS, ARROW Trial Team. 2015. Pubertal development in HIV-infected African children on first-line antiretroviral therapy. AIDS, 29 (5), pp. 609-618.

OBJECTIVES: To estimate age at attaining Tanner stages in Ugandan/Zimbabwean HIV-infected children initiating antiretroviral therapy (ART) in older childhood and investigate predictors of delayed puberty, particularly age at ART initiation. DESIGN: Observational analysis within a randomized trial. METHODS: Tanner staging was assessed every 24 weeks from 10 years of age, menarche every 12 weeks and height every 4-6 weeks. Age at attaining different Tanner stages was estimated using normal interval regression, considering predictors using multivariable regression. Growth was estimated using multilevel models with child-specific intercepts and trajectories. RESULTS: Median age at ART initiation was 9.4 years (inter-quartile range 7.8, 11.3) (n = 582). At the first assessment, the majority (80.2%) were in Tanner stage 1; median follow-up with staging was 2.8 years. There was a strong delaying effect of older age at ART initiation on age at attaining all Tanner stages (P < 0.05) and menarche (P = 0.02); in boys the delaying effect generally weakened with older age. There were additional significant delays associated with greater impairments in pre-ART height-for-age Z-score (P < 0.05) in both sexes and pre-ART BMI-for-age in girls (P < 0.05). There was no evidence that pre-ART immuno-suppression independently delayed puberty or menarche. However, older children/adolescents had significant growth spurts in intermediate Tanner stages, and were still significantly increasing their height when in Tanner stage 5 (P < 0.01). CONCLUSION: Delaying ART initiation until older childhood substantially delays pubertal development and menarche, independently of immuno-suppression. This highlights that factors other than CD4, such as pubertal development, need consideration when making decisions about timing of ART initiation in older children.

Eyre DW, Tracey L, Elliott B, Slimings C, Huntington PG, Stuart RL, Korman TM, Kotsiou G, McCann R, Griffiths D et al. 2015. Emergence and spread of predominantly community-onset Clostridium difficile PCR ribotype 244 infection in Australia, 2010 to 2012. Euro Surveill, 20 (10), pp. 21059.

We describe an Australia-wide Clostridium difficile outbreak in 2011 and 2012 involving the previously uncommon ribotype 244. In Western Australia, 14 of 25 cases were community-associated, 11 were detected in patients younger than 65 years, 14 presented to emergency/outpatient departments, and 14 to non-tertiary/community hospitals. Using whole genome sequencing, we confirm ribotype 244 is from the same C. difficile clade as the epidemic ribotype 027. Like ribotype 027, it produces toxins A, B and binary toxin; however, it is fluoroquinolone-susceptible and differs from ribotype 027 by thousands of single nucleotide variants. Fifteen outbreak isolates from across Australia were sequenced. Despite their geographic separation, all were genetically highly related without evidence of geographic clustering, consistent with a point source, for example affecting the national food chain. Comparison with reference laboratory strains revealed the outbreak clone shared a common ancestor with isolates from the United States and United Kingdom (UK). A strain obtained in the UK was phylogenetically related to our outbreak. Follow-up of that case revealed the patient had recently returned from Australia. Our data demonstrate new C. difficile strains are an on-going threat, with potential for rapid spread. Active surveillance is needed to identify and control emerging lineages.
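Genomic comparisons like the one above hinge on counting single nucleotide variants between isolates. As a toy sketch only (a real pipeline calls variants against a reference genome rather than comparing raw strings), pairwise SNV distance over pre-aligned sequences of equal length reduces to a Hamming distance; the function name `snv_distance` is ours:

```python
def snv_distance(seq1, seq2):
    """Count single-nucleotide differences between two aligned
    sequences of equal length. Toy stand-in for a WGS pipeline,
    which would map reads and call variants against a reference."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    return sum(1 for a, b in zip(seq1, seq2) if a != b)

# Closely related outbreak isolates differ by few SNVs;
# distinct lineages (e.g. ribotype 244 vs 027) by thousands.
print(snv_distance("ACGTACGT", "ACGTACCT"))
```

Thresholds on such distances (few SNVs = plausible transmission; thousands = distinct lineage) underpin the clustering statements in the abstract.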

Votintseva AA, Pankhurst LJ, Anson LW, Morgan MR, Gascoyne-Binzi D, Walker TM, Quan TP, Wyllie DH, Del Ojo Elias C, Wilcox M et al. 2015. Mycobacterial DNA extraction for whole-genome sequencing from early positive liquid (MGIT) cultures. J Clin Microbiol, 53 (4), pp. 1137-1143.

We developed a low-cost and reliable method of DNA extraction from as little as 1 ml of early positive mycobacterial growth indicator tube (MGIT) cultures that is suitable for whole-genome sequencing to identify mycobacterial species and predict antibiotic resistance in clinical samples. The DNA extraction method is based on ethanol precipitation supplemented by pretreatment steps with a MolYsis kit or saline wash for the removal of human DNA and a final DNA cleanup step with solid-phase reversible immobilization beads. The protocol yielded ≥0.2 ng/μl of DNA for 90% (MolYsis kit) and 83% (saline wash) of positive MGIT cultures. A total of 144 (94%) of the 154 samples sequenced on the MiSeq platform (Illumina) achieved the target of 1 million reads, with <5% of reads derived from human or nasopharyngeal flora for 88% and 91% of samples, respectively. A total of 59 (98%) of 60 samples that were identified by the national mycobacterial reference laboratory (NMRL) as Mycobacterium tuberculosis were successfully mapped to the H37Rv reference, with >90% coverage achieved. The DNA extraction protocol, therefore, will facilitate fast and accurate identification of mycobacterial species and resistance using a range of bioinformatics tools.

Church JA, Fitzgerald F, Walker AS, Gibb DM, Prendergast AJ. 2015. The expanding role of co-trimoxazole in developing countries. Lancet Infect Dis, 15 (3), pp. 327-339.

Co-trimoxazole is an inexpensive, broad-spectrum antimicrobial drug that is widely used in developing countries. Before antiretroviral therapy (ART) scale-up, co-trimoxazole prophylaxis reduced morbidity and mortality in adults and children with HIV by preventing bacterial infections, diarrhoea, malaria, and Pneumocystis jirovecii pneumonia, despite high levels of microbial resistance. Co-trimoxazole prophylaxis reduces early mortality by 58% (95% CI 39-71) in adults starting ART. Co-trimoxazole provides ongoing protection against malaria and non-malaria infections after immune reconstitution in ART-treated individuals in sub-Saharan Africa, leading to a change in WHO guidelines, which now recommend long-term co-trimoxazole prophylaxis for adults and children in settings with a high prevalence of malaria or severe bacterial infections. Co-trimoxazole prophylaxis is recommended for HIV-exposed infants from age 4-6 weeks; however, the risks and benefits of co-trimoxazole during infancy are unclear. Co-trimoxazole prophylaxis reduces anaemia and improves growth in children with HIV, possibly by reducing inflammation, either through direct immunomodulatory activity or through effects on the intestinal microbiota leading to reduced microbial translocation. Ongoing trials are now assessing the ability of adjunctive co-trimoxazole to reduce mortality in children after severe anaemia or severe acute malnutrition. In this Review, we discuss the mechanisms of action, benefits and risks, and clinical trials of co-trimoxazole in developing countries.

Stoesser N, Sheppard AE, Shakya M, Sthapit B, Thorson S, Giess A, Kelly D, Pollard AJ, Peto TEA, Walker AS, Crook DW. 2015. Dynamics of MDR Enterobacter cloacae outbreaks in a neonatal unit in Nepal: insights using wider sampling frames and next-generation sequencing. J Antimicrob Chemother, 70 (4), pp. 1008-1015.

OBJECTIVES: There are limited data on Enterobacter cloacae outbreaks and fewer describing these in association with NDM-1. With whole-genome sequencing, we tested the hypothesis that a cluster of 16 E. cloacae bacteraemia cases in a Nepali neonatal unit represented a single clonal outbreak, using a wider set of epidemiologically unrelated clinical E. cloacae isolates for comparison. METHODS: Forty-three isolates were analysed, including 23 E. cloacae and 3 Citrobacter sp. isolates obtained from blood cultures from 16 neonates over a 3 month period. These were compared with two contemporaneous community-associated drug-resistant isolates from adults, a unit soap dispenser isolate and a set of historical invasive isolates (n=14) from the same geographical locality. RESULTS: There were two clear neonatal outbreaks and one isolated case in the unit. One outbreak was associated with an NDM-1 plasmid also identified in a historical community-associated strain. The smaller, second outbreak was likely associated with a contaminated soap dispenser. The two community-acquired adult cases and three sets of historical hospital-associated neonatal isolates represented four additional genetic clusters. CONCLUSIONS: E. cloacae infections in this context represent several different transmission networks, operating at the community/hospital and host strain/plasmid levels. Wide sampling frames and high-resolution typing methods are needed to describe the complex molecular epidemiology of E. cloacae outbreaks, which is not appropriately reflected by routine susceptibility phenotypes. Soap dispensers may represent a reservoir for E. cloacae and bacterial strains and plasmids may persist in hospitals and in the community for long periods, sporadically being involved in outbreaks of disease.

Llewelyn MJ, Hand K, Hopkins S, Walker AS. 2015. Antibiotic policies in acute English NHS trusts: implementation of 'Start Smart-Then Focus' and relationship with Clostridium difficile infection rates. J Antimicrob Chemother, 70 (4), pp. 1230-1235.

OBJECTIVES: The objective of this study was to establish how antibiotic prescribing policies at National Health Service (NHS) hospitals match the England Department of Health 'Start Smart-Then Focus' recommendations and relate to Clostridium difficile infection (CDI) rates. METHODS: Antibiotic pharmacists were surveyed regarding recommendations for empirical treatment of common syndromes ('Start Smart') and antimicrobial prescription reviews ('Focus') at their hospital trusts. If no response was provided, policy data were sought from trust websites and the MicroGuide app (Horizon Strategic Partners, UK). Empirical treatment recommendations were categorized as broad spectrum (a β-lactam penicillin/β-lactamase inhibitor, cephalosporin, quinolone or carbapenem) or narrow spectrum. CDI rates were gathered from the national mandatory surveillance system. RESULTS: Data were obtained for 105/145 English acute hospital trusts (72%). β-Lactam/β-lactamase inhibitor combinations were recommended extensively. Only for severe community-acquired pneumonia and pyelonephritis were narrow-spectrum agents recommended first line at a substantial number of trusts [42/105 (40%) and 50/105 (48%), respectively]. Policies commonly recommended dual therapy with aminoglycosides and β-lactams for abdominal sepsis [40/93 trusts (43%)] and undifferentiated severe sepsis [54/94 trusts (57%)]. Most policies recommended treating for ≥ 7 days for most indications. Nearly all policies [100/105 trusts (95%)] recommended antimicrobial prescription reviews, but only 46/96 respondents (48%) reported monitoring compliance. Independent predictors of higher CDI rates were recommending a broad-spectrum regimen for community-acquired pneumonia (P=0.06) and, counterintuitively, a recommended treatment duration of <48 h for nosocomial pneumonia (P=0.01). CONCLUSIONS: Hospital antibiotic policies in the NHS 'Start Smart' by recommending broad-spectrum antibiotics for empirical therapy, but this may have the unintended potential to increase the use of broad-spectrum antibiotics and risk of CDI unless better mechanisms are in place to improve 'Focus'.

Revill PA, Walker S, Mabugu T, Nathoo KJ, Mugyenyi P, Kekitinwa A, Munderi P, Bwakura-Dangarembizi M, Musiime V, Bakeera-Kitaka S et al. 2015. Opportunities for improving the efficiency of paediatric HIV treatment programmes. AIDS, 29 (2), pp. 201-210.

OBJECTIVES: To conduct two economic analyses addressing whether to: routinely monitor HIV-infected children on antiretroviral therapy (ART) clinically or with laboratory tests; continue or stop cotrimoxazole prophylaxis when children become stabilized on ART. DESIGN AND METHODS: The ARROW randomized trial investigated alternative strategies to deliver paediatric ART and cotrimoxazole prophylaxis in 1206 Ugandan/Zimbabwean children. Incremental cost-effectiveness and value of implementation analyses were undertaken. Scenario analyses investigated whether laboratory monitoring (CD4 tests for efficacy monitoring; haematology/biochemistry for toxicity) could be tailored and targeted to be delivered cost-effectively. Cotrimoxazole use was examined in malaria-endemic and non-endemic settings. RESULTS: Using all trial data, clinical monitoring delivered similar health outcomes to routine laboratory monitoring, but at a reduced cost, so was cost-effective. Continuing cotrimoxazole improved health outcomes at reduced costs. Restricting routine CD4 monitoring to after 52 weeks following ART initiation and removing toxicity testing was associated with an incremental cost-effectiveness ratio of $6084 per quality-adjusted life-year (QALY) across all age groups, but was much lower for older children (12+ years at initiation; incremental cost-effectiveness ratio = $769/QALY). Committing resources to improve cotrimoxazole implementation appears cost-effective. A healthcare system that could pay $600/QALY should be willing to spend up to $12.0 per patient-year to ensure continued provision of cotrimoxazole. CONCLUSION: Clinically driven monitoring of ART is cost-effective in most circumstances. Routine laboratory monitoring is generally not cost-effective at current prices, except possibly CD4 testing amongst adolescents initiating ART. Committing resources to ensure continued provision of cotrimoxazole in health facilities is more likely to represent an efficient use of resources.
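The cost-effectiveness logic above rests on two standard quantities: the incremental cost-effectiveness ratio (ICER, extra cost per QALY gained) and net monetary benefit at a given willingness-to-pay threshold. A minimal sketch with hypothetical numbers (the function names and the $400/0.5-QALY figures are ours for illustration, not trial results):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

def net_monetary_benefit(delta_cost, delta_qaly, wtp):
    """Net monetary benefit at willingness-to-pay `wtp` per QALY.
    Positive NMB means the ICER lies below the threshold."""
    return wtp * delta_qaly - delta_cost

# Hypothetical comparison: $400 extra cost buys 0.5 extra QALYs.
print(icer(400, 0.5))                       # dollars per QALY gained
print(net_monetary_benefit(400, 0.5, 600))  # negative: not cost-effective at $600/QALY
```

The same arithmetic is why the abstract's $6084/QALY strategy is unattractive to a system willing to pay $600/QALY, while the $769/QALY option for older children is not.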

Kiwuwa-Muyingo S, Kikaire B, Mambule I, Musana H, Musoro G, Gilks CF, Levin JB, Walker AS. 2014. Prevalence, incidence and predictors of peripheral neuropathy in African adults with HIV infection within the DART trial. AIDS, 28 (17), pp. 2579-2588.

OBJECTIVES: We investigated the prevalence, incidence and predictors of new peripheral neuropathy episodes in previously untreated, symptomatic HIV-infected Ugandan/Zimbabwean adults initiating zidovudine-based antiretroviral therapy (ART). DESIGN: An open-label, multicentre, randomized trial. METHODS: Peripheral neuropathy was self-reported at 12-weekly clinic visits. Cox regression models (excluding participants reporting preexisting peripheral neuropathy at ART initiation) considered sex; pre-ART WHO stage, age and CD4(+) cell count; CD4(+) cell count versus no CD4(+) cell count monitoring; and time-updated CD4(+) cell count, weight and use of stavudine, isoniazid and didanosine. RESULTS: Four hundred and twenty-one of 3316 (13%) patients reported preexisting peripheral neuropathy at ART initiation. Median (interquartile range, IQR) follow-up in 2895 participants without preexisting peripheral neuropathy was 4.9 (4.7-5.4) years. Three hundred and fifty-four (12%) took stavudine as first-line substitution and 518 (18%) took isoniazid during follow-up. Two hundred and ninety (11%) participants developed a new peripheral neuropathy episode, an incidence of 2.12 per 100 person-years. Eighteen (0.1%) had a grade 3/4 episode. Independent predictors of peripheral neuropathy were current stavudine use [adjusted hazard ratio (aHR) 4.16, 95% confidence interval (95% CI) 3.06-5.66], current isoniazid use [aHR 1.59 (95% CI 1.02-2.47)] and current didanosine use [aHR 1.60 (95% CI 1.19-2.14)]. Higher risks were independently associated with higher pre-ART weight [aHR (per +5 kg) 1.07 (95% CI 1.01-1.13)] and older age [aHR (per 10 years older) 1.29 (95% CI 1.12-1.49)], but there was no significant effect of sex (P = 0.13), pre-ART CD4(+) cell count (P = 0.91) or CD4(+) cell count monitoring (P = 0.73). CONCLUSION: Current stavudine, didanosine or isoniazid use continues to increase peripheral neuropathy risk, as do older age and higher weight at ART initiation; however, we found no evidence of increased risk in women, in contrast to previous studies. The incidence of peripheral neuropathy may now be lower in ART programmes, as stavudine and didanosine are no longer recommended. All patients receiving isoniazid, either as part of antituberculosis (TB) chemotherapy or TB-preventive therapy, should receive pyridoxine as recommended in national guidelines.
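The incidence figure above is a crude rate: events divided by person-years at risk, scaled to 100 person-years. A minimal sketch, assuming a back-calculated denominator of about 13,679 person-years (290 / 0.0212, our illustrative figure, not one reported in the paper):

```python
def incidence_per_100py(events, person_years):
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# 290 new peripheral neuropathy episodes at 2.12 per 100 person-years
# implies roughly 13,679 person-years at risk (back-calculated here
# purely for illustration).
print(round(incidence_per_100py(290, 13679), 2))
```

Cox regression then models how covariates such as current stavudine use multiply the underlying hazard that this crude rate summarises.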

Gomo ZA, Hakim JG, Walker SA, Tinago W, Mandozana G, Kityo C, Munderi P, Katabira E, Reid A, Gibb DM et al. 2014. Impact of second-line antiretroviral regimens on lipid profiles in an African setting: the DART trial sub-study. AIDS Res Ther, 11 (1), pp. 32.

BACKGROUND: Increasing numbers of HIV-infected patients in sub-Saharan Africa are exposed to antiretroviral therapy (ART), but there are few data on lipid changes on first-line ART, and even fewer on second-line. METHODS: DART was a randomized trial comparing monitoring strategies in Ugandan/Zimbabwean adults initiating first-line ART and switching to second-line at clinical/immunological failure. We evaluated fasting lipid profiles at second-line initiation and ≥48 weeks subsequently in stored samples from Zimbabwean patients switching before 18 September 2006. RESULTS: Of 91 patients switched to second-line ART, 65(73%) had fasting samples at switch and ≥48 weeks, 14(15%) died or were lost <48 weeks, 10(11%) interrupted ART for >14 days and 2(2%) had no samples available. 56/65(86%) received ZDV/d4T + 3TC + TDF first-line, 6(9%) ZDV/d4T + 3TC + NVP and 3(5%) ZDV + 3TC with TDF and NVP. Initial second-line regimens were LPV/r + NNRTI in 27(41%), LPV/r + NNRTI + ddI in 33(50%) and LPV/r + TDF + ddI/3TC/ZDV in 6(9%). At second-line initiation median (IQR) TC, LDL-C, HDL-C and TG (mmol/L) were 3.3(2.8-4.0), 1.7(1.3-2.2), 0.7(0.6-0.9) and 1.1(0.8-1.9) respectively. Levels were significantly increased 48 weeks later, by mean (SE) +2.0(0.1), +1.1(0.1), +0.5(0.05) and +0.4(0.2) respectively (p < 0.001; TG p = 0.01). 3% at switch vs 25% 48 weeks later had TC >5.2 mmol/L; 3% vs 25% LDL-C >3.4 mmol/L and 91% vs 41% HDL-C <1.1 mmol/L (p < 0.001). Similar proportions had TG >1.8 mmol/L (0 vs 3%) and TC/HDL-C ≥5 (40% vs 33%) (p > 0.15). CONCLUSION: Modest lipid elevations were observed in African patients on predominantly LPV/r + NNRTI-based second-line regimens. Routine lipid monitoring during second-line ART regimens may not be warranted in this setting but individual cardiovascular risk assessment should guide practice.

Stoesser N, Giess A, Batty EM, Sheppard AE, Walker AS, Wilson DJ, Didelot X, Bashir A, Sebra R, Kasarskis A et al. 2014. Genome sequencing of an extended series of NDM-producing Klebsiella pneumoniae isolates from neonatal infections in a Nepali hospital characterizes the extent of community- versus hospital-associated transmission in an endemic setting. Antimicrob Agents Chemother, 58 (12), pp. 7347-7357.

NDM-producing Klebsiella pneumoniae strains represent major clinical and infection control challenges, particularly in resource-limited settings with high rates of antimicrobial resistance. Determining whether transmission occurs at a gene, plasmid, or bacterial strain level and within hospital and/or the community has implications for monitoring and controlling spread. Whole-genome sequencing (WGS) is the highest-resolution typing method available for transmission epidemiology. We sequenced carbapenem-resistant K. pneumoniae isolates from 26 individuals involved in several infection case clusters in a Nepali neonatal unit and 68 other clinical Gram-negative isolates from a similar time frame, using Illumina and PacBio technologies. Within-outbreak chromosomal and closed-plasmid structures were generated and used as data set-specific references. Three temporally separated case clusters were caused by a single NDM K. pneumoniae strain with a conserved set of four plasmids, one being a 304,526-bp plasmid carrying bla(NDM-1). The plasmids contained a large number of antimicrobial/heavy metal resistance and plasmid maintenance genes, which may have explained their persistence. No obvious environmental/human reservoir was found. There was no evidence of transmission of outbreak plasmids to other Gram-negative clinical isolates, although bla(NDM) variants were present in other isolates in different genetic contexts. WGS can effectively define complex antimicrobial resistance epidemiology. Wider sampling frames are required to contextualize outbreaks. Infection control may be effective in terminating outbreaks caused by particular strains, even in areas with widespread resistance, although this study could not demonstrate evidence supporting specific interventions. Larger, detailed studies are needed to characterize resistance genes, vectors, and host strains involved in disease, to enable effective intervention.

Pankhurst L, Macfarlane-Smith L, Buchanan J, Anson L, Davies K, O'Connor L, Ashwin H, Pike G, Dingle KE, Peto TE et al. 2014. Can rapid integrated polymerase chain reaction-based diagnostics for gastrointestinal pathogens improve routine hospital infection control practice? A diagnostic study. Health Technol Assess, 18 (53), pp. 1-167.

BACKGROUND: Every year approximately 5000-9000 patients are admitted to a hospital with diarrhoea, which in up to 90% of cases has a non-infectious cause. As a result, single rooms are 'blocked' by patients with non-infectious diarrhoea, while patients with infectious diarrhoea are still in open bays because of a lack of free side rooms. A rapid test for differentiating infectious from non-infectious diarrhoea could be very beneficial for patients. OBJECTIVE: To evaluate MassCode multiplex polymerase chain reaction (PCR) for the simultaneous diagnosis of multiple enteropathogens directly from stool, in terms of sensitivity/specificity to detect four common important enteropathogens: Clostridium difficile, Campylobacter spp., Salmonella spp. and norovirus. DESIGN: A retrospective study of fixed numbers of samples positive for C. difficile (n = 200), Campylobacter spp. (n = 200), Salmonella spp. (n = 100) and norovirus (n = 200) plus samples negative for all these pathogens (n = 300). Samples were sourced from NHS microbiology laboratories in Oxford and Leeds where initial diagnostic testing was performed according to Public Health England methodology. Researchers carrying out MassCode assays were blind to this information. A questionnaire survey, examining current practice for infection control teams and microbiology laboratories managing infectious diarrhoea, was also carried out. SETTING: MassCode assays were carried out at Oxford University Hospitals NHS Trust. Further multiplex assays, carried out using Luminex, were run on the same set of samples at Leeds Teaching Hospitals NHS Trust. The questionnaire was completed by various NHS trusts. MAIN OUTCOME MEASURES: Sensitivity and specificity to detect C. difficile, Campylobacter spp., Salmonella spp. and norovirus. RESULTS: Nucleic acids were extracted from 948 clinical samples using an optimised protocol (200 Campylobacter spp., 199 C. difficile, 60 S. enterica, 199 norovirus and 295 negative samples; some samples contained more than one pathogen). Using the MassCode assay, sensitivities for each organism compared with standard microbiological testing ranged from 43% to 94% and specificities from 95% to 98%, with particularly poor performance for S. enterica. Relatively large numbers of unexpected positives not confirmed with quantitative PCR were also observed, particularly for S. enterica, Giardia lamblia and Cryptosporidium spp. As the results indicated that S. enterica detection might provide generic challenges to other multiplex assays for gastrointestinal pathogens, the Luminex xTag(®) gastrointestinal assay was also run blinded on the same extracts (937/948 remaining) and on re-extracted samples (839/948 with sufficient material). For Campylobacter spp., C. difficile and norovirus, high sensitivities (> 92%) and specificities (> 96%) were observed. For S. enterica, on the original MassCode/Oxford extracts, Luminex sensitivity compared with standard microbiological testing was 84% [95% confidence interval (CI) 73% to 93%], but this dropped to 46% on a fresh extract, very similar to MassCode, with a corresponding increase in specificity from 92% to 99%. Overall agreement on the per-sample diagnosis compared with combined microbiology plus PCR for the main four/all pathogens was 85.6%/64.7%, 87.0%/82.9% and 89.8%/86.8% for the MassCode assay, Luminex assay/MassCode extract and Luminex assay/fresh extract, respectively. Luminex assay results from fresh extracts implied that 5% of samples did not represent infectious diarrhoea, even though enteropathogens were genuinely present. Managing infectious diarrhoea was a significant burden for infection control teams (taking 21% of their time) and better diagnostics were identified as having major potential benefits for patients. CONCLUSIONS: Overall, the Luminex xTag gastrointestinal panel showed similar or superior sensitivity and specificity to the MassCode assay. However, on fresh extracts, this test had low sensitivity to detect a key enteric pathogen, S. enterica, making it an unrealistic option for most microbiology laboratories. Extraction efficiency appears to be a major obstacle for nucleic acid-based tests for this organism, and possibly the whole Enterobacteriaceae family. To improve workflows in service microbiology laboratories, to reduce workload for infection control practitioners, and to improve outcomes for NHS patients, further research on deoxyribonucleic acid-based multiplex gastrointestinal diagnostics is urgently needed. FUNDING: The Health Technology Assessment programme of the National Institute for Health Research.
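The diagnostic accuracy figures above follow directly from confusion-table counts judged against the reference standard. A minimal sketch with hypothetical counts (the function name `sens_spec` and the 180/20/285/15 figures are ours for illustration, not from the study):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN): fraction of reference-positive samples
    the assay detects. Specificity = TN/(TN+FP): fraction of
    reference-negative samples it correctly reports negative."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one pathogen: 180 of 200 true positives
# detected, 285 of 300 true negatives correctly reported negative.
sens, spec = sens_spec(tp=180, fn=20, tn=285, fp=15)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

Low sensitivity for S. enterica (many false negatives despite good specificity) is exactly the pattern this decomposition makes visible.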

Bwakura-Dangarembizi M, Musiime V, Szubert AJ, Prendergast AJ, Gomo ZA, Thomason MJ, Musarurwa C, Mugyenyi P, Nahirya P, Kekitiinwa A et al. 2015. Prevalence of lipodystrophy and metabolic abnormalities in HIV-infected African children after 3 years on first-line antiretroviral therapy. Pediatr Infect Dis J, 34 (2), pp. e23-e31.

BACKGROUND: Most pediatric lipodystrophy data come from high-income/middle-income countries, but most HIV-infected children live in sub-Saharan Africa, where lipodystrophy studies have predominantly investigated stavudine-based regimens. METHODS: Three years after antiretroviral therapy (ART) initiation, body circumferences and skinfold thicknesses were measured (n = 590), and fasted lipid profile assayed (n = 325), in children from 2 ARROW trial centres in Uganda/Zimbabwe. Analyses compared randomization to long-term versus short-term versus no zidovudine from ART initiation [unadjusted; latter 2 groups receiving abacavir+lamivudine+non-nucleoside-reverse-transcriptase-inhibitor (NNRTI) long-term], and nonrandomized (confounder-adjusted) receipt of nevirapine versus efavirenz. RESULTS: Body circumferences and skinfold thicknesses were similar regardless of zidovudine exposure (P > 0.1), except for subscapular and supra-iliac skinfolds-for-age which were greater with long-term zidovudine (0.006 < P < 0.047). Circumferences/skinfolds were also similar with efavirenz and nevirapine (adjusted P > 0.09; 0.02 < P < 0.03 for waist/waist-hip-ratio). Total and high-density lipoprotein (HDL)-cholesterol, HDL/triglyceride-ratio (P < 0.0001) and triglycerides (P = 0.01) were lower with long-term zidovudine. Low-density lipoprotein (LDL)-cholesterol was higher with efavirenz than nevirapine (P < 0.001). Most lipids remained within normal ranges (75% cholesterol, 85% LDL and 100% triglycerides) but more on long-term zidovudine (3 NRTI) had abnormal HDL-cholesterol (88% vs. 40% short/no-zidovudine, P < 0.0001). Only 8/579 (1.4%) children had clinical fat wasting (5 grade 1; 3 grade 2); 2 (0.3%) had grade 1 fat accumulation.
CONCLUSIONS: Long-term zidovudine-based ART is associated with similar body circumferences and skinfold thicknesses to abacavir-based ART, with low rates of lipid abnormalities and clinical lipodystrophy, providing reassurance where national programs now recommend long-term zidovudine. Efavirenz and nevirapine were also similar; however, the higher LDL observed with efavirenz and lower HDL observed with zidovudine suggests that zidovudine+lamivudine+efavirenz should be investigated in future.

Paton NI, Kityo C, Hoppe A, Reid A, Kambugu A, Lugemwa A, van Oosterhout JJ, Kiconco M, Siika A, Mwebaze R et al. 2014. Assessment of second-line antiretroviral regimens for HIV therapy in Africa. N Engl J Med, 371 (3), pp. 234-247.

BACKGROUND: The efficacy and toxic effects of nucleoside reverse-transcriptase inhibitors (NRTIs) are uncertain when these agents are used with a protease inhibitor in second-line therapy for human immunodeficiency virus (HIV) infection in resource-limited settings. Removing the NRTIs or replacing them with raltegravir may provide a benefit. METHODS: In this open-label trial in sub-Saharan Africa, we randomly assigned 1277 adults and adolescents with HIV infection and first-line treatment failure to receive a ritonavir-boosted protease inhibitor (lopinavir-ritonavir) plus clinician-selected NRTIs (NRTI group, 426 patients), a protease inhibitor plus raltegravir in a superiority comparison (raltegravir group, 433 patients), or protease-inhibitor monotherapy after 12 weeks of induction therapy with raltegravir in a noninferiority comparison (monotherapy group, 418 patients). The primary composite end point, good HIV disease control, was defined as survival with no new World Health Organization stage 4 events, a CD4+ count of more than 250 cells per cubic millimeter, and a viral load of less than 10,000 copies per milliliter or 10,000 copies or more with no protease resistance mutations at week 96 and was analyzed with the use of imputation of data (≤4%). RESULTS: Good HIV disease control was achieved in 60% of the patients (mean, 255 patients) in the NRTI group, 64% of the patients (mean, 277) in the raltegravir group (P=0.21 for the comparison with the NRTI group; superiority of raltegravir not shown), and 55% of the patients (mean, 232) in the monotherapy group (noninferiority of monotherapy not shown, based on a 10-percentage-point margin). There was no significant difference in rates of grade 3 or 4 adverse events among the three groups (P=0.82). The viral load was less than 400 copies per milliliter in 86% of patients in the NRTI group, 86% in the raltegravir group (P=0.97), and 61% in the monotherapy group (P<0.001). 
CONCLUSIONS: When given with a protease inhibitor in second-line therapy, NRTIs retained substantial virologic activity without evidence of increased toxicity, and there was no advantage to replacing them with raltegravir. Virologic control was inferior with protease-inhibitor monotherapy. (Funded by European and Developing Countries Clinical Trials Partnership and others; EARNEST Current Controlled Trials number, ISRCTN37737787, and ClinicalTrials.gov number, NCT00988039.).

Musiime V, Fillekes Q, Kekitiinwa A, Kendall L, Keishanyu R, Namuddu R, Young N, Opilo W, Lallemant M, Walker AS et al. 2014. The pharmacokinetics and acceptability of lopinavir/ritonavir minitab sprinkles, tablets, and syrups in African HIV-infected children. J Acquir Immune Defic Syndr, 66 (2), pp. 148-154.

BACKGROUND: Guidelines recommend lopinavir/ritonavir (LPV/r) as first- and second-line therapy for young and older HIV-infected children, respectively. Available formulations have limitations making their widespread use complex. METHODS: An open-label comparative bioavailability (randomized crossover) study compared a novel twice-daily minitab sprinkle formulation (40 mg/10 mg, Cipla Pharmaceuticals) versus innovator syrup in HIV-infected Ugandan infants aged 3 to <12 months (cohort A) and children aged 1-4 years (cohort B) and versus Cipla tablets (100/25 mg) in children aged 4 to <13 years (cohort C). Twelve-hour intensive pharmacokinetic sampling after observed LPV/r intake (plus 2 nucleoside reverse transcriptase inhibitors) following World Health Organization 2010 dosing with food was performed 4 weeks after enrollment. Children then switched formulation; sampling was repeated at week 8. Acceptability data were also collected. RESULTS: Seventy-seven infants/children were included in cohort A (n = 19)/B (n = 26)/C (n = 32). Among 132 evaluable pharmacokinetic profiles, there were 13/21/25 within-child comparisons in cohort A/B/C. For minitabs versus syrup, geometric mean [95% confidence interval (CI)] AUC0-12h was 88.6 (66.7-117.6) versus 77.6 (49.5-121.5) h·mg/L in cohort A [geometric mean ratio (GMR) (90% CI) = 1.14 (0.71 to 1.85)] and 138.7 (118.2 to 162.6) versus 109.1 (93.7 to 127.1) h·mg/L in cohort B [GMR (90% CI) = 1.27 (1.10 to 1.46)]. For minitabs versus tablets, geometric mean (95% CI) AUC0-12h was 83.1 (66.7 to 103.5) versus 115.6 (103.0 to 129.7) h·mg/L; GMR (90% CI) = 0.72 (0.60 to 0.86). Subtherapeutic levels (<1.0 mg/L) occurred in 0 (0%)/2 (15%) minitabs/syrup in infants (P = 0.48), no children aged 1-4 years and 4 (16%)/1 (4%) minitabs/tablets (P = 0.35). 
13/17 (76%) and 19/26 (73%) caregivers of infants and children aged 1-4 years, respectively, chose to continue minitabs after week 8, mainly for convenience; only 7/29 (24%) older children (five <6 years) remained on minitabs. CONCLUSIONS: LPV/r exposure from minitabs was comparable with syrup, but lower than tablets, with no significant differences in subtherapeutic concentrations. Minitabs were more acceptable than syrups for younger children, but older children preferred tablets.
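An aside on the statistic used throughout this abstract: a geometric mean ratio (GMR) with its confidence interval is obtained by analysing within-subject differences of log-transformed AUCs and exponentiating back. A minimal Python sketch with invented paired AUC values (not trial data; the function name is ours, and a normal quantile stands in for the exact t quantile):

```python
import math
from statistics import NormalDist, mean, stdev

def gmr_with_ci(auc_test, auc_ref, level=0.90):
    """Geometric mean ratio of paired AUCs with an approximate CI.

    Works on within-subject log differences, as in a crossover design;
    uses a normal quantile as an approximation to the t quantile.
    """
    d = [math.log(t) - math.log(r) for t, r in zip(auc_test, auc_ref)]
    m = mean(d)
    se = stdev(d) / math.sqrt(len(d))
    z = NormalDist().inv_cdf(1 - (1 - level) / 2)
    return math.exp(m), math.exp(m - z * se), math.exp(m + z * se)

# Illustrative only: each 'test' AUC is exactly 1.27x its 'reference' AUC,
# so the GMR is 1.27 with a degenerate (zero-width) interval.
ref = [80.0, 100.0, 120.0, 150.0]
test = [r * 1.27 for r in ref]
gmr, lo, hi = gmr_with_ci(test, ref)
print(round(gmr, 2))
```

Exponentiating the mean log-ratio is what makes the result a *geometric* mean ratio, matching the h·mg/L AUC comparisons reported above.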

Jaganath D, Walker AS, Ssali F, Musiime V, Kiweewa F, Kityo C, Salata R, Mugyenyi P, DART Trial, ARROW Trial. 2014. HIV-associated anemia after 96 weeks on therapy: determinants across age ranges in Uganda and Zimbabwe. AIDS Res Hum Retroviruses, 30 (6), pp. 523-530.

Given the detrimental effects of HIV-associated anemia on morbidity, we determined factors associated with anemia after 96 weeks of antiretroviral therapy (ART) across age groups. An HIV-positive cohort (n=3,580) of children age 5-14, reproductive age adults 18-49, and older adults ≥50 from two randomized trials in Uganda and Zimbabwe was evaluated from initiation of therapy through 96 weeks. We conducted logistic and multinomial regression to evaluate common and differential determinants for anemia at 96 weeks on therapy. Prior to initiation of ART, the prevalence of anemia (age 5-11 <10.5 g/dl, 12-14 <11 g/dl, adult females <11 g/dl, adult males <12 g/dl) was 43%, which decreased to 13% at week 96 (p<0.001). Older adults had significantly higher odds of anemia compared to reproductive age adults (OR 2.60, 95% CI 1.44-4.70, p=0.002). Reproductive age females had significantly higher odds of anemia compared to men at week 96 (OR 2.56, 95% CI 1.92-3.40, p<0.001), and particularly greater odds of microcytic anemia compared to males in the same age group (p=0.001). Other common factors associated with anemia included low body mass index (BMI) and microcytosis; greater increases in CD4 count to week 96 were protective. Thus, while ART significantly reduced the prevalence of anemia at 96 weeks, 13% of the population continued to be anemic. Specific groups, such as reproductive age females and older adults, remain at greater odds of anemia, which may guide clinicians to pursue further evaluation and management.
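The age- and sex-specific haemoglobin cut-offs quoted in this abstract can be written as a small rule. The sketch below is purely illustrative: the function name is ours, and treating "adult" as age ≥18 follows the cohort's age bands (5-14 and 18+), so ages 15-17 are deliberately not covered.

```python
def is_anemic(hb_g_dl, age_years, sex=None):
    """Anemia cut-offs as listed in the abstract (g/dl).

    Cohort ages were 5-14 and >=18; sex ('F'/'M') only matters for
    adults. Illustrative sketch only, not the paper's code.
    """
    if 5 <= age_years <= 11:
        return hb_g_dl < 10.5
    if 12 <= age_years <= 14:
        return hb_g_dl < 11.0
    if age_years >= 18:
        return hb_g_dl < (12.0 if sex == "M" else 11.0)
    raise ValueError("age outside the cohort ranges")

print(is_anemic(10.2, 8))        # child aged 5-11, Hb < 10.5
print(is_anemic(11.5, 30, "M"))  # adult male, Hb < 12
print(is_anemic(11.5, 30, "F"))  # adult female, Hb >= 11
```

Note the asymmetry that drives the sex comparison in the abstract: an adult with Hb 11.5 g/dl counts as anemic if male but not if female.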

Price JR, Golubchik T, Wilson DJ, Crook DW, Walker AS, Peto TEA, Paul J, Llewelyn MJ. 2014. Reply to Mills and Linkin. Clin Infect Dis, 59 (5), pp. 752-753.

Buchanan J, Wordsworth S, O'Connor L, Pike G, Walker AS, Wilcox MH, Crook DW. 2015. Management of patients with suspected infectious diarrhoea in hospitals in England. J Hosp Infect, 90 (3), pp. 199-207.

BACKGROUND: Advances in molecular and genomic testing for patients with suspected infectious diarrhoea are on the horizon. It is important to understand how infection control and microbiology departments currently operate with respect to the management of these patients in order to assess the implications of more widespread diagnostic testing. However, there are few data available on current practice in this context. AIM: To describe current infection control and microbiologist practice across England with respect to the management of patients with suspected infectious diarrhoea. METHODS: Hospitals in England completed three questionnaires on current testing practice in this context. Questionnaire design was informed by current practice within the Oxford University Hospitals group. FINDINGS: Forty-one percent of hospitals completed at least one questionnaire. A notable proportion of staff time was devoted to the management of patients with suspected infectious diarrhoea. Staff training was generally good, but compliance with policy documents was only 80%. Cleaning and isolation policies varied across hospitals, suggesting that either these were not evidence-based, or that the evidence base is weak. There was more agreement on outbreak definitions, management, and cohorting policies. Stool-testing decisions were mainly driven by patient characteristics, whereas strain typing was infrequently used (except to investigate Clostridium difficile infections). Multiple practical difficulties associated with patient management were identified, along with a clear appetite for more widespread genomic diagnostic testing. CONCLUSION: Managing patients with suspected infectious diarrhoea is a major burden in England. Advances in testing practice in this context could have significant clinical and economic impacts.

Miller RR, Walker AS, Godwin H, Fung R, Votintseva A, Bowden R, Mant D, Peto TEA, Crook DW, Knox K. 2014. Dynamics of acquisition and loss of carriage of Staphylococcus aureus strains in the community: the effect of clonal complex. J Infect, 68 (5), pp. 426-439.

BACKGROUND: Staphylococcus aureus nasal carriage increases infection risk. However, few studies have investigated S. aureus acquisition/loss over >1 year, and fewer still used molecular typing. METHODS: 1123 adults attending five Oxfordshire general practices had nasal swabs taken. 571 were re-swabbed after one month then every two months for median two years. All S. aureus isolates were spa-typed. Risk factors were collected from interviews and medical records. RESULTS: 32% carried S. aureus at recruitment (<1% MRSA). Rates of spa-type acquisition were similar in participants S. aureus positive (1.4%/month) and negative (1.8%/month, P = 0.13) at recruitment. Rates were faster in those carrying clonal complex (CC)15 (adjusted (a)P = 0.03) or CC8 (including USA300) (aP = 0.001) at recruitment versus other CCs. 157/274 (57%) participants S. aureus positive at recruitment returning ≥ 12 swabs carried S. aureus consistently, of whom 135 carried the same spa-type. CC22 (including EMRSA-15) was more prevalent in long-term than intermittent spa-type carriers (aP = 0.03). Antibiotics transiently reduced carriage, but no other modifiable risk factors were found. CONCLUSIONS: Both transient and longer-term carriage exist; however, the approximately constant rates of S. aureus gain and loss suggest that 'never' or truly 'persistent' carriage are rare. Long-term carriage varies by strain, offering new explanations for the success of certain S. aureus clones.

Fillekes Q, Kendall L, Kitaka S, Mugyenyi P, Musoke P, Ndigendawani M, Bwakura-Dangarembizi M, Gibb DM, Burger D, Walker AS, ARROW Trial Team. 2014. Pharmacokinetics of zidovudine dosed twice daily according to World Health Organization weight bands in Ugandan HIV-infected children. Pediatr Infect Dis J, 33 (5), pp. 495-498.

Data on zidovudine pharmacokinetics in children dosed using World Health Organization weight bands are limited. Forty-five HIV-infected Ugandan children, aged 3.4 (2.6-6.2) years, had intensive pharmacokinetic sampling. Geometric mean zidovudine AUC0-12h was 3.0 h·mg/L, which is higher than previously observed in adults, and was independently higher in those receiving higher doses, younger and underweight children. Higher exposure was also marginally associated with lower hemoglobin.

Everitt RG, Didelot X, Batty EM, Miller RR, Knox K, Young BC, Bowden R, Auton A, Votintseva A, Larner-Svensson H et al. 2014. Mobile elements drive recombination hotspots in the core genome of Staphylococcus aureus. Nat Commun, 5 (1), pp. 3956.

Horizontal gene transfer is an important driver of bacterial evolution, but genetic exchange in the core genome of clonal species, including the major pathogen Staphylococcus aureus, is incompletely understood. Here we reveal widespread homologous recombination in S. aureus at the species level, in contrast to its near-complete absence between closely related strains. We discover a patchwork of hotspots and coldspots at fine scales falling against a backdrop of broad-scale trends in rate variation. Over megabases, homoplasy rates fluctuate 1.9-fold, peaking towards the origin-of-replication. Over kilobases, we find core recombination hotspots of up to 2.5-fold enrichment situated near fault lines in the genome associated with mobile elements. The strongest hotspots include regions flanking conjugative transposon ICE6013, the staphylococcal cassette chromosome (SCC) and genomic island νSaα. Mobile element-driven core genome transfer represents an opportunity for adaptation and challenges our understanding of the recombination landscape in predominantly clonal pathogens, with important implications for genotype-phenotype mapping.

Olupot-Olupot P, Engoru C, Thompson J, Nteziyaremye J, Chebet M, Ssenyondo T, Dambisya CM, Okuuny V, Wokulira R, Amorut D et al. 2014. Phase II trial of standard versus increased transfusion volume in Ugandan children with acute severe anemia. BMC Med, 12 (1), pp. 67.

BACKGROUND: Severe anemia (SA, hemoglobin <6 g/dl) is a leading cause of pediatric hospital admission in Africa, with significant in-hospital mortality. The underlying etiology is often infectious, but specific pathogens are rarely identified. Guidelines developed to encourage rational blood use recommend a standard volume of whole blood (20 ml/kg) for transfusion, but this is commonly associated with a frequent need for repeat transfusion and poor outcome. Evidence is lacking on what hemoglobin threshold criteria for intervention and volume are associated with the optimal survival outcomes. METHODS: We evaluated the safety and efficacy of a higher volume of whole blood (30 ml/kg; Tx30: n = 78) against the standard volume (20 ml/kg; Tx20: n = 82) in Ugandan children (median age 36 months (interquartile range (IQR) 13 to 53)) for 24-hour anemia correction (hemoglobin >6 g/dl: primary outcome) and 28-day survival. RESULTS: Median admission hemoglobin was 4.2 g/dl (IQR 3.1 to 4.9). Initial volume received followed the randomization strategy in 155 (97%) patients. By 24-hours, 70 (90%) children in the Tx30 arm had corrected SA compared to 61 (74%) in the Tx20 arm; cause-specific hazard ratio = 1.54 (95% confidence interval 1.09 to 2.18, P = 0.01). From admission to day 28 there was a greater hemoglobin increase from enrollment in Tx30 (global P <0.0001). Serious adverse events included one non-fatal allergic reaction and one death in the Tx30 arm. There were six deaths in the Tx20 arm (P = 0.12); three deaths were adjudicated as possibly related to transfusion, but none secondary to volume overload. CONCLUSION: A higher initial transfusion volume prescribed at hospital admission was safe and resulted in an accelerated hematological recovery in Ugandan children with SA. Future testing in a large, pragmatic clinical trial to establish the effect on short and longer-term survival is warranted. 
TRIAL REGISTRATION: ClinicalTrials.Gov identifier: NCT01461590 registered 26 October 2011.

Gough EK, Moodie EEM, Prendergast AJ, Johnson SMA, Humphrey JH, Stoltzfus RJ, Walker AS, Trehan I, Gibb DM, Goto R et al. 2014. The impact of antibiotics on growth in children in low and middle income countries: systematic review and meta-analysis of randomised controlled trials. BMJ, 348 (apr15 6), pp. g2267.

OBJECTIVES: To determine whether antibiotic treatment leads to improvements in growth in prepubertal children in low and middle income countries, to determine the magnitude of improvements in growth, and to identify moderators of this treatment effect. DESIGN: Systematic review and meta-analysis. DATA SOURCES: Medline, Embase, Scopus, the Cochrane central register of controlled trials, and Web of Science. STUDY SELECTION: Randomised controlled trials conducted in low or middle income countries in which an orally administered antibacterial agent was allocated by randomisation or minimisation and growth was measured as an outcome. Participants aged 1 month to 12 years were included. Control was placebo or non-antimicrobial intervention. RESULTS: Data were pooled from 10 randomised controlled trials representing 4316 children, across a variety of antibiotics, indications for treatment, treatment regimens, and countries. In random effects models, antibiotic use increased height by 0.04 cm/month (95% confidence interval 0.00 to 0.07) and weight by 23.8 g/month (95% confidence interval 4.3 to 43.3). After adjusting for age, effects on height were larger in younger populations and effects on weight were larger in African studies compared with other regions. CONCLUSION: Antibiotics have a growth promoting effect in prepubertal children in low and middle income countries. This effect was more pronounced for ponderal than for linear growth. The antibiotic growth promoting effect may be mediated by treatment of clinical or subclinical infections or possibly by modulation of the intestinal microbiota. Better definition of the mechanisms underlying this effect will be important to inform optimal and safe approaches to achieving healthy growth in vulnerable populations.
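The "random effects models" this abstract pools with are, in most such meta-analyses, fitted with the DerSimonian-Laird estimator. A compact sketch with invented study effects and variances (not the data from this review; the function name is ours):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird method.

    effects:   per-study effect estimates (e.g. growth difference, cm/month)
    variances: corresponding within-study variances
    Returns (pooled effect, standard error, tau^2 heterogeneity estimate).
    """
    w = [1.0 / v for v in variances]                # fixed-effect weights
    sw = sum(w)
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# With identical study effects there is no heterogeneity, so tau^2 is 0
pooled, se, tau2 = dersimonian_laird([0.04, 0.04, 0.04], [0.01, 0.02, 0.03])
print(round(pooled, 2), tau2)
```

When Cochran's Q exceeds its degrees of freedom, tau² becomes positive and the weights flatten towards equality, which is what lets smaller studies influence a random-effects pooled estimate more than in a fixed-effect analysis.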

Moore CE, Paul J, Foster D, Mahar SA, Griffiths D, Knox K, Peto TE, Walker AS, Crook DW, Oxford Invasive Pneumococcal Surveillance Group. 2014. Reduction of invasive pneumococcal disease 3 years after the introduction of the 13-valent conjugate vaccine in the Oxfordshire region of England. J Infect Dis, 210 (7), pp. 1001-1011.

BACKGROUND: The 7-valent pneumococcal conjugate (PCV7) vaccine's impact on invasive pneumococcal disease (IPD) is well described, but few reports exist on the additional impact of the 13-valent vaccine (PCV13). METHODS: We calculated the IPD incidence across all ages in a surveillance project following implementation of PCV7 (in September 2006) and PCV13 (in April 2010) in children aged <2 years (11 hospitals; 4935 cases). RESULTS: The overall incidence decreased from 10 cases/100 000 persons per year in 1996-1997 to 8 cases/100 000 persons per year in 2007-2008 and 7 cases/100 000 in 2012-2013. Declines were greater in children aged <2 years (from 37 cases/100 000 in 1996-1997 to 29 and 14 cases/100 000 in 2007-2008 and 2012-2013, respectively). The incidence of IPD due to PCV7 serotypes decreased in all ages after PCV7 introduction (P < .001), whereas the incidence of IPD due to the additional 6 serotypes in PCV13 and to nonvaccine types (NVTs) increased in children aged ≥2 years (P < .001 for both comparisons). The incidence of IPD due to the 6 additional serotypes in PCV13 declined significantly after PCV13 introduction in all ages (P ≤ .01), and the incidence of IPD due to NVTs declined significantly in children aged ≥2 years (P = .003). In 2011-2013, the overall incidences of IPD due to PCV7 serotypes, the 6 additional serotypes in PCV13, and NVTs were 0.3, 2.8, and 4.4 cases/100 000; the incidences among children aged <2 years were 0.9, 2.4, and 10.8 cases/100 000, respectively. CONCLUSIONS: The annual incidence of IPD due to vaccine serotypes (1-3 cases/100 000) among children aged <2 years and nontarget groups demonstrates the success of PCV7 and PCV13. A substantially higher incidence of IPD due to NVTs indicates the importance of ongoing surveillance and extension of vaccine polyvalency.

Walker TM, Lalor MK, Broda A, Ortega LS, Morgan M, Parker L, Churchill S, Bennett K, Golubchik T, Giess AP et al. 2014. Assessment of Mycobacterium tuberculosis transmission in Oxfordshire, UK, 2007-12, with whole pathogen genome sequences: an observational study. Lancet Respir Med, 2 (4), pp. 285-292.

BACKGROUND: Patients born outside the UK have contributed to a 20% rise in the UK's tuberculosis incidence since 2000, but their effect on domestic transmission is not known. Here we use whole-genome sequencing to investigate the epidemiology of tuberculosis transmission in an unselected population over 6 years. METHODS: We identified all residents with Oxfordshire postcodes with a Mycobacterium tuberculosis culture or a clinical diagnosis of tuberculosis between Jan 1, 2007, and Dec 31, 2012, using local databases and checking against the national Enhanced Tuberculosis Surveillance database. We used Illumina technology to sequence all available M tuberculosis cultures from identified cases. Sequences were clustered by genetic relatedness and compared retrospectively with contact investigations. The first patient diagnosed in each cluster was defined as the index case, with links to subsequent cases assigned first by use of any epidemiological linkage, then by genetic distance, and then by timing of diagnosis. FINDINGS: Although we identified 384 patients with a diagnosis of tuberculosis, country of birth was known for 380 and we sequenced isolates from 247 of 269 cases with culture-confirmed disease. 39 cases were genomically linked within 13 clusters, implying 26 local transmission events. Only 11 of 26 possible transmissions had been previously identified through contact tracing. Of seven genomically confirmed household clusters, five contained additional genomic links to epidemiologically unidentified non-household members. 255 (67%) patients were born in a country with high tuberculosis incidence, conferring a local incidence of 109 cases per 100,000 population per year in Oxfordshire, compared with 3·5 cases per 100,000 per year for those born in low-incidence countries. 
However, patients born in the low-incidence countries, predominantly UK, were more likely to have pulmonary disease (adjusted odds ratio 1·8 [95% CI 1·2-2·9]; p=0·009), social risk factors (4·4 [2·0-9·4]; p<0·0001), and be part of a local transmission cluster (4·8 [1·6-14·8]; p=0·006). INTERPRETATION: Although inward migration has contributed to the overall tuberculosis incidence, our findings suggest that most patients born in high-incidence countries reactivate latent infection acquired abroad and are not involved in local onward transmission. Systematic screening of new entrants could further improve tuberculosis control, but it is important that health care remains accessible to all individuals, especially high-risk groups, if tuberculosis control is not to be jeopardised. FUNDING: UK Clinical Research Collaboration (Wellcome Trust, Medical Research Council, National Institute for Health Research [NIHR]), and NIHR Oxford Biomedical Research Centre.

Hasse B, Walker AS, Fehr J, Furrer H, Hoffmann M, Battegay M, Calmy A, Fellay J, Di Benedetto C, Weber R et al. 2014. Co-trimoxazole prophylaxis is associated with reduced risk of incident tuberculosis in participants in the Swiss HIV Cohort Study. Antimicrob Agents Chemother, 58 (4), pp. 2363-2368.

Co-trimoxazole reduces mortality in HIV-infected adults with tuberculosis (TB), and in vitro data suggest potential antimycobacterial activity of co-trimoxazole. We aimed to evaluate whether prophylaxis with co-trimoxazole is associated with a decreased risk of incident TB in Swiss HIV Cohort Study (SHCS) participants. We determined the incidence of TB per 1,000 person-years from January 1992 to December 2012. Rates were analyzed separately in participants with current or no previous antiretroviral treatment (ART) using Poisson regression adjusted for CD4 cell count, sex, region of origin, injection drug use, and age. A total of 13,431 cohort participants contributed 107,549 person-years of follow-up: 182 patients had incident TB-132 (73%) before and 50 (27%) after ART initiation. The multivariable incidence rate ratios for cumulative co-trimoxazole exposure per year for persons with no previous ART and current ART were 0.70 (95% confidence interval [CI], 0.55 to 0.89) and 0.87 (95% CI, 0.74 to 1.0), respectively. Co-trimoxazole may prevent the development of TB among HIV-positive persons, especially among those with no previous ART.

Kityo C, Gibb DM, Gilks CF, Goodall RL, Mambule I, Kaleebu P, Pillay D, Kasirye R, Mugyenyi P, Walker AS et al. 2014. High level of viral suppression and low switch rate to second-line antiretroviral therapy among HIV-infected adult patients followed over five years: retrospective analysis of the DART trial. PLoS One, 9 (3), pp. e90772.

UNLABELLED: In contrast to resource-rich countries, most HIV-infected patients in resource-limited countries receive treatment without virological monitoring. There are few long-term data, in this setting, on rates of viral suppression or switch to second-line antiretroviral therapy. The DART trial compared clinically driven monitoring (CDM) versus routine laboratory (CD4/haematology/biochemistry) and clinical monitoring (LCM) in HIV-infected adults initiating therapy. There was no virological monitoring in either study group during follow-up, but viral load was measured in Ugandan participants at trial closure. Two thousand three hundred and seventeen (2317) participants from this country initiated antiretroviral therapy with zidovudine/lamivudine plus tenofovir (n = 1717), abacavir (n = 300), or nevirapine (n = 300). Of 1896 (81.8%) participants who were alive and in follow-up at trial closure (median 5.1 years after therapy initiation), 1507 (79.5%) were on first-line and 389 (20.5%) on second-line antiretroviral therapy. The overall switch rate after the first year was 5.6 per 100 person-years; the rate was substantially higher in participants with low baseline CD4 counts (<50 cells/mm3). Among 1207 (80.1%) first-line participants with viral load measured, HIV RNA was <400 copies/ml in 963 (79.8%), 400-999 copies/ml in 37 (3.1%), 1,000-9,999 copies/ml in 110 (9.1%), and ≥10,000 copies/ml in 97 (8.0%). The proportion with HIV RNA <400 copies/ml was slightly lower (difference 7.1%, 95% CI 2.5 to 11.5%) in CDM (76.3%) than in LCM (83.4%). Among 252 (64.8%) second-line participants with viral load measured (median 2.3 years after switch), HIV RNA was <400 copies/ml in 226 (89.7%), with no difference between monitoring strategies. Low switch rates and high, sustained levels of viral suppression are achievable without viral load or CD4 count monitoring in the context of high-quality clinical care. TRIAL REGISTRATION: ISRCTN13968779.

Votintseva AA, Fung R, Miller RR, Knox K, Godwin H, Wyllie DH, Bowden R, Crook DW, Walker AS. 2014. Prevalence of Staphylococcus aureus protein A (spa) mutants in the community and hospitals in Oxfordshire. BMC Microbiol, 14 (1), pp. 63.

BACKGROUND: Staphylococcal protein A (spa) is an important virulence factor which enables Staphylococcus aureus to evade host immune responses. Genotypes known as "spa-types", based on highly variable Xr region sequences of the spa-gene, are frequently used to classify strains. A weakness of current spa-typing primers is that rearrangements in the IgG-binding region of the gene cause 1-2% of strains to be designated as "non-typeable". RESULTS: We developed an improved primer which enabled sequencing of all strains, containing any type of genetic rearrangement, in a large study among community carriers and hospital inpatients in Oxfordshire, UK (6110 isolates). We identified eight novel spa-gene variants, plus one previously described. Three of these rearrangements would be designated "non-typeable" using current spa-typing methods; they occurred in 1.8% (72/3905) asymptomatically carried and 0.6% (14/2205) inpatient S. aureus strains. Some individuals were simultaneously colonized by both formerly non-typeable and typeable strains; previously such patients would have been identified as carrying only currently typeable strains, underestimating mixed carriage prevalence and diversity. Formerly non-typeable strains were found in more spa-types associated with multilocus sequence type ST398 (35%), common among livestock, compared to other groups with any non-typeable strains (1-4%), suggesting particular spa-types may have been under-represented in previous human studies. CONCLUSIONS: This improved method allows us to spa-type previously non-typeable strains with rearrangements in the spa-gene and to resolve cases of mixed colonization with deletions in one or more strains, thus accounting for hidden diversity of S. aureus in both community and hospital environments.

Price JR, Golubchik T, Cole K, Wilson DJ, Crook DW, Thwaites GE, Bowden R, Walker AS, Peto TEA, Paul J, Llewelyn MJ. 2014. Whole-genome sequencing shows that patient-to-patient transmission rarely accounts for acquisition of Staphylococcus aureus in an intensive care unit. Clin Infect Dis, 58 (5), pp. 609-618.

BACKGROUND:  Strategies to prevent Staphylococcus aureus infection in hospitals focus on patient-to-patient transmission. We used whole-genome sequencing to investigate the role of colonized patients as the source of new S. aureus acquisitions, and the reliability of identifying patient-to-patient transmission using the conventional approach of spa typing and overlapping patient stay. METHODS: Over 14 months, all unselected patients admitted to an adult intensive care unit (ICU) were serially screened for S. aureus. All available isolates (n = 275) were spa typed and underwent whole-genome sequencing to investigate their relatedness at high resolution. RESULTS: Staphylococcus aureus was carried by 185 of 1109 patients sampled within 24 hours of ICU admission (16.7%); 59 (5.3%) patients carried methicillin-resistant S. aureus (MRSA). Forty-four S. aureus (22 MRSA) acquisitions while on ICU were detected. Isolates were available for genetic analysis from 37 acquisitions. Whole-genome sequencing indicated that 7 of these 37 (18.9%) were transmissions from other colonized patients. Conventional methods (spa typing combined with overlapping patient stay) falsely identified 3 patient-to-patient transmissions (all MRSA) and failed to detect 2 acquisitions and 4 transmissions (2 MRSA). CONCLUSIONS: Only a minority of S. aureus acquisitions can be explained by patient-to-patient transmission. Whole-genome sequencing provides the resolution to disprove transmission events indicated by conventional methods and also to reveal otherwise unsuspected transmission events. Whole-genome sequencing should replace conventional methods for detection of nosocomial S. aureus transmission.

Ewings FM, Ford D, Walker AS, Carpenter J, Copas A. 2014. Optimal CD4 count for initiating HIV treatment: impact of CD4 observation frequency and grace periods, and performance of dynamic marginal structural models. Epidemiology, 25 (2), pp. 194-202.

BACKGROUND: In HIV infection, dynamic marginal structural models have estimated the optimal CD4 for treatment initiation to minimize AIDS/death. The impact of CD4 observation frequency and grace periods (permitted delay to initiation) on the optimal regimen has not been investigated, nor has the performance of dynamic marginal structural models in moderately sized data sets, two issues that are relevant to many applications. METHODS: To determine optimal regimens, we simulated 31,000,000 HIV-infected persons randomized at CD4 500-550 cells/mm(3) to regimens "initiate treatment within a grace period following observed CD4 first <x cells/mm(3)," x = 200, 210, …, 500. Natural history and treatment response were simulated using previous model estimates from CASCADE data. Optimal treatment regimens for the observation frequencies and grace periods were defined by highest 10-year AIDS-free survival. To evaluate the performance of dynamic marginal structural models, we simulated 1000 observational studies (n = 3,000) with CD4-dependent treatment initiation. RESULTS: Decreasing the frequency of CD4 measurements from monthly to every 3, 6, and 12 months increased the optimal regimen from a CD4 level of 350 (10-year AIDS-free survival, 0.8657) to 410 (0.8650), 460 (0.8634), and 490 (0.8564), respectively. Under a regimen defined by x = 350 with annual CD4s, 10-year AIDS-free survival dropped to 0.8304. Extending the grace period from 1 to 3 or 6 months, with 3-monthly CD4s, maintained the optimal regimen at 410 for 3 months and increased it to 460 for 6 months. In observational studies with 3-monthly CD4s, the mean (SE) estimated optimal regimen was 402 (76), 424 (66), and 430 (63) with 1-, 3-, and 6-month grace periods; 24%, 15%, and 14% of estimated optimal regimens resulted in >0.5% lower AIDS-free survival compared with the true optimal regimen. CONCLUSIONS: The optimal regimen is strongly influenced by CD4 frequency and less by grace period length.
Dynamic marginal structural models lack precision at moderate sample sizes.

Votintseva AA, Miller RR, Fung R, Knox K, Godwin H, Peto TEA, Crook DW, Bowden R, Walker AS. 2014. Multiple-strain colonization in nasal carriers of Staphylococcus aureus. J Clin Microbiol, 52 (4), pp. 1192-1200.

Staphylococcus aureus is a commensal that can also cause invasive infection. Reports suggest that nasal cocolonization occurs rarely, but the resources required to sequence multiple colonies have precluded its large-scale investigation. A staged protocol was developed to maximize detection of mixed-spa-type colonization while minimizing laboratory resources using 3,197 S. aureus-positive samples from a longitudinal study of healthy individuals in Oxfordshire, United Kingdom. Initial typing of pooled material from each sample identified a single unambiguous strain in 89.6% of samples. Twelve single-colony isolates were typed from samples producing ambiguous initial results. All samples could be resolved into one or more spa types using the protocol. Cocolonization point prevalence was 3.4 to 5.8% over 24 months of follow-up in 360 recruitment-positives. However, 18% were cocolonized at least once, most only transiently. Cocolonizing spa types were completely unrelated in 56% of samples. Of 272 recruitment-positives returning ≥12 swabs, 166 (61%) carried S. aureus continuously but only 106 (39%) carried the same single spa type without any cocolonization; 31 (11%) switched spa type and 29 (11%) had transient cocarriage. S. aureus colonization is dynamic even in long-term carriers. New unrelated cocolonizing strains could increase invasive disease risk, and ongoing within-host evolution could increase invasive potential, possibilities that future studies should explore.

Gordon NC, Price JR, Cole K, Everitt R, Morgan M, Finney J, Kearns AM, Pichon B, Young B, Wilson DJ et al. 2014. Prediction of Staphylococcus aureus antimicrobial resistance by whole-genome sequencing. J Clin Microbiol, 52 (4), pp. 1182-1191.

Whole-genome sequencing (WGS) could potentially provide a single platform for extracting all the information required to predict an organism's phenotype. However, its ability to provide accurate predictions has not yet been demonstrated in large independent studies of specific organisms. In this study, we aimed to develop a genotypic prediction method for antimicrobial susceptibilities. The whole genomes of 501 unrelated Staphylococcus aureus isolates were sequenced, and the assembled genomes were interrogated using BLASTn for a panel of known resistance determinants (chromosomal mutations and genes carried on plasmids). Results were compared with phenotypic susceptibility testing for 12 commonly used antimicrobial agents (penicillin, methicillin, erythromycin, clindamycin, tetracycline, ciprofloxacin, vancomycin, trimethoprim, gentamicin, fusidic acid, rifampin, and mupirocin) performed by the routine clinical laboratory. We investigated discrepancies by repeat susceptibility testing and manual inspection of the sequences and used this information to optimize the resistance determinant panel and BLASTn algorithm. We then tested performance of the optimized tool in an independent validation set of 491 unrelated isolates, with phenotypic results obtained in duplicate by automated broth dilution (BD Phoenix) and disc diffusion. In the validation set, the overall sensitivity and specificity of the genomic prediction method were 0.97 (95% confidence interval [95% CI], 0.95 to 0.98) and 0.99 (95% CI, 0.99 to 1), respectively, compared to standard susceptibility testing methods. The very major error rate was 0.5%, and the major error rate was 0.7%. WGS was as sensitive and specific as routine antimicrobial susceptibility testing methods. WGS is a promising alternative to culture methods for resistance prediction in S. aureus and ultimately other major bacterial pathogens.
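The sensitivity and specificity reported above are simple functions of the confusion-matrix counts. The sketch below is a generic Python illustration of those definitions; the function name and the example counts are hypothetical, not the study's analysis code or data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP).

    In resistance prediction a 'positive' is a phenotypically resistant
    isolate, so a very major error (resistant isolate called susceptible)
    is a false negative, and a major error (susceptible isolate called
    resistant) is a false positive.
    """
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only (not taken from the study):
sens, spec = sensitivity_specificity(tp=97, fn=3, tn=99, fp=1)
```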

Eyre DW, Wilcox MH, Walker AS. 2014. Diverse sources of C. difficile infection. N Engl J Med, 370 (2), pp. 183-184.

Bwakura-Dangarembizi M, Kendall L, Bakeera-Kitaka S, Nahirya-Ntege P, Keishanyu R, Nathoo K, Spyer MJ, Kekitiinwa A, Lutaakome J, Mhute T et al. 2014. A randomized trial of prolonged co-trimoxazole in HIV-infected children in Africa. N Engl J Med, 370 (1), pp. 41-53.

BACKGROUND: Co-trimoxazole (fixed-dose trimethoprim-sulfamethoxazole) prophylaxis administered before antiretroviral therapy (ART) reduces morbidity in children infected with the human immunodeficiency virus (HIV). We investigated whether children and adolescents receiving long-term ART in sub-Saharan Africa could discontinue co-trimoxazole. METHODS: We conducted a randomized, noninferiority trial of stopping versus continuing daily open-label co-trimoxazole in children and adolescents in Uganda and Zimbabwe. Eligible participants were older than 3 years of age, had been receiving ART for more than 96 weeks, were using insecticide-treated bed nets (in malaria-endemic areas), and had not had Pneumocystis jirovecii pneumonia. Coprimary end points were hospitalization or death and adverse events of grade 3 or 4. RESULTS: A total of 758 participants were randomly assigned to stop or continue co-trimoxazole (382 and 376 participants, respectively), after receiving ART for a median of 2.1 years (interquartile range, 1.8 to 2.3). The median age was 7.9 years (interquartile range, 4.6 to 11.1), and the median CD4 T-cell percentage was 33% (interquartile range, 26 to 39). Participants who stopped co-trimoxazole had higher rates of hospitalization or death than those who continued (72 participants [19%] vs. 48 [13%]; hazard ratio, 1.64; 95% confidence interval [CI], 1.14 to 2.37; P = 0.007; noninferiority not shown). There was no evidence of variation across ages (P=0.93 for interaction). A total of 2 participants in the prophylaxis-stopped group (1%) died, as did 3 in the prophylaxis-continued group (1%). Most hospitalizations in the prophylaxis-stopped group were for malaria (49 events, vs. 21 in the prophylaxis-continued group) or infections other than malaria (53 vs. 25), particularly pneumonia, sepsis, and meningitis. 
Rates of adverse events of grade 3 or 4 were similar in the two groups (hazard ratio, 1.20; 95% CI, 0.83 to 1.72; P=0.33), but more grade 4 adverse events occurred in the prophylaxis-stopped group (hazard ratio, 2.04; 95% CI, 0.99 to 4.22; P=0.05), with anemia accounting for the largest number of events (12, vs. 2 with continued prophylaxis). CONCLUSIONS: Continuing co-trimoxazole prophylaxis after 96 weeks of ART was beneficial, as compared with stopping prophylaxis, with fewer hospitalizations for both malaria and infection not related to malaria. (Funded by the United Kingdom Medical Research Council and others; ARROW Current Controlled Trials number, ISRCTN24791884.).

Musiime V, Cook A, Kayiwa J, Zangata D, Nansubuga C, Arach B, Kenny J, Wavamunno P, Komunyena J, Kabamba D et al. 2014. Anthropometric measurements and lipid profiles to detect early lipodystrophy in antiretroviral therapy experienced HIV-infected children in the CHAPAS-3 trial. Antivir Ther, 19 (3), pp. 269-276.

BACKGROUND: Few studies have investigated objective markers of lipodystrophy in African children. We compared body circumferences, skin-fold thickness (SFT) and lipids in antiretroviral therapy (ART)-naive and stavudine (d4T)-exposed children with HIV-uninfected controls. METHODS: In the CHAPAS-3 trial, HIV-infected children (ART-naive or on d4T for ≥2 years without clinical lipodystrophy) were randomized to d4T, abacavir or zidovudine with lamivudine (3TC) plus a non-nucleoside reverse transcriptase inhibitor. Mid-upper-arm circumference (MUAC) and calf circumference (CC), SFT (biceps, triceps, sub-scapular and supra-iliac) and fasting lipids (total cholesterol [TC], low-density lipoprotein [LDL], high-density lipoprotein [HDL] and triglycerides [TRIG]) were measured at randomization in all HIV-infected children, and in HIV-uninfected controls. Age- and sex-adjusted z-scores of MUAC, CC, SFT and the sum of SFT (SSF) using Dutch reference data were compared across groups using linear regression. RESULTS: Of 496 children, 49% were male, 299 (median age 2.5 years [IQR 1.5-4.0]) were ART-naive, 109 (median age 6 years [IQR 5.5-7.0]) were ART-experienced and 88 (median age 2.2 years [IQR 1.5-3.0]) were control children. Overall, 100% and 95% of ART-experienced children had been on d4T plus 3TC and nevirapine, respectively, for a median 3.5 years (IQR 2.6-4.2). Mean (sd) weight-for-age z-scores and MUAC z-scores were -1.51 (1.29) versus -0.90 (0.88) versus -0.33 (1.15) and -1.56 (1.25) versus -1.24 (0.97) versus -0.65 (1.06) in ART-naive versus -experienced versus controls, respectively (all P<0.02). The mean (sd) of SSF was lower in the ART-experienced (-0.78 [1.28]) than in the ART-naive (-0.32 [1.09]; P<0.0001) children and controls (-0.29 [0.88]; P<0.002). ART-experienced children had higher mean fasting TC, LDL and HDL but lower TRIG compared to ART-naive children (P-values <0.0001), and higher TC and HDL but lower TRIG compared to controls (P-values <0.01). 
CONCLUSIONS: In ART-experienced children on d4T-containing regimens, we observed lower SFT and higher TC and LDL values compared to ART-naive children and HIV-uninfected controls.

Dingle KE, Elliott B, Robinson E, Griffiths D, Eyre DW, Stoesser N, Vaughan A, Golubchik T, Fawley WN, Wilcox MH et al. 2014. Evolutionary history of the Clostridium difficile pathogenicity locus. Genome Biol Evol, 6 (1), pp. 36-52.

The symptoms of Clostridium difficile infection are caused by toxins expressed from its 19 kb pathogenicity locus (PaLoc). Stable integration of the PaLoc is suggested by its single chromosomal location and the clade specificity of its different genetic variants. However, the PaLoc is variably present, even among closely related strains, and thus resembles a mobile genetic element. Our aim was to explain these apparently conflicting observations by reconstructing the evolutionary history of the PaLoc. Phylogenetic analyses and annotation of the regions spanning the PaLoc were performed using C. difficile population-representative genomes chosen from a collection of 1,693 toxigenic (PaLoc present) and nontoxigenic (PaLoc absent) isolates. Comparison of the core genome and PaLoc phylogenies demonstrated an eventful evolutionary history, with distinct PaLoc variants acquired clade specifically after divergence. In particular, our data suggest a relatively recent PaLoc acquisition in clade 4. Exchanges and losses of the PaLoc DNA have also occurred, via long homologous recombination events involving flanking chromosomal sequences. The most recent loss event occurred ∼30 years ago within a clade 1 genotype. The genetic organization of the clade 3 PaLoc was unique in containing a stably integrated novel transposon (designated Tn6218), variants of which were found at multiple chromosomal locations. Tn6218 elements were Tn916-related but nonconjugative and occasionally contained genes conferring resistance to clinically relevant antibiotics. The evolutionary histories of two contrasting but clinically important genetic elements were thus characterized: the PaLoc, mobilized rarely via homologous recombination, and Tn6218, mobilized frequently through transposition.

Eyre DW, Griffiths D, Vaughan A, Golubchik T, Acharya M, O'Connor L, Crook DW, Walker AS, Peto TEA. 2013. Asymptomatic Clostridium difficile colonisation and onward transmission. PLoS One, 8 (11), pp. e78445.

INTRODUCTION: Combined genotyping/whole genome sequencing and epidemiological data suggest that in endemic settings only a minority of Clostridium difficile infection, CDI, is acquired from other cases. Asymptomatic patients are a potential source for many unexplained cases. METHODS: We prospectively screened a cohort of medical inpatients in a UK teaching hospital for asymptomatic C. difficile carriage using stool culture. Electronic and questionnaire data were used to determine risk factors for asymptomatic carriage by logistic regression. Carriage isolates were compared with all hospital/community CDI cases from the same geographic region, from 12 months before the study to 3 months after, using whole genome sequencing and hospital admission data, assessing particularly for evidence of onward transmission from asymptomatic cases. RESULTS: Of 227 participants recruited, 132 provided ≥1 stool samples for testing. 18 participants were culture-positive for C. difficile, 14/132(11%) on their first sample. Independent risk factors for asymptomatic carriage were patient reported loose/frequent stool (but not meeting CDI criteria of ≥3 unformed stools in 24 hours), previous overnight hospital stay within 6 months, and steroid/immunosuppressant medication in the last 6 months (all p≤0.02). Surprisingly antibiotic exposure in the last 6 months was independently associated with decreased risk of carriage (p = 0.005). The same risk factors were identified excluding participants reporting frequent/loose stool. 13/18(72%) asymptomatically colonised patients carried toxigenic strains from common disease-causing lineages found in cases. Several plausible transmission events to asymptomatic carriers were identified, but in this relatively small study no clear evidence of onward transmission from an asymptomatic case was seen. 
CONCLUSIONS: Transmission events from any one asymptomatic carrier are likely to be relatively rare, but as asymptomatic carriage is common, it may still be an important source of CDI, which could be quantified in larger studies. Risk factors established for asymptomatic carriage may help identify patients for inclusion in such studies.

Eyre DW, Babakhani F, Griffiths D, Seddon J, Del Ojo Elias C, Gorbach SL, Peto TEA, Crook DW, Walker AS. 2014. Whole-genome sequencing demonstrates that fidaxomicin is superior to vancomycin for preventing reinfection and relapse of infection with Clostridium difficile. J Infect Dis, 209 (9), pp. 1446-1451.

Whole-genome sequencing was used to determine whether the reductions in recurrence of Clostridium difficile infection observed with fidaxomicin in pivotal phase 3 trials occurred by preventing relapse of the same infection, by preventing reinfection with a new strain, or by preventing both outcomes. Paired isolates of C. difficile were available from 93 of 199 participants with recurrences (28 were treated with fidaxomicin, and 65 were treated with vancomycin). Given C. difficile evolutionary rates, paired samples ≤2 single-nucleotide variants (SNVs) apart were considered relapses, paired samples >10 SNVs apart were considered reinfection, and those 3-10 SNVs apart (or without whole-genome sequences) were considered indeterminate in a competing risks survival analysis. Fidaxomicin reduced the risk of both relapse (competing risks hazard ratio [HR], 0.40 [95% confidence interval {CI}, .25-.66]; P = .0003) and reinfection (competing risks HR, 0.33 [95% CI, 0.11-1.01]; P = .05).
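The SNV cut-offs above amount to a small classification rule. A minimal Python sketch, assuming only the thresholds stated in the abstract (the function name and interface are illustrative, not the study's analysis code):

```python
def classify_recurrence(snv_distance):
    """Classify a paired recurrence isolate by SNV distance.

    Rule as described in the abstract: <=2 SNVs between paired isolates
    is a relapse of the same strain, >10 SNVs is reinfection with a new
    strain, and 3-10 SNVs (or a missing sequence, passed as None) is
    indeterminate.
    """
    if snv_distance is None:
        return "indeterminate"
    if snv_distance <= 2:
        return "relapse"
    if snv_distance > 10:
        return "reinfection"
    return "indeterminate"
```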

Eyre DW, Walker AS. 2013. Clostridium difficile surveillance: harnessing new technologies to control transmission. Expert Rev Anti Infect Ther, 11 (11), pp. 1193-1205.

Clostridium difficile surveillance allows outbreaks of cases clustered in time and space to be identified and further transmission prevented. Traditionally, manual detection of groups of cases diagnosed in the same ward or hospital, often followed by retrospective reference laboratory genotyping, has been used to identify outbreaks. However, integrated healthcare databases offer the prospect of automated real-time outbreak detection based on statistically robust methods, and accounting for contacts between cases, including those distant to the ward of diagnosis. Complementary to this, rapid benchtop whole genome sequencing, and other highly discriminatory genotyping, has the potential to distinguish which cases are part of an outbreak with high precision and in clinically relevant timescales. These new technologies are likely to shape future surveillance.

Fillekes Q, Muro EP, Chunda C, Aitken S, Kisanga ER, Kankasa C, Thomason MJ, Gibb DM, Walker AS, Burger DM. 2013. Effect of 7 days of phenytoin on the pharmacokinetics of and the development of resistance to single-dose nevirapine for perinatal HIV prevention: a randomized pilot trial. J Antimicrob Chemother, 68 (11), pp. 2609-2615.

OBJECTIVES: To confirm whether 7 days of phenytoin, an enzyme inducer, would decrease the elimination half-life of single-dose nevirapine and to investigate its effect on the development of nevirapine resistance in pregnant, HIV-infected women. METHODS: In a pharmacokinetic pilot trial (NCT01187719), HIV-infected, antiretroviral (ARV)-naive pregnant women ≥18 years old from Zambia and Tanzania and with CD4 cell counts >350 cells/mm(3) were randomized 1 : 1 to a control (zidovudine pre-delivery, single-dose nevirapine/zidovudine/lamivudine at delivery and zidovudine/lamivudine for 7 days post-delivery) or an intervention (control plus 184 mg of phenytoin once daily for 7 days post-delivery) group. Primary endpoints were the pharmacokinetics of and resistance to nevirapine. RESULTS: Thirty-five and 37 women were allocated to the control and intervention groups, with median (IQR) ages of 27 (23-31) and 27 (23-33) years, respectively. Twenty-three and 23 women had detectable nevirapine levels at delivery and subsequent samples in the control and the intervention groups, respectively. Geometric mean (GM) (95% CI) plasma levels of nevirapine at delivery were 1.02 (0.58-1.78) mg/L and 1.14 (0.70-1.86) mg/L in the control and intervention groups, respectively (P = 0.76). One week after delivery, 0/23 (0%) and 15/22 (68%) control and intervention mothers, respectively, had undetectable levels of nevirapine (<0.05 mg/L; P<0.001). One week later, the figures were 10/21 (48%) and 18/19 (95%) mothers, respectively (P = 0.002). The GM (95% CI) half-life of nevirapine was 63.2 (52.8-75.7) versus 25.5 (21.6-30.1) h in the control group versus the intervention group (P < 0.001). New nevirapine mutations were found in 0/20 (0%) intervention-group mothers versus 1/21 (5%) control-group mothers. Overall, there was no difference in adverse events reported between the control and intervention arms (P > 0.28). 
CONCLUSIONS: Adding 7 days of an enzyme inducer to single-dose nevirapine to prevent mother-to-child transmission of HIV significantly reduced subtherapeutic nevirapine levels by shortening the half-life of nevirapine. As prolonged subtherapeutic nevirapine dosage leads to the emergence of resistance, single-dose nevirapine could be used with phenytoin as an alternative if other ARVs were unavailable.

Klein N, Sefe D, Mosconi I, Zanchetta M, Castro H, Jacobsen M, Jones H, Bernardi S, Pillay D, Giaquinto C et al. 2013. The immunological and virological consequences of planned treatment interruptions in children with HIV infection. PLoS One, 8 (10), pp. e76582.

OBJECTIVES: To evaluate the immunological and viral consequences of planned treatment interruptions (PTI) in children with HIV. DESIGN: This was an immunological and virological sub-study of the Paediatric European Network for Treatment of AIDS (PENTA) 11 trial, which compared CD4-guided PTI of antiretroviral therapy (ART) with continuous therapy (CT) in children. METHODS: HIV-1 RNA and lymphocyte subsets, including CD4 and CD8 cells, were quantified on fresh samples collected during the study; CD45RA, CD45RO and CD31 subpopulations were evaluated in some centres. For 36 (18 PTI, 18 CT) children, immunophenotyping was performed and cell-associated HIV-1 DNA analysed on stored samples to 48 weeks. RESULTS: In the PTI group, CD4 cell count fell rapidly in the first 12 weeks off ART, with decreases in both naïve and memory cells. However, the proportion of CD4 cells expressing CD45RA and CD45RO remained constant in both groups. The increase in CD8 cells in the first 12 weeks off ART in the PTI group was predominantly due to increases in RO-expressing cells. PTI was associated with a rapid and sustained increase in CD4 cells expressing Ki67 and HLA-DR, and increased levels of HIV-1 DNA. CONCLUSIONS: PTI in children is associated with rapid changes in CD4 and CD8 cells, likely due to increased cell turnover and immune activation. However, children off treatment may be able to maintain stable levels of naïve CD4 cells, at least in proportion to the memory cell pool, which may in part explain the observed excellent CD4 cell recovery with re-introduction of ART.

Eyre DW, Fawley WN, Best EL, Griffiths D, Stoesser NE, Crook DW, Peto TEA, Walker AS, Wilcox MH. 2013. Comparison of multilocus variable-number tandem-repeat analysis and whole-genome sequencing for investigation of Clostridium difficile transmission. J Clin Microbiol, 51 (12), pp. 4141-4149.

No study to date has compared multilocus variable-number tandem-repeat analysis (MLVA) and whole-genome sequencing (WGS) in an investigation of the transmission of Clostridium difficile infection. Isolates from 61 adults with ongoing and/or recurrent C. difficile infections and 17 asymptomatic carriage episodes in children (201 samples), as well as from 61 suspected outbreaks affecting 2 to 41 patients in 31 hospitals in the United Kingdom (300 samples), underwent 7-locus MLVA and WGS in parallel. When the first and last samples from the same individual taken for a median (interquartile range [IQR]) of 63 days (43 to 105 days) apart were compared, the estimated rates of the evolution of single nucleotide variants (SNVs), summed tandem-repeat differences (STRDs), and locus variants (LVs) were 0.79 (95% confidence interval [CI], 0.00 to 1.75), 1.63 (95% CI, 0.00 to 3.59), and 1.21 (95% CI, 0.00 to 2.67)/called genome/year, respectively. Differences of >2 SNVs and >10 STRDs have been used to exclude direct case-to-case transmission. With the first serial sample per individual being used to assess discriminatory power, across all pairs of samples sharing a PCR ribotype, 192/283 (68%) differed by >10 STRDs and 217/283 (77%) by >2 SNVs. Among all pairs of cases from the same suspected outbreak, 1,190/1,488 (80%) pairs had concordant results using >2 SNVs and >10 STRDs to exclude transmission. For the discordant pairs, 229 (15%) had ≥2 SNVs but ≤10 STRDs, and 69 (5%) had ≤2 SNVs but ≥10 STRDs. Discordant pairs had higher numbers of LVs than concordant pairs, supporting the more diverse measure in each type of discordant pair. Conclusions on whether the potential outbreaks were confirmed were concordant in 58/61 (95%) investigations. Overall findings using MLVA and WGS were very similar despite the fact that they analyzed different parts of the bacterial genome. 
With improvements in WGS technology, it is likely that MLVA locus data will be available from WGS in the near future.
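The concordance analysis above compares two exclusion rules applied to each pair of isolates. A minimal sketch of that comparison, assuming only the cut-offs stated in the abstract (>2 SNVs for WGS, >10 summed tandem-repeat differences for MLVA); the function names are illustrative:

```python
def excludes_transmission(snvs, strds):
    """Apply the two exclusion cut-offs described in the abstract to one
    pair of isolates: >2 SNVs (WGS) and >10 summed tandem-repeat
    differences (MLVA). Returns (wgs_excludes, mlva_excludes)."""
    return snvs > 2, strds > 10

def concordant(snvs, strds):
    """A pair of isolates is concordant when WGS and MLVA agree on
    whether direct case-to-case transmission is excluded."""
    wgs, mlva = excludes_transmission(snvs, strds)
    return wgs == mlva
```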

Parikh SM, Obuku EA, Walker SA, Semeere AS, Auerbach BJ, Hakim JG, Mayanja-Kizza H, Mugyenyi PN, Salata RA, Kityo CM, DART Trial Team. 2013. Clinical differences between younger and older adults with HIV/AIDS starting antiretroviral therapy in Uganda and Zimbabwe: a secondary analysis of the DART trial. PLoS One, 8 (10), pp. e76158.

OBJECTIVE: Clinical and immunological data about HIV in older adults from low and middle income countries is scarce. We aimed to describe differences between younger and older adults with HIV starting antiretroviral therapy in two low-income African countries. METHODS: SETTING: HIV clinics in Uganda and Zimbabwe. DESIGN: Secondary exploratory cross-sectional analysis of the DART randomized controlled trial. OUTCOME MEASURES: Clinical and laboratory characteristics were compared between adults aged 18-49 years (younger) and ≥ 50 years (older), using two exploratory multivariable logistic regression models, one with HIV viral load (measured in a subset pre-ART) and one without. RESULTS: A total of 3316 eligible participants enrolled in DART were available for analysis; 219 (7%) were ≥ 50 years and 1160 (35%) were male. Across the two adjusted regression models, older adults had significantly higher systolic blood pressure, lower creatinine clearance and were consistently less likely to be females compared to younger adults with HIV. Paradoxically, the models separately suggested that older adults had statistically significant (but not clinically important) higher CD4+ cell counts and higher plasma HIV-1 viral copies at initiation. Crude associations between older age and higher baseline hemoglobin, body mass index, diastolic blood pressure and lower WHO clinical stage were not sustained in the adjusted analysis. CONCLUSIONS: Our study found clinical and immunological differences between younger and older adults, in a cohort of Africans starting antiretroviral therapy. Further investigations should explore how these differences could be used to ensure equity in service delivery and affect outcomes of antiretroviral therapy.

Picat M-Q, Lewis J, Musiime V, Prendergast A, Nathoo K, Kekitiinwa A, Nahirya Ntege P, Gibb DM, Thiebaut R, Walker AS et al. 2013. Predicting patterns of long-term CD4 reconstitution in HIV-infected children starting antiretroviral therapy in sub-Saharan Africa: a cohort-based modelling study. PLoS Med, 10 (10), pp. e1001542.

BACKGROUND: Long-term immune reconstitution on antiretroviral therapy (ART) has important implications for HIV-infected children, who increasingly survive into adulthood. Children's response to ART differs from adults', and better descriptive and predictive models of reconstitution are needed to guide policy and direct research. We present statistical models characterising, qualitatively and quantitatively, patterns of long-term CD4 recovery. METHODS AND FINDINGS: CD4 counts every 12 wk over a median (interquartile range) of 4.0 (3.7, 4.4) y in 1,206 HIV-infected children, aged 0.4-17.6 y, starting ART in the Antiretroviral Research for Watoto trial (ISRCTN 24791884) were analysed in an exploratory analysis supplementary to the trial's pre-specified outcomes. Most (n = 914; 76%) children's CD4 counts rose quickly on ART to a constant age-corrected level. Using nonlinear mixed-effects models, higher long-term CD4 counts were predicted for children starting ART younger, and with higher CD4 counts (p<0.001). These results suggest that current World Health Organization-recommended CD4 thresholds for starting ART in children ≥5 y will result in lower CD4 counts in older children when they become adults, such that vertically infected children who remain ART-naïve beyond 10 y of age are unlikely ever to normalise CD4 count, regardless of CD4 count at ART initiation. CD4 profiles with four qualitatively distinct reconstitution patterns were seen in the remaining 292 (24%) children. Study limitations included incomplete viral load data, and that the uncertainty in allocating children to distinct reconstitution groups was not modelled. 
CONCLUSIONS: Although younger ART-naïve children are at high risk of disease progression, they have good potential for achieving high CD4 counts on ART in later life provided ART is initiated following current World Health Organization (WHO), Paediatric European Network for Treatment of AIDS, or US Centers for Disease Control and Prevention guidelines. In contrast, to maximise CD4 reconstitution in treatment-naïve children >10 y, ART should ideally be considered even if there is a low risk of immediate disease progression. Further exploration of the immunological mechanisms for these CD4 recovery profiles should help guide management of paediatric HIV infection and optimise children's immunological development.

Eyre DW, Cule ML, Wilson DJ, Griffiths D, Vaughan A, O'Connor L, Ip CLC, Golubchik T, Batty EM, Finney JM et al. 2013. Diverse sources of C. difficile infection identified on whole-genome sequencing. N Engl J Med, 369 (13), pp. 1195-1205.

BACKGROUND: It has been thought that Clostridium difficile infection is transmitted predominantly within health care settings. However, endemic spread has hampered identification of precise sources of infection and the assessment of the efficacy of interventions. METHODS: From September 2007 through March 2011, we performed whole-genome sequencing on isolates obtained from all symptomatic patients with C. difficile infection identified in health care settings or in the community in Oxfordshire, United Kingdom. We compared single-nucleotide variants (SNVs) between the isolates, using C. difficile evolution rates estimated on the basis of the first and last samples obtained from each of 145 patients, with 0 to 2 SNVs expected between transmitted isolates obtained less than 124 days apart, on the basis of a 95% prediction interval. We then identified plausible epidemiologic links among genetically related cases from data on hospital admissions and community location. RESULTS: Of 1250 C. difficile cases that were evaluated, 1223 (98%) were successfully sequenced. In a comparison of 957 samples obtained from April 2008 through March 2011 with those obtained from September 2007 onward, a total of 333 isolates (35%) had no more than 2 SNVs from at least 1 earlier case, and 428 isolates (45%) had more than 10 SNVs from all previous cases. Reductions in incidence over time were similar in the two groups, a finding that suggests an effect of interventions targeting the transition from exposure to disease. Of the 333 patients with no more than 2 SNVs (consistent with transmission), 126 patients (38%) had close hospital contact with another patient, and 120 patients (36%) had no hospital or community contact with another patient. Distinct subtypes of infection continued to be identified throughout the study, which suggests a considerable reservoir of C. difficile. CONCLUSIONS: Over a 3-year period, 45% of C. 
difficile cases in Oxfordshire were genetically distinct from all previous cases. Genetically diverse sources, in addition to symptomatic patients, play a major part in C. difficile transmission. (Funded by the U.K. Clinical Research Collaboration Translational Infection Research Initiative and others.).
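As an illustrative aside, the SNV thresholds used in this study (no more than 2 SNVs to an earlier case consistent with transmission; more than 10 SNVs from all earlier cases genetically distinct) can be sketched as a simple classifier. The function and labels below are hypothetical, not the study's code:

```python
# Hypothetical sketch of the SNV-distance rule from the abstract above:
# <=2 SNVs from an earlier case is consistent with transmission; >10 SNVs
# from all earlier cases marks a genetically distinct source.
def classify_isolate(snv_distances):
    """Classify a new isolate by its minimum SNV distance to earlier cases."""
    if not snv_distances:
        return "no earlier cases"
    nearest = min(snv_distances)
    if nearest <= 2:
        return "plausible transmission"
    if nearest > 10:
        return "genetically distinct"
    return "indeterminate (3-10 SNVs)"
```

Applied to each case in temporal order, tallying the two outer labels gives the style of summary reported above (35% with no more than 2 SNVs, 45% with more than 10).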

Kiwuwa-Muyingo S, Oja H, Walker A, Ilmonen P, Levin J, Mambule, Reid A, Mugyenyi P, Todd J, DART Trial team. 2013. Dynamic logistic regression model and population attributable fraction to investigate the association between adherence, missed visits and mortality: a study of HIV-infected adults surviving the first year of ART. BMC Infect Dis, 13 (1), pp. 395.

BACKGROUND: Adherence is one of the most important determinants of viral suppression and drug resistance in HIV-infected people receiving antiretroviral therapy (ART). METHODS: We examined the association between long-term mortality and poor adherence to ART in DART trial participants in Uganda and Zimbabwe randomly assigned to receive laboratory and clinical monitoring (LCM), or clinically driven monitoring (CDM). Since over 50% of all deaths in the DART trial occurred during the first year on ART, we focussed on participants continuing ART for 12 months to investigate the implications of longer-term adherence to treatment on mortality. Participants' ART adherence was assessed by pill counts and structured questionnaires at 4-weekly clinic visits. We studied the effect of recent adherence history on the risk of death at the individual level (odds ratios from dynamic logistic regression model), and on mortality at the population level (population attributable fraction based on this model). Analyses were conducted separately for both randomization groups, adjusted for relevant confounding factors. Adherence behaviour was also confounded by a partial factorial randomization comparing structured treatment interruptions (STI) with continuous ART (CT). RESULTS: In the CDM arm a significant association was found between poor adherence to ART in the previous 3-9 months and increased mortality risk. In the LCM arm the association was not significant. The odds ratio for mortality in participants with poor adherence versus those with optimal adherence was 1.30 (95% CI 0.78,2.10) in the LCM arm and 2.18 (1.47,3.22) in the CDM arm. The estimated proportions of deaths that could have been avoided with optimal adherence (population attributable fraction) in the LCM and CDM groups during the 5-year follow-up period were 16.0% (95% CI 0.7%,31.6%) and 33.1% (20.5%,44.8%), respectively.
CONCLUSIONS: Recurrent poor adherence, even when determined through simple measures, is associated with high mortality both at the individual level and at the ART programme level. The number of lives saved through effective interventions to improve adherence could be considerable, particularly for individuals monitored without using CD4 cell counts. The findings have important implications for clinical practice and for developing interventions to enhance adherence.
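For readers unfamiliar with population attributable fractions: the paper's estimates come from a dynamic logistic regression, but the underlying idea can be illustrated with the textbook (Levin) formula. The sketch below is a simplified illustration with made-up inputs, not the study's model:

```python
def levin_paf(p_exposed, relative_risk):
    """Levin's population attributable fraction:
    PAF = p(RR - 1) / (1 + p(RR - 1)),
    the proportion of events avoidable if the exposure were removed."""
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: an assumed 30% prevalence of poor adherence, with the
# CDM-arm odds ratio of 2.18 standing in as an approximate relative risk.
paf = levin_paf(0.30, 2.18)
```

With these hypothetical inputs the PAF is roughly a quarter of deaths, the same order of magnitude as the model-based CDM estimate quoted above.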

Walker AS, Eyre DW, Crook DW, Wilcox MH, Peto TEA. 2013. Regarding "Clostridium difficile ribotype does not predict severe infection". Clin Infect Dis, 56 (12), pp. 1845-1846.

Batty EM, Wong THN, Trebes A, Argoud K, Attar M, Buck D, Ip CLC, Golubchik T, Cule M, Bowden R et al. 2013. A modified RNA-Seq approach for whole genome sequencing of RNA viruses from faecal and blood samples. PLoS One, 8 (6), pp. e66129.

To date, very large scale sequencing of many clinically important RNA viruses has been complicated by their high population molecular variation, which creates challenges for polymerase chain reaction and sequencing primer design. Many RNA viruses are also difficult or currently not possible to culture, severely limiting the amount and purity of available starting material. Here, we describe a simple, novel, high-throughput approach to Norovirus and Hepatitis C virus whole genome sequence determination based on RNA shotgun sequencing (also known as RNA-Seq). We demonstrate the effectiveness of this method by sequencing three Norovirus samples from faeces and two Hepatitis C virus samples from blood, on an Illumina MiSeq benchtop sequencer. More than 97% of reference genomes were recovered. Compared with Sanger sequencing, our method had no nucleotide differences in 14,019 nucleotides (nt) for Noroviruses (from a total of 2 Norovirus genomes obtained with Sanger sequencing), and 8 variants in 9,542 nt for Hepatitis C virus (1 variant per 1,193 nt). The three Norovirus samples had 2, 3, and 2 distinct positions called as heterozygous, while the two Hepatitis C virus samples had 117 and 131 positions called as heterozygous. To confirm that our sample and library preparation could be scaled to true high-throughput, we prepared and sequenced an additional 77 Norovirus samples in a single batch on an Illumina HiSeq 2000 sequencer, recovering >90% of the reference genome in all but one sample. No discrepancies were observed across 118,757 nt compared between Sanger and our custom RNA-Seq method in 16 samples. By generating viral genomic sequences that are not biased by primer-specific amplification or enrichment, this method offers the prospect of large-scale, affordable studies of RNA viruses which could be adapted to routine diagnostic laboratory workflows in the near future, with the potential to directly characterize within-host viral diversity.

Stoesser N, Batty EM, Eyre DW, Morgan M, Wyllie DH, Del Ojo Elias C, Johnson JR, Walker AS, Peto TEA, Crook DW. 2013. Predicting antimicrobial susceptibilities for Escherichia coli and Klebsiella pneumoniae isolates using whole genomic sequence data. J Antimicrob Chemother, 68 (10), pp. 2234-2244.

OBJECTIVES: Whole-genome sequencing potentially represents a single, rapid and cost-effective approach to defining resistance mechanisms and predicting phenotype, and strain type, for both clinical and epidemiological purposes. This retrospective study aimed to determine the efficacy of whole genome-based antimicrobial resistance prediction in clinical isolates of Escherichia coli and Klebsiella pneumoniae. METHODS: Seventy-four E. coli and 69 K. pneumoniae bacteraemia isolates from Oxfordshire, UK, were sequenced (Illumina HiSeq 2000). Resistance phenotypes were predicted from genomic sequences using BLASTn-based comparisons of de novo-assembled contigs with a study database of >100 known resistance-associated loci, including plasmid-associated and chromosomal genes. Predictions were made for seven commonly used antimicrobials: amoxicillin, co-amoxiclav, ceftriaxone, ceftazidime, ciprofloxacin, gentamicin and meropenem. Comparisons were made with phenotypic results obtained in duplicate by broth dilution (BD Phoenix). Discrepancies, either between duplicate BD Phoenix results or between genotype and phenotype, were resolved with gradient diffusion analyses. RESULTS: A wide variety of antimicrobial resistance genes were identified, including blaCTX-M, blaLEN, blaOKP, blaOXA, blaSHV, blaTEM, aac(3')-Ia, aac-(3')-IId, aac-(3')-IIe, aac(6')-Ib-cr, aadA1a, aadA4, aadA5, aadA16, aph(6')-Id, aph(3')-Ia, qnrB and qnrS, as well as resistance-associated mutations in chromosomal gyrA and parC genes. The sensitivity of genome-based resistance prediction across all antibiotics for both species was 0.96 (95% CI: 0.94-0.98) and the specificity was 0.97 (95% CI: 0.95-0.98). Very major and major error rates were 1.2% and 2.1%, respectively. CONCLUSIONS: Our method was as sensitive and specific as routinely deployed phenotypic methods. Validation against larger datasets and formal assessments of cost and turnaround time in a routine laboratory setting are warranted.
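The headline accuracy measures in this abstract are standard 2x2 agreement statistics. A hypothetical sketch (not the study's pipeline; note also that error-rate denominators are taken here over all comparisons, a convention that varies between studies):

```python
def agreement_stats(pairs):
    """pairs: (genotype_call, phenotype_call) tuples, each 'R' or 'S'.
    Very major errors: genotype S / phenotype R; major: genotype R / phenotype S."""
    tp = sum(1 for g, p in pairs if g == "R" and p == "R")
    tn = sum(1 for g, p in pairs if g == "S" and p == "S")
    fn = sum(1 for g, p in pairs if g == "S" and p == "R")  # very major errors
    fp = sum(1 for g, p in pairs if g == "R" and p == "S")  # major errors
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "very_major_error_rate": fn / len(pairs),
        "major_error_rate": fp / len(pairs),
    }
```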

Walker AS, Eyre DW, Crook DW, Peto TEA, Wilcox MH. 2013. Reply to Walk et al. Clin Infect Dis, 57 (4), pp. 626-627.

Eyre DW, Walker AS, Freeman J, Baines SD, Fawley WN, Chilton CH, Griffiths D, Vaughan A, Crook DW, Peto TEA, Wilcox MH. 2013. Short-term genome stability of serial Clostridium difficile ribotype 027 isolates in an experimental gut model and recurrent human disease. PLoS One, 8 (5), pp. e63540.

BACKGROUND: Clostridium difficile whole genome sequencing has the potential to identify related isolates, even among otherwise indistinguishable strains, but interpretation depends on understanding genomic variation within isolates and individuals. METHODS: Serial isolates from two scenarios were whole genome sequenced. Firstly, 62 isolates from 29 timepoints from three in vitro gut models, inoculated with a NAP1/027 strain. Secondly, 122 isolates from 44 patients (2-8 samples/patient) with mostly recurrent/on-going symptomatic NAP1/027 C. difficile infection. Reference-based mapping was used to identify single nucleotide variants (SNVs). RESULTS: Across three gut model inductions, two with antibiotic treatment, total 137 days, only two new SNVs became established. Pre-existing minority SNVs became dominant in two models. Several SNVs were detected, only present in the minority of colonies at one/two timepoints. The median (inter-quartile range) [range] time between patients' first and last samples was 60 (29.5-118.5) [0-561] days. Within-patient C. difficile evolution was 0.45 SNVs/called genome/year (95%CI 0.00-1.28) and within-host diversity was 0.28 SNVs/called genome (0.05-0.53). 26/28 gut model and patient SNVs were non-synonymous, affecting a range of gene targets. CONCLUSIONS: The consistency of whole genome sequencing data from gut model C. difficile isolates, and the high stability of genomic sequences in isolates from patients, support the use of whole genome sequencing in detailed transmission investigations.

Eyre DW, Cule ML, Griffiths D, Crook DW, Peto TEA, Walker AS, Wilson DJ. 2013. Detection of mixed infection from bacterial whole genome sequence data allows assessment of its role in Clostridium difficile transmission. PLoS Comput Biol, 9 (5), pp. e1003059.

Bacterial whole genome sequencing offers the prospect of rapid and high precision investigation of infectious disease outbreaks. Close genetic relationships between microorganisms isolated from different infected cases suggest transmission is a strong possibility, whereas transmission between cases with genetically distinct bacterial isolates can be excluded. However, undetected mixed infections-infection with ≥2 unrelated strains of the same species where only one is sequenced-potentially impairs exclusion of transmission with certainty, and may therefore limit the utility of this technique. We investigated the problem by developing a computationally efficient method for detecting mixed infection without the need for resource-intensive independent sequencing of multiple bacterial colonies. Given the relatively low density of single nucleotide polymorphisms within bacterial sequence data, direct reconstruction of mixed infection haplotypes from current short-read sequence data is not consistently possible. We therefore use a two-step maximum likelihood-based approach, assuming each sample contains up to two infecting strains. We jointly estimate the proportion of the infection arising from the dominant and minor strains, and the sequence divergence between these strains. In cases where mixed infection is confirmed, the dominant and minor haplotypes are then matched to a database of previously sequenced local isolates. We demonstrate the performance of our algorithm with in silico and in vitro mixed infection experiments, and apply it to transmission of an important healthcare-associated pathogen, Clostridium difficile. Using hospital ward movement data in a previously described stochastic transmission model, 15 pairs of cases enriched for likely transmission events associated with mixed infection were selected. 
Our method identified four previously undetected mixed infections, and a previously undetected transmission event, but no direct transmission between the pairs of cases under investigation. These results demonstrate that mixed infections can be detected without additional sequencing effort, and this will be important in assessing the extent of cryptic transmission in our hospitals.
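The core of the two-strain model described above is a likelihood over the minor-strain mixture proportion. The grid-search sketch below is hypothetical (assuming independent binomial read counts at candidate variant sites) and illustrates only the proportion-estimation step, not the joint estimation or haplotype matching in the paper:

```python
import math

def estimate_minor_fraction(site_counts, grid=200):
    """site_counts: (minor_allele_reads, total_reads) at candidate variant sites.
    Maximises the binomial log-likelihood of a minor-strain fraction eps over a
    grid in (0, 0.5); the maximiser is effectively sum(k) / sum(n)."""
    best_eps, best_ll = None, -math.inf
    for i in range(1, grid):
        eps = 0.5 * i / grid
        ll = sum(k * math.log(eps) + (n - k) * math.log(1.0 - eps)
                 for k, n in site_counts)
        if ll > best_ll:
            best_eps, best_ll = eps, ll
    return best_eps
```

For example, minor-allele counts of roughly 20 in 100 reads at each candidate site yield an estimated minor-strain fraction near 0.2.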

ARROW Trial team. 2013. Routine versus clinically driven laboratory monitoring and first-line antiretroviral therapy strategies in African children with HIV (ARROW): a 5-year open-label randomised factorial trial. Lancet, 381 (9875), pp. 1391-1403.

BACKGROUND: No trials have investigated routine laboratory monitoring for children with HIV, nor four-drug induction strategies to increase durability of first-line antiretroviral therapy (ART). METHODS: In this open-label parallel-group trial, Ugandan and Zimbabwean children or adolescents with HIV, aged 3 months to 17 years and eligible for ART, were randomly assigned in a factorial design. Randomisation was to either clinically driven monitoring or routine laboratory and clinical monitoring for toxicity (haematology and biochemistry) and efficacy (CD4 cell counts; non-inferiority monitoring randomisation); and simultaneously to standard three-drug or to four-drug induction first-line ART, in three groups: three-drug treatment (non-nucleoside reverse transcriptase inhibitor [NNRTI], lamivudine, abacavir; group A) versus four-drug induction (NNRTI, lamivudine, abacavir, zidovudine; groups B and C), decreasing after week 36 to three-drug NNRTI, lamivudine, plus abacavir (group B) or lamivudine, abacavir, plus zidovudine (group C; superiority ART-strategy randomisation). For patients assigned to routine laboratory monitoring, results were returned every 12 weeks to clinicians; for clinically driven monitoring, toxicity results were only returned for requested clinical reasons or if grade 4. Children switched to second-line ART for WHO stage 3 or 4 events or (routine laboratory monitoring only) age-dependent WHO CD4 criteria. Randomisation used computer-generated sequentially numbered tables incorporated securely within the database. Primary efficacy endpoints were new WHO stage 4 events or death for monitoring and change in CD4 percentage at 72 and 144 weeks for ART-strategy randomisations; the co-primary toxicity endpoint was grade 3 or 4 adverse events. Analysis was by intention to treat. This trial is registered, ISRCTN24791884. 
FINDINGS: 1206 children were randomly assigned to clinically driven (n=606) versus routine laboratory monitoring (n=600), and groups A (n=397), B (n=404), and C (n=405). 47 (8%) children on clinically driven monitoring versus 39 (7%) on routine laboratory monitoring had a new WHO stage 4 event or died (hazard ratio [HR] 1·13, 95% CI 0·73-1·73, p=0·59; non-inferiority criterion met). However, in years 2-5, rates were higher in children on clinically driven monitoring (1·3 vs 0·4 per 100 child-years, difference 0·99, 0·37-1·60, p=0·002). One or more grade 3 or 4 adverse events occurred in 283 (47%) children on clinically driven versus 282 (47%) on routine laboratory monitoring (HR 0·98, 0·83-1·16, p=0·83). Mean CD4 percentage change did not differ between ART groups at week 72 (16·5% [SD 8·6] vs 17·1% [8·5] vs 17·3% [8·0], p=0·33) or week 144 (p=0·69), but four-drug groups (B, C) were superior to three-drug group A at week 36 (12·4% [7·2] vs 14·1% [7·1] vs 14·6% [7·3], p<0·0001). Excess grade 3 or 4 events in groups B (one or more events reported by 157 [40%] children in A, 190 [47%] in B; HR [B:A] 1·32, 1·07-1·63) and C (218 [54%] children in C; HR [C:A] 1·58, 1·29-1·94; global p=0·0001) were driven by asymptomatic neutropenia in zidovudine-containing groups (B, C; 86 group A, 133 group B, 184 group C), but resulted in drug substitutions in only zero versus two versus four children, respectively. INTERPRETATION: NNRTI plus NRTI-based three-drug or four-drug ART can be given across childhood without routine toxicity monitoring; CD4 monitoring provided clinical benefit after the first year on ART, but event rates were very low and long-term survival high, suggesting ART rollout should take priority. CD4 benefits from four-drug induction were not durable, but three-NRTI long-term maintenance was immunologically and clinically similar to NNRTI-based ART and could be valuable during tuberculosis co-treatment. 
FUNDING: UK Medical Research Council, the UK Department for International Development; drugs donated and viral load assays funded by ViiV Healthcare and GlaxoSmithKline.

Fillekes Q, Mulenga V, Kabamba D, Kankasa C, Thomason MJ, Cook A, Chintu C, Gibb DM, Walker AS, Burger DM, CHAPAS-1 trial team. 2013. Is nevirapine dose-escalation appropriate in young, African, HIV-infected children? AIDS, 27 (13), pp. 2111-2115.

OBJECTIVES: Young children metabolize nevirapine faster than older children/adults. We evaluated nevirapine pharmacokinetics with or without dose-escalation in Zambian, HIV-infected infants/children and its relationship with safety/efficacy. DESIGN: A retrospective pharmacokinetic substudy of the CHAPAS-1 trial. METHODS: HIV-infected, Zambian children were randomized to initiate antiretroviral therapy (ART) with full-dose twice-daily nevirapine versus 2-week nevirapine dose-escalation. Samples taken 3-4 h postmorning-dose 2 weeks after nevirapine initiation were assayed for nevirapine levels. Viral load was measured on available samples at weeks 4 and 48; adverse events were prospectively reported. RESULTS: Of 162 (77%) children with week-2 samples, 79 (49%) were randomized to nevirapine dose-escalation. At ART initiation, median [interquartile range (IQR)] age, weight and CD4% were 5.2 (1.5-8.7) years, 13.0 (8.1-19.0) kg and 13 (8-18)%, respectively; 81 (50%) were male. With full dose, few children aged less than 2 years (3/23, 13%) or more than 2 years (4/60, 7%) had subtherapeutic nevirapine levels (defined as <3.0 mg/l), but with dose-escalation, seven out of 22 (32%) aged less than 2 years versus seven out of 57 (12%) more than 2 years had subtherapeutic nevirapine levels (P=0.05). There was no difference between week-2 nevirapine levels in those with viral load more than 250 versus less than 250 copies/ml at week 4 (P=0.97) or week 48 (P=0.40). Eleven out of 162 children had grade 1/2 rash; all were more than 2 years of age (P=0.04), and 10 were randomized to full dose. CONCLUSION: Subtherapeutic nevirapine levels 3-4 h postdose were more frequent in young children on dose-escalation. Younger children were at lower risk for rash. To simplify ART initiation and reduce the risk of suboptimal dosing, full-dose nevirapine at ART initiation should be considered for African HIV-infected children less than 2 years of age.

Thompson AGB, Lowe J, Fox Z, Lukic A, Porter M-C, Ford L, Gorham M, Gopalakrishnan GS, Rudge P, Walker AS et al. 2013. The Medical Research Council prion disease rating scale: a new outcome measure for prion disease therapeutic trials developed and validated using systematic observational studies. Brain, 136 (Pt 4), pp. 1116-1127.

Progress in therapeutics for rare disorders like prion disease is impeded by the lack of validated outcome measures and a paucity of natural history data derived from prospective observational studies. The first analysis of the U.K. National Prion Monitoring Cohort involved 1337 scheduled clinical assessments and 479 telephone assessments in 437 participants over 373 patient-years of follow-up. Scale development has included semi-quantitative and qualitative carer interviews, item response modelling (Rasch analysis), inter-rater reliability testing, construct analysis and correlation with several existing scales. The proposed 20-point Medical Research Council prion disease rating scale assesses domains of cognitive function, speech, mobility, personal care/feeding and continence, according to their relative importance documented by carer interviews. It is quick and simple to administer, and has been validated for use by doctors and nurses and for use over the telephone, allowing for frequent assessments that capture the rapid change typical of these diseases. The Medical Research Council Scale correlates highly with widely used cognitive and single item scales, but has substantial advantages over these including minimal floor effects. Three clear patterns of decline were observed using the scale: fast linear decline, slow linear decline (usually inherited prion disease) and in some patients, decline followed by a prolonged preterminal plateau at very low functional levels. Rates of decline and progress through milestones measured using the scale vary between sporadic, acquired and inherited prion diseases following clinical expectations. We have developed and validated a new functionally-oriented outcome measure and propose that future clinical trials in prion disease should collect data compatible with this scale, to allow for combined and comparative analyses. 
Such approaches may be advantageous in orphan conditions, where single studies of feasible duration will often struggle to achieve statistical power.

Walker AS, Eyre DW, Wyllie DH, Dingle KE, Griffiths D, Shine B, Oakley S, O'Connor L, Finney J, Vaughan A et al. 2013. Relationship between bacterial strain type, host biomarkers, and mortality in Clostridium difficile infection. Clin Infect Dis, 56 (11), pp. 1589-1600.

BACKGROUND: Despite substantial interest in biomarkers, their impact on clinical outcomes and variation with bacterial strain has rarely been explored using integrated databases. METHODS: From September 2006 to May 2011, strains isolated from Clostridium difficile toxin enzyme immunoassay (EIA)-positive fecal samples from Oxfordshire, United Kingdom (approximately 600,000 people) underwent multilocus sequence typing. Fourteen-day mortality and levels of 15 baseline biomarkers were compared between consecutive C. difficile infections (CDIs) from different clades/sequence types (STs) and EIA-negative controls using Cox and normal regression adjusted for demographic/clinical factors. RESULTS: Fourteen-day mortality was 13% in 2222 adults with 2745 EIA-positive samples (median, 78 years) vs 5% in 20,722 adults with 27,550 EIA-negative samples (median, 74 years) (absolute attributable mortality, 7.7%; 95% CI, 6.4%-9.0%). Mortality was highest in clade 5 CDIs (25% [16 of 63]; polymerase chain reaction (PCR) ribotype 078/ST 11), then clade 2 (20% [111 of 560]; 99% PCR ribotype 027/ST 1) versus clade 1 (12% [137 of 1168]; adjusted P < .0001). Within clade 1, 14-day mortality was only 4% (3 of 84) in ST 44 (PCR ribotype 015) (adjusted P = .05 vs other clade 1). Mean baseline neutrophil counts also varied significantly by genotype: 12.4, 11.6, and 9.5 × 10(9) neutrophils/L for clades 5, 2 and 1, respectively, vs 7.0 × 10(9) neutrophils/L in EIA-negative controls (P < .0001) and 7.9 × 10(9) neutrophils/L in ST 44 (P = .08). There were strong associations between C. difficile-type-specific effects on mortality and neutrophil/white cell counts (rho = 0.48), C-reactive-protein (rho = 0.43), eosinophil counts (rho = -0.45), and serum albumin (rho = -0.47). Biomarkers predicted 30%-40% of clade-specific mortality differences. CONCLUSIONS: C. 
difficile genotype predicts mortality, and excess mortality correlates with genotype-specific changes in biomarkers, strongly implicating inflammatory pathways as a major influence on poor outcome after CDI. PCR ribotype 078/ST 11 (clade 5) leads to severe CDI; thus ongoing surveillance remains essential.

Ammerlaan HSM, Harbarth S, Buiting AGM, Crook DW, Fitzpatrick F, Hanberger H, Herwaldt LA, van Keulen PHJ, Kluytmans JAJW, Kola A et al. 2013. Secular trends in nosocomial bloodstream infections: antibiotic-resistant bacteria increase the total burden of infection. Clin Infect Dis, 56 (6), pp. 798-805.

BACKGROUND: It is unknown whether rising incidence rates of nosocomial bloodstream infections (BSIs) caused by antibiotic-resistant bacteria (ARB) replace antibiotic-susceptible bacteria (ASB), leaving the total BSI rate unaffected. METHODS: We investigated temporal trends in annual incidence densities (events per 100 000 patient-days) of nosocomial BSIs caused by methicillin-resistant Staphylococcus aureus (MRSA), ARB other than MRSA, and ASB in 7 ARB-endemic and 7 ARB-nonendemic hospitals between 1998 and 2007. RESULTS: 33 130 nosocomial BSIs (14% caused by ARB) yielded 36 679 microorganisms. From 1998 to 2007, the MRSA incidence density increased from 0.2 to 0.7 (annual increase, 22%) in ARB-nonendemic hospitals, and from 3.1 to 11.7 (annual increase, 10%) in ARB-endemic hospitals (P = .2), increasing the incidence density difference between ARB-endemic and ARB-nonendemic hospitals from 2.9 to 11.0. The non-MRSA ARB incidence density increased from 2.8 to 4.1 (annual increase, 5%) in ARB-nonendemic hospitals, and from 1.5 to 17.4 (annual increase, 22%) in ARB-endemic hospitals (P < .001), changing the incidence density difference from -1.3 to 13.3. Trends in ASB incidence densities were similar in both groups (P = .7). With annual increases of 3.8% and 5.4% of all nosocomial BSIs in ARB-nonendemic and ARB-endemic hospitals, respectively (P < .001), the overall incidence density difference of 3.8 increased to 24.4. CONCLUSIONS: Increased nosocomial BSI rates due to ARB occur in addition to infections caused by ASB, increasing the total burden of disease. Hospitals with high ARB infection rates in 2005 had an excess burden of BSI of 20.6 per 100 000 patient-days in a 10-year period, mainly caused by infections with ARB.
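Incidence density here is simply events per unit of patient-time. A minimal sketch, with illustrative patient-day totals chosen to reproduce the MRSA figures quoted above:

```python
def incidence_density(events, patient_days, per=100_000):
    """Events per `per` patient-days (the unit used in this abstract)."""
    return events / patient_days * per

# Illustrative: 117 vs 7 MRSA BSIs over one million patient-days give
# densities of 11.7 and 0.7, a difference of 11.0 as quoted above.
difference = incidence_density(117, 1_000_000) - incidence_density(7, 1_000_000)
```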

Stoesser NE, Martin J, Mawer D, Eyre DW, Walker AS, Peto TEA, Crook DW, Wilcox MH. 2013. Risk factors for Clostridium difficile acquisition in infants: importance of study design. Clin Infect Dis, 56 (11), pp. 1680-1681.

Gilks CF, Walker AS, Munderi P, Kityo C, Reid A, Katabira E, Goodall RL, Grosskurth H, Mugyenyi P, Hakim J et al. 2013. A single CD4 test with 250 cells/mm3 threshold predicts viral suppression in HIV-infected adults failing first-line therapy by clinical criteria. PLoS One, 8 (2), pp. e57580.

BACKGROUND: In low-income countries, viral load (VL) monitoring of antiretroviral therapy (ART) is rarely available in the public sector for HIV-infected adults or children. Using clinical failure alone to identify first-line ART failure and trigger regimen switch may result in unnecessary use of costly second-line therapy. Our objective was to identify CD4 threshold values to confirm clinically-determined ART failure when VL is unavailable. METHODS: 3316 HIV-infected Ugandan/Zimbabwean adults were randomised to first-line ART with Clinically-Driven (CDM, CD4s measured but blinded) or routine Laboratory and Clinical Monitoring (LCM, 12-weekly CD4s) in the DART trial. CD4 at switch and ART failure criteria (new/recurrent WHO 4, single/multiple WHO 3 event; LCM: CD4<100 cells/mm(3)) were reviewed in 361 LCM, 314 CDM participants who switched over median 5 years follow-up. Retrospective VLs were available in 368 (55%) participants. RESULTS: Overall, 265/361 (73%) LCM participants failed with CD4<100 cells/mm(3); only 7 (2%) switched with CD4≥250 cells/mm(3), four switches triggered by WHO events. Without CD4 monitoring, 207/314 (66%) CDM participants failed with WHO 4 events, and 77(25%)/30(10%) with single/multiple WHO 3 events. Failure/switching with single WHO 3 events was more likely with CD4≥250 cells/mm(3) (28/77; 36%) (p = 0.0002). CD4 monitoring reduced switching with viral suppression: 23/187 (12%) LCM versus 49/181 (27%) CDM had VL<400 copies/ml at failure/switch (p<0.0001). Amongst CDM participants with CD4<250 cells/mm(3) only 11/133 (8%) had VL<400 copies/ml, compared with 38/48 (79%) with CD4≥250 cells/mm(3) (p<0.0001). CONCLUSION: Multiple, but not single, WHO 3 events predicted first-line ART failure. A CD4 threshold 'tiebreaker' of ≥250 cells/mm(3) for clinically-monitored patients failing first-line could identify ∼80% with VL<400 copies/ml, who are unlikely to benefit from second-line. 
Targeting CD4s to single WHO stage 3 'clinical failures' would particularly avoid premature, costly switch to second-line ART.
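One possible reading of the proposed 'tiebreaker', written out as a decision rule. This is hypothetical code for illustration; the trial did not implement or validate this exact function:

```python
def confirm_switch(new_who4_event, who3_event_count, cd4_cells):
    """Illustrative switch rule when viral load is unavailable, based on the
    abstract above: WHO 4 events or multiple WHO 3 events indicate first-line
    failure; a single WHO 3 event is confirmed only if CD4 < 250 cells/mm3."""
    if new_who4_event or who3_event_count >= 2:
        return True
    return who3_event_count == 1 and cd4_cells < 250
```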

Donegan K, Doerholt K, Judd A, Lyall H, Menson E, Butler K, Tookey P, Riordan A, Shingadia D, Tudor-Williams G et al. 2013. Lopinavir dosing in HIV-infected children in the United Kingdom and Ireland. Pediatr Infect Dis J, 32 (1), pp. 45-50.

BACKGROUND: Uncertainty surrounds the correct dosing of lopinavir/r (LPV/r) in HIV-infected children not receiving non-nucleoside reverse transcriptase inhibitors. The licensed total daily dose is 460 mg/m², whereas the original study, reporting excellent viral load (VL) suppression, used a higher 600 mg/m² dose. METHODS: We calculated LPV/r daily doses prescribed from 2000 to 2009 within the UK/Irish national Collaborative HIV Paediatric Study (CHIPS) cohort. Logistic and binomial mixed models were used to explore whether higher LPV/r doses affected VL suppression. RESULTS: Four hundred forty-four of 1201 (37%) children on antiretroviral therapy in CHIPS had taken lopinavir/r without non-nucleoside reverse transcriptase inhibitors. Of 1065 recorded doses, 48% were syrup, 27% capsules and 25% tablets. Ten percent of doses were >10% below 460 mg/m² per day, and 12% were >10% above 600 mg/m². In multivariable models, predictors of lower doses were: once versus twice daily dosing (32 mg/m² lower); syrup versus tablets/capsules (33 mg/m² lower); higher weight-for-age and height-for-age (24 mg/m² and 13 mg/m² lower per unit higher, respectively); and older age (13 mg/m lower per year older for those aged >10 years, P < 0.05). Dosing varied widely by hospital (P = 0.0004), with some targeting higher and others lower doses. For those receiving lopinavir/r for ≥6 months, there was a greater chance of VL <400 copies/mL with higher doses (odds ratio = 1.15 [95% confidence interval: 1.06-1.25 per 50 mg/m² higher], P = 0.001). CONCLUSIONS: Findings suggest substantial variation and large hospital-level effects in the LPV/r dose prescribed to HIV-infected children in the United Kingdom/Ireland. Higher doses appeared to improve long-term VL suppression, which may be critical in children who need life-long therapy. Results highlight the importance of optimizing dosing in HIV-infected children of all ages.
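The mg/m² doses discussed above imply a body-surface-area calculation. A sketch using the Mosteller formula, which is an assumption for illustration (the paper does not state which BSA formula prescribers used):

```python
import math

def mosteller_bsa(height_cm, weight_kg):
    """Mosteller body surface area in m2: sqrt(height_cm * weight_kg / 3600)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def lpv_daily_dose_mg(height_cm, weight_kg, dose_per_m2=460.0):
    """Total daily lopinavir dose (mg) for a target in mg/m2; 460 is the
    licensed daily dose quoted above, 600 the higher original-study dose."""
    return dose_per_m2 * mosteller_bsa(height_cm, weight_kg)
```

For a hypothetical child of 120 cm and 30 kg (BSA 1.0 m²), the licensed and higher targets give 460 mg and 600 mg per day, respectively.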

Didelot X, Eyre DW, Cule M, Ip CLC, Ansari MA, Griffiths D, Vaughan A, O'Connor L, Golubchik T, Batty EM et al. 2012. Microevolutionary analysis of Clostridium difficile genomes to investigate transmission. Genome Biol, 13 (12), pp. R118.

BACKGROUND: The control of Clostridium difficile infection is a major international healthcare priority, hindered by a limited understanding of transmission epidemiology for these bacteria. However, transmission studies of bacterial pathogens are rapidly being transformed by the advent of next generation sequencing. RESULTS: Here we sequence whole C. difficile genomes from 486 cases arising over four years in Oxfordshire. We show that we can estimate the times back to common ancestors of bacterial lineages with sufficient resolution to distinguish whether direct transmission is plausible or not. Time depths were inferred using a within-host evolutionary rate that we estimated at 1.4 mutations per genome per year based on serially isolated genomes. The subset of plausible transmissions was found to be highly associated with pairs of patients sharing time and space in hospital. Conversely, the large majority of pairs of genomes matched by conventional typing and isolated from patients within a month of each other were too distantly related to be direct transmissions. CONCLUSIONS: Our results confirm that nosocomial transmission between symptomatic C. difficile cases contributes far less to current rates of infection than has been widely assumed, which clarifies the importance of future research into other transmission routes, such as from asymptomatic carriers. With the costs of DNA sequencing rapidly falling and its use becoming more and more widespread, genomics will revolutionize our understanding of the transmission of bacterial pathogens.
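The time-depth logic rests on a Poisson model of mutation accumulation. A minimal sketch using the within-host rate of 1.4 mutations per genome per year quoted above, with mutations accruing along both branches back to the common ancestor; this simplifies the paper's full inference:

```python
import math

def poisson_pmf(k, lam):
    """P(K = k) for a Poisson random variable with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def snv_count_probability(d, years_to_mrca, rate=1.4):
    """Probability of observing d SNVs between two genomes whose common
    ancestor lies years_to_mrca back, with mutations accruing at `rate`
    per genome per year along both branches (2 * years_to_mrca in total)."""
    return poisson_pmf(d, 2.0 * rate * years_to_mrca)
```

Under this sketch, identical genomes (d = 0) remain quite compatible with a common ancestor six months back, whereas ten or more SNVs make such a recent ancestor, and hence direct transmission, implausible.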

Gilks CF, Walker AS, Dunn DT, Gibb DM, Kikaire B, Reid A, Musana H, Mambule I, Kasirye R, Robertson V et al. 2012. Lopinavir/ritonavir monotherapy after 24 weeks of second-line antiretroviral therapy in Africa: a randomized controlled trial (SARA). Antivir Ther, 17 (7), pp. 1363-1373.

BACKGROUND: Boosted protease inhibitor (bPI) monotherapy (bPImono) potentially has substantial cost, safety and operational benefits. It has never been evaluated as second-line antiretroviral therapy (ART) in Africa. METHODS: After 24 weeks of lopinavir/ritonavir-containing second-line therapy, DART participants were randomized to remain on combination therapy (CT), or change to bPImono maintenance (SARA trial; ISRCTN53817258). Joint primary end points were CD4(+) T-cell changes 24 weeks later and serious adverse events (SAEs); retrospectively assayed viral load (VL) was a secondary end point. Analyses were intention-to-treat. RESULTS: A total of 192 participants were randomized to CT (n=95) or bPImono (n=97) and followed for median 60 weeks (IQR 45-84). Participants received median 4.0 years (IQR 3.5-4.4) first-line ART. Median CD4(+) T-cell count at first-line failure was 86 cells/mm(3) (47-136), increasing to 245 cells/mm(3) (173-325) after 24-week induction when 77% had VL<50 copies/ml. Overall, 44 (23%) were receiving second-line therapy with bPI and nucleoside reverse transcriptase inhibitors (NRTI) only, and 148 (77%) with bPI plus non-NRTI (NNRTI) with or without NRTI. At 24 weeks after randomization to CT versus bPImono, mean CD4(+) T-cell increase was 42 (CT, n=85) versus 49 cells/mm(3) (bPImono, n=88; adjusted difference 13 [95% CI -15, 43], P=0.37; non-inferior compared with predetermined non-inferiority margin [-33]). Virological suppression was greater for CT versus bPImono (trend P=0.009): 77% (70/91) versus 60% (56/94) were <50 copies/ml, and 5% (5) versus 14% (13) were ≥1,000 copies/ml, respectively. A total of 0 (0%) versus 5 (5%) participants had major protease inhibitor mutations and 3 (3%) versus 0 (0%) new NNRTI/NRTI mutations were detected during follow-up. Two participants (1 CT and 1 bPImono) died >24 weeks after randomization, and 5 (2 CT and 3 bPImono) experienced SAEs (P=0.51). 
CONCLUSIONS: bPImono following a 24-week second-line induction was associated with similar CD4(+) T-cell response, but increased low-level viraemia, generally without protease inhibitor resistance. Longer-term trials are needed to provide definitive evidence about effectiveness in Africa.

Thwaites G, Auckland C, Barlow G, Cunningham R, Davies G, Edgeworth J, Greig J, Hopkins S, Jeyaratnam D, Jenkins N et al. 2012. Adjunctive rifampicin to reduce early mortality from Staphylococcus aureus bacteraemia (ARREST): study protocol for a randomised controlled trial. Trials, 13 (1), pp. 241.

BACKGROUND: Staphylococcus aureus bacteraemia is a common and serious infection, with an associated mortality of ~25%. Once in the blood, S. aureus can disseminate to infect almost any organ, but bones, joints and heart valves are most frequently affected. Despite the infection's severity, the evidence guiding optimal antibiotic therapy is weak: fewer than 1,500 patients have been included in 16 randomised controlled trials investigating S. aureus bacteraemia treatment. It is uncertain which antibiotics are most effective, their route of administration and duration, and whether antibiotic combinations are better than single agents. We hypothesise that adjunctive rifampicin, given in combination with a standard first-line antibiotic, will enhance killing of S. aureus early in the treatment course, sterilise infected foci and blood faster, and thereby reduce the risk of dissemination, metastatic infection and death. Our aim is to determine whether adjunctive rifampicin reduces all-cause mortality within 14 days and bacteriological failure or death within 12 weeks from randomisation. METHODS: We will perform a parallel group, randomised (1:1), blinded, placebo-controlled trial in NHS hospitals across the UK. Adults (≥ 18 years) with S. aureus (meticillin-susceptible or resistant) grown from at least one blood culture who have received ≤ 96 h of active antibiotic therapy for the current infection and do not have contraindications to the use of rifampicin will be eligible for inclusion. Participants will be randomised to adjunctive rifampicin (600-900 mg/day; orally or intravenously) or placebo for the first 14 days of therapy in combination with standard single-agent antibiotic therapy. The co-primary outcome measures will be all-cause mortality up to 14 days from randomisation and bacteriological failure/death (all-cause) up to 12 weeks from randomisation. 
940 patients will be recruited, providing >80% power to detect 45% and 30% reductions in the two co-primary endpoints of death by 14 days and bacteriological failure/death by 12 weeks respectively. DISCUSSION: This pragmatic trial addresses the long-standing hypothesis that adjunctive rifampicin improves outcome from S. aureus bacteraemia through enhanced early bacterial killing. If proven correct, it will provide a paradigm through which further improvements in outcome from S. aureus bacteraemia can be explored.

Walker AS, Prendergast AJ, Mugyenyi P, Munderi P, Hakim J, Kekitiinwa A, Katabira E, Gilks CF, Kityo C, Nahirya-Ntege P et al. 2012. Mortality in the year following antiretroviral therapy initiation in HIV-infected adults and children in Uganda and Zimbabwe. Clin Infect Dis, 55 (12), pp. 1707-1718.

BACKGROUND: Adult mortality in the first 3 months on antiretroviral therapy (ART) is higher in low-income than in high-income countries, with more similar mortality after 6 months. However, the specific patterns of changing risk and causes of death have rarely been investigated in adults, nor compared with children in low-income countries. METHODS: We used flexible parametric hazard models to investigate how mortality risks varied over the first year on ART in human immunodeficiency virus-infected adults (aged 18-73 years) and children (aged 4 months to 15 years) in 2 trials in Zimbabwe and Uganda. RESULTS: One hundred seventy-nine of 3316 (5.4%) adults and 39 of 1199 (3.3%) children died; half of adult/pediatric deaths occurred in the first 3 months. Mortality variation over year 1 was similar; at all CD4 counts/CD4%, mortality risk was greatest between days 30 and 50, declined rapidly to day 180, then declined more slowly. One-year mortality after initiating ART with 0-49, 50-99 or ≥ 100 CD4 cells/μL was 9.4%, 4.5%, and 2.9%, respectively, in adults, and 10.1%, 4.4%, and 1.3%, respectively, in children aged 4-15 years. Mortality in children aged 4 months to 3 years initiating ART in equivalent CD4% strata was also similar (0%-4%: 9.1%; 5%-9%: 4.5%; ≥ 10%: 2.8%). Only 10 of 179 (6%) adult deaths and 1 of 39 (3%) child deaths were probably medication-related. The most common cause of death was septicemia/meningitis in adults (20%, median 76 days) and children (36%, median 79 days); pneumonia also commonly caused child deaths (28%, median 41 days). CONCLUSIONS: Children ≥ 4 years and adults with low CD4 values have remarkably similar, and high, mortality risks in the first 3 months after ART initiation in low-income countries, similar to cohorts of untreated individuals. Bacterial infections are a major cause of death in both adults and children; targeted interventions could have important benefits.

Dingle KE, Didelot X, Ansari MA, Eyre DW, Vaughan A, Griffiths D, Ip CLC, Batty EM, Golubchik T, Bowden R et al. 2013. Recombinational switching of the Clostridium difficile S-layer and a novel glycosylation gene cluster revealed by large-scale whole-genome sequencing. J Infect Dis, 207 (4), pp. 675-686.

BACKGROUND: Clostridium difficile is a major cause of nosocomial diarrhea, with 30-day mortality reaching 30%. The cell surface comprises a paracrystalline proteinaceous S-layer encoded by the slpA gene within the cell wall protein (cwp) gene cluster. Our purpose was to understand the diversity and evolution of slpA and nearby genes also encoding immunodominant cell surface antigens. METHODS: Whole-genome sequences were determined for 57 C. difficile isolates representative of the population structure and different clinical phenotypes. Phylogenetic analyses were performed on their genomic region (>63 kb) spanning the cwp cluster. RESULTS: Genetic diversity across the cwp cluster peaked within slpA, cwp66 (adhesin), and secA2 (secretory translocase). These genes formed a 10-kb cassette, of which 12 divergent variants were found. Homologous recombination involving this cassette caused it to associate randomly with genotype. One cassette contained a novel insertion (length, approximately 24 kb) that resembled S-layer glycosylation gene clusters. CONCLUSIONS: Genetic exchange of S-layer cassettes parallels polysaccharide capsular switching in other species. Both cause major antigenic shifts, while the remainder of the genome is unchanged. C. difficile genotype is therefore not predictive of antigenic type. S-layer switching and immune escape could help explain temporal and geographic variation in C. difficile epidemiology and may inform genotyping and vaccination strategies.

Walker TM, Ip CLC, Harrell RH, Evans JT, Kapatai G, Dedicoat MJ, Eyre DW, Wilson DJ, Hawkey PM, Crook DW et al. 2013. Whole-genome sequencing to delineate Mycobacterium tuberculosis outbreaks: a retrospective observational study. Lancet Infect Dis, 13 (2), pp. 137-146.

BACKGROUND: Tuberculosis incidence in the UK has risen in the past decade. Disease control depends on epidemiological data, which can be difficult to obtain. Whole-genome sequencing can detect microevolution within Mycobacterium tuberculosis strains. We aimed to estimate the genetic diversity of related M tuberculosis strains in the UK Midlands and to investigate how this measurement might be used to investigate community outbreaks. METHODS: In a retrospective observational study, we used Illumina technology to sequence M tuberculosis genomes from an archive of frozen cultures. We characterised isolates into four groups: cross-sectional, longitudinal, household, and community. We measured pairwise nucleotide differences within hosts and between hosts in household outbreaks and estimated the rate of change in DNA sequences. We used the findings to interpret network diagrams constructed from 11 community clusters derived from mycobacterial interspersed repetitive-unit-variable-number tandem-repeat data. FINDINGS: We sequenced 390 separate isolates from 254 patients, including representatives from all five major lineages of M tuberculosis. The estimated rate of change in DNA sequences was 0.5 single nucleotide polymorphisms (SNPs) per genome per year (95% CI 0.3-0.7) in longitudinal isolates from 30 individuals and 25 families. Divergence is rarely higher than five SNPs in 3 years. 109 (96%) of 114 paired isolates from individuals and households differed by five or fewer SNPs. More than five SNPs separated isolates from none of 69 epidemiologically linked patients, two (15%) of 13 possibly linked patients, and 13 (17%) of 75 epidemiologically unlinked patients (three-way comparison exact p<0.0001). Genetic trees and clinical and epidemiological data suggest that super-spreaders were present in two community clusters. INTERPRETATION: Whole-genome sequencing can delineate outbreaks of tuberculosis and allows inference about direction of transmission between cases. 
The technique could identify super-spreaders and predict the existence of undiagnosed cases, potentially leading to early treatment of infectious patients and their contacts. FUNDING: Medical Research Council, Wellcome Trust, National Institute for Health Research, and the Health Protection Agency.
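The ≤5-SNP relatedness threshold reported above lends itself to a simple cluster-delineation sketch: isolates within the threshold are linked into putative transmission clusters via union-find. The isolate names and pairwise distances here are invented, not trial data:

```python
# Hypothetical pairwise SNP distances between sequenced isolates.
THRESHOLD = 5  # SNP cut-off consistent with recent transmission (from the study)

pairwise_snps = {
    ("TB01", "TB02"): 2,
    ("TB02", "TB03"): 4,
    ("TB01", "TB04"): 38,   # a genetically distinct, unrelated strain
}

parent = {}

def find(x):
    """Find the cluster root of isolate x, registering it if new."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for (a, b), dist in pairwise_snps.items():
    find(a); find(b)              # register both isolates
    if dist <= THRESHOLD:
        union(a, b)               # link into one putative outbreak cluster

clusters = {}
for iso in parent:
    clusters.setdefault(find(iso), set()).add(iso)
print(sorted(map(sorted, clusters.values())))
```

Here TB01-TB03 fall into one putative outbreak while TB04, despite any conventional-typing match, is excluded — mirroring how the study separates genuine outbreaks from coincidental strain-type matches.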

Fillekes Q, Mulenga V, Kabamba D, Kankasa C, Thomason MJ, Cook A, Ferrier A, Chintu C, Walker AS, Gibb DM, Burger DM. 2012. Pharmacokinetics of nevirapine in HIV-infected infants weighing 3 kg to less than 6 kg taking paediatric fixed dose combination tablets. AIDS, 26 (14), pp. 1795-1800.

OBJECTIVES: To evaluate pharmacokinetics of nevirapine, lamivudine and stavudine in HIV-infected Zambian infants receiving fixed dose combination (FDC) antiretroviral tablets (Triomune Baby). DESIGN: Phase I/II study. METHODS: Sixteen HIV-infected children at least 1 month old, weighing 3 kg to less than 6 kg, were enrolled. Blood was sampled at t = 0, 2, 6 and 12 h after observed intake of one FDC tablet (50 mg nevirapine, 6 mg stavudine, 30 mg lamivudine) 4 weeks after starting treatment. Safety and viral load response over 48 weeks were determined. RESULTS: The median [interquartile range (IQR)] age, body weight and daily nevirapine dose in 15 included children (eight girls) were 4.8 (4.2, 8.4) months, 5.3 (4.3, 5.5) kg and 348 (326, 385) mg/m², respectively. The median (IQR) nevirapine area under the concentration-time curve (AUC0-12 h), Cmax and C12 h were 70 (56, 104) h mg/l, 7.5 (6.2, 10) mg/l, and 4.3 (2.9, 6.9) mg/l, respectively. Values were on average higher than reported in adults, but approximately 20% lower than previously reported in children weighing at least 6 kg. Four of 15 (27%) children had a subtherapeutic nevirapine C12 h (defined as <3.0 mg/l) compared to only three of 63 (5%) children weighing at least 6 kg (P = 0.02), whereas children aged less than 5 months [three of six (50%)] may have the highest risk for subtherapeutic nevirapine C12 h (P = 0.24). No association was found between viral load values and nevirapine plasma pharmacokinetic parameters (P > 0.3). Stavudine-lamivudine pharmacokinetic parameters were broadly comparable to those in heavier children. CONCLUSION: Exposure to nevirapine in African, HIV-infected infants with low body weight taking FDC tablets appears on average to be adequate, but due to large intersubject variability a relatively high proportion had subtherapeutic nevirapine C12 h levels, particularly those aged less than 5 months.

Lewis J, Walker AS, Castro H, De Rossi A, Gibb DM, Giaquinto C, Klein N, Callard R. 2012. Naive and memory CD4(+) T cells in HIV eradication and immunization: reply. J Infect Dis, 206 (4), pp. 618-618.

Crook DW, Walker AS, Kean Y, Weiss K, Cornely OA, Miller MA, Esposito R, Louie TJ, Stoesser NE, Young BC et al. 2012. Fidaxomicin versus vancomycin for Clostridium difficile infection: meta-analysis of pivotal randomized controlled trials. Clin Infect Dis, 55 Suppl 2 (suppl 2), pp. S93-S103.

Two recently completed phase 3 trials (003 and 004) showed fidaxomicin to be noninferior to vancomycin for curing Clostridium difficile infection (CDI) and superior for reducing CDI recurrences. In both studies, adults with active CDI were randomized to receive blinded fidaxomicin 200 mg twice daily or vancomycin 125 mg 4 times a day for 10 days. Post hoc exploratory intent-to-treat (ITT) time-to-event analyses were undertaken on the combined study 003 and 004 data, using fixed-effects meta-analysis and Cox regression models. ITT analysis of the combined 003/004 data for 1164 patients showed that fidaxomicin reduced persistent diarrhea, recurrence, or death by 40% (95% confidence interval [CI], 26%-51%; P < .0001) compared with vancomycin through day 40. A 37% (95% CI, 2%-60%; P = .037) reduction in persistent diarrhea or death was evident through day 12 (heterogeneity P = .50 vs 13-40 days), driven by 7 (1.2%) fidaxomicin versus 17 (2.9%) vancomycin deaths at <12 days. Low albumin level, low eosinophil count, and CDI treatment preenrollment were risk factors for persistent diarrhea or death at 12 days, and CDI in the previous 3 months was a risk factor for recurrence (all P < .01). Fidaxomicin has the potential to substantially improve outcomes from CDI.
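The fixed-effects meta-analysis used to pool the 003 and 004 trials is, at heart, inverse-variance weighting of per-study log effect estimates. A sketch with invented study-level numbers (not the actual trial data):

```python
import math

# Inverse-variance fixed-effect pooling of two per-study log hazard ratios.
# The effect sizes and standard errors below are invented for illustration.
studies = [
    {"log_hr": math.log(0.62), "se": 0.15},   # hypothetical "study 003"
    {"log_hr": math.log(0.58), "se": 0.17},   # hypothetical "study 004"
]

weights = [1 / s["se"] ** 2 for s in studies]        # precision weights
pooled_log_hr = sum(w * s["log_hr"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))              # SE of the pooled estimate

hr = math.exp(pooled_log_hr)
ci = (math.exp(pooled_log_hr - 1.96 * pooled_se),
      math.exp(pooled_log_hr + 1.96 * pooled_se))
print(f"pooled HR {hr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

The pooled estimate is always more precise than either study alone, which is why combining the two trials tightens the confidence interval around effects such as the 40% reduction reported above.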

Eyre DW, Walker AS, Wyllie D, Dingle KE, Griffiths D, Finney J, O'Connor L, Vaughan A, Crook DW, Wilcox MH et al. 2012. Predictors of first recurrence of Clostridium difficile infection: implications for initial management. Clin Infect Dis, 55 Suppl 2 (suppl 2), pp. S77-S87.

Symptomatic recurrence of Clostridium difficile infection (CDI) occurs in approximately 20% of patients and is challenging to treat. Identifying those at high risk could allow targeted initial management and improve outcomes. Adult toxin enzyme immunoassay-positive CDI cases in a population of approximately 600,000 persons from September 2006 through December 2010 were combined with epidemiological/clinical data. The cumulative incidence of recurrence ≥ 14 days after the diagnosis and/or onset of first-ever CDI was estimated, treating death without recurrence as a competing risk, and predictors were identified from cause-specific proportional hazards regression models. A total of 1678 adults alive 14 days after their first CDI were included; median age was 77 years, and 1191 (78%) were inpatients. Of these, 363 (22%) experienced a recurrence ≥ 14 days after their first CDI, and 594 (35%) died without recurrence through March 2011. Recurrence risk was independently and significantly higher among patients admitted as emergencies, with previous gastrointestinal ward admission(s), last discharged 4-12 weeks before first diagnosis, and with CDI diagnosed at admission. Recurrence risk also increased with increasing age, previous total hours admitted, and C-reactive protein level at first CDI (all P < .05). The 4-month recurrence risk increased by approximately 5% (absolute) for every 1-point increase in a risk score based on these factors. Risk factors, including increasing age, initial disease severity, and hospital exposure, predict CDI recurrence and identify patients likely to benefit from enhanced initial CDI treatment.
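Estimating cumulative incidence while treating death without recurrence as a competing risk, as this analysis does, can be sketched with a minimal Aalen-Johansen-style estimator. All times and events below are invented and assume distinct event times:

```python
# Each tuple is (days since first CDI, event), with event 1 = recurrence,
# 2 = death without recurrence, 0 = censored. Invented data for illustration.
data = [(20, 1), (35, 0), (40, 2), (55, 1), (60, 0), (80, 2), (90, 1), (120, 0)]

def cuminc(data, cause, t):
    """Aalen-Johansen cumulative incidence of `cause` by time t
    (minimal sketch; assumes no tied event times)."""
    at_risk = len(data)
    surv = 1.0       # overall event-free survival just before each time
    cif = 0.0
    for time, event in sorted(data):
        if time > t:
            break
        if event == cause:
            cif += surv / at_risk          # hazard contribution for this cause
        if event != 0:
            surv *= 1 - 1 / at_risk        # any event depletes event-free survival
        at_risk -= 1                       # events and censorings leave the risk set
    return cif

print(round(cuminc(data, cause=1, t=120), 3))  # recurrence
print(round(cuminc(data, cause=2, t=120), 3))  # death without recurrence
```

Unlike naive 1-minus-Kaplan-Meier, the two cause-specific curves plus remaining event-free survival sum to one, so the recurrence risk is not inflated by treating deaths as censorings.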

Oudijk JM, McIlleron H, Mulenga V, Chintu C, Merry C, Walker AS, Cook A, Gibb DM, Burger DM. 2012. Pharmacokinetics of nevirapine in HIV-infected children under 3 years on rifampicin-based antituberculosis treatment. AIDS, 26 (12), pp. 1523-1528.

OBJECTIVE: There is an urgent need to optimize cotreatment for children with tuberculosis and HIV infection. We described nevirapine pharmacokinetics in Zambian children aged less than 3 years, cotreated with nevirapine, lamivudine and stavudine in fixed-dose combination (using WHO weight bands) and rifampicin-based antituberculosis treatment. DESIGN: Twenty-two children received antituberculosis and antiretroviral therapy (ART) concurrently for 4 weeks before pharmacokinetic sampling. Plasma nevirapine concentrations were determined in samples taken immediately before, and 1, 2 and 6 h after an observed dose. Nevirapine pharmacokinetics were compared with those in 16 children aged less than 3 years without tuberculosis. RESULTS: Twenty-two children were treated for HIV/TB coinfection, 10 of whom were girls. One boy was excluded from analysis for nonadherence. The median age was 1.6 years (range: 0.7-3.2). Median weight was 8.0 kg (range: 5.1-10.5). The baseline CD4% was 13.1 (range: 3.9-43.6). Median predose concentration of nevirapine was 2.93 mg/l (range: 1.06-11.4), and peak concentration was 6.33 mg/l (range: 2.61-14.5). The nevirapine AUC up to 12 h was estimated as 52.0 mg.h/l (range: 22.6-159.7) compared with 90.9 mg.h/l (range: 40.4-232.1) in children without tuberculosis (P < 0.001). Predose concentrations of nevirapine were less than 3.0 mg/l in 11 children on tuberculosis treatment versus none of the 16 children without tuberculosis treatment (P = 0.001). AUC was 41% (95% CI: 23-54%) lower in children with tuberculosis than without tuberculosis (P < 0.001) after adjusting for dose per square meter. CONCLUSION: We found substantial reductions in nevirapine concentrations in young children receiving rifampicin. Further studies are needed to define the pharmacokinetics, safety and efficacy of adjusted doses of nevirapine-based ART in young children with tuberculosis.

Lewis J, Walker AS, Klein N, Callard R. 2012. CD31+ cell percentage correlation with speed of CD4+ T-cell count recovery in HIV-infected adults is reversed in children: higher thymic output may be responsible. Clin Infect Dis, 55 (2), pp. 304-307.

Schlackow I, Walker AS, Dingle K, Griffiths D, Oakley S, Finney J, Vaughan A, Gill MJ, Crook DW, Peto TEA, Wyllie DH. 2012. Surveillance of infection severity: a registry study of laboratory diagnosed Clostridium difficile. PLoS Med, 9 (7), pp. e1001279.

BACKGROUND: Changing clinical impact, as virulent clones replace less virulent ones, is a feature of many pathogenic bacterial species and can be difficult to detect. Consequently, innovative techniques monitoring infection severity are of potential clinical value. METHODS AND FINDINGS: We studied 5,551 toxin-positive and 20,098 persistently toxin-negative patients tested for Clostridium difficile infection between February 1998 and July 2009 in a group of hospitals based in Oxford, UK, and investigated 28-day mortality and biomarkers of inflammation (blood neutrophil count, urea, and creatinine concentrations) collected at diagnosis using iterative sequential regression (ISR), a novel joinpoint-based regression technique suitable for serial monitoring of continuous or dichotomous outcomes. Among C. difficile toxin-positive patients in the Oxford hospitals, mean neutrophil counts on diagnosis increased from 2003, peaked in 2006-2007, and then declined; 28-day mortality increased from early 2006, peaked in late 2006-2007, and then declined. Molecular typing confirmed these changes were likely due to the ingress of the globally distributed severe C. difficile strain, ST1. We assessed the generalizability of ISR-based severity monitoring in three ways. First, we assessed and found strong (p<0.0001) associations between isolation of the ST1 severe strain and higher neutrophil counts at diagnosis in two unrelated large multi-centre studies, suggesting the technique described might be useful elsewhere. Second, we assessed and found similar trends in a second group of hospitals in Birmingham, UK, from which 5,399 cases were analysed. Third, we used simulation to assess the performance of this surveillance system given the ingress of future severe strains under a variety of assumptions. ISR-based severity monitoring allowed the detection of the severity change years earlier than mortality monitoring. 
CONCLUSIONS: Automated electronic systems providing early warning of the changing severity of infectious conditions can be established using routinely collected laboratory hospital data. In the settings studied here these systems have higher performance than those monitoring mortality, at least in C. difficile infection. Such systems could have wider applicability for monitoring infections presenting in hospital.

Eyre DW, Golubchik T, Gordon NC, Bowden R, Piazza P, Batty EM, Ip CLC, Wilson DJ, Didelot X, O'Connor L et al. 2012. A pilot study of rapid benchtop sequencing of Staphylococcus aureus and Clostridium difficile for outbreak detection and surveillance. BMJ Open, 2 (3), pp. e001124.

OBJECTIVES: To investigate the prospects of newly available benchtop sequencers to provide rapid whole-genome data in routine clinical practice. Next-generation sequencing has the potential to resolve uncertainties surrounding the route and timing of person-to-person transmission of healthcare-associated infection, which has been a major impediment to optimal management. DESIGN: The authors used Illumina MiSeq benchtop sequencing to undertake case studies investigating potential outbreaks of methicillin-resistant Staphylococcus aureus (MRSA) and Clostridium difficile. SETTING: Isolates were obtained from potential outbreaks associated with three UK hospitals. PARTICIPANTS: Isolates were sequenced from a cluster of eight MRSA carriers and an associated bacteraemia case in an intensive care unit, another MRSA cluster of six cases and two clusters of C difficile. Additionally, all C difficile isolates from cases over 6 weeks in a single hospital were rapidly sequenced and compared with local strain sequences obtained in the preceding 3 years. MAIN OUTCOME MEASURE: Whole-genome genetic relatedness of the isolates within each epidemiological cluster. RESULTS: Twenty-six MRSA and 15 C difficile isolates were successfully sequenced and analysed within 5 days of culture. Both MRSA clusters were identified as outbreaks, with most sequences in each cluster indistinguishable and all within three single nucleotide variants (SNVs). Epidemiologically unrelated isolates of the same spa-type were genetically distinct (≥21 SNVs). In both C difficile clusters, closely epidemiologically linked cases (in one case sharing the same strain type) were shown to be genetically distinct (≥144 SNVs). A reconstruction applying rapid sequencing in C difficile surveillance provided early outbreak detection and identified previously undetected probable community transmission. CONCLUSIONS: This benchtop sequencing technology is widely generalisable to human bacterial pathogens. 
The findings provide several good examples of how rapid and precise sequencing could transform identification of transmission of healthcare-associated infection and therefore improve hospital infection control and patient outcomes in routine clinical practice.

Donegan KL, Walker AS, Dunn D, Judd A, Pillay D, Menson E, Lyall H, Tudor-Williams G, Gibb DM, Collaborative HIV Paediatric Study, UK HIV Drug Resistance Database. 2012. The prevalence of darunavir-associated mutations in HIV-1-infected children in the UK. Antivir Ther, 17 (4), pp. 599-603.

BACKGROUND: We examined the prevalence of ritonavir-boosted darunavir (DRV) resistance-associated mutations (RAMs) in HIV-infected children in the UK to determine the drug's potential clinical utility as a first-line or second-line protease inhibitor (PI). METHODS: The prevalence of DRV RAMs, identified from IAS 2010 and Stanford, and the Stanford susceptibility score, were estimated in PI-naive and PI-experienced children in the Collaborative HIV Paediatric Study and the UK HIV Drug Resistance Database 1998-2008. Associations between type/duration of PI exposure and area under the viraemia curve on PI with the number of RAMs were investigated using multivariate Poisson regression. RESULTS: A total of 17/417 (4%) children with a resistance test when PI-naive had one IAS DRV RAM, and 1 had a Stanford mutation; none had multiple DRV RAMs. A total of 177 PI-experienced children had a test after a median 2.7 years (IQR 1.1-5.2) on PIs; 19 (11%) had one IAS DRV RAM, 7 (4%) had two RAMs, 1 (0.6%) had three RAMs and 1 (0.6%) had four RAMs. DRV RAMs were independently associated with increased years on a PI, a larger area under the viraemia curve since starting PIs, and any exposure to PIs other than lopinavir (all P≤0.05). Only 6 (3%) PI-experienced children had intermediate-level DRV/ritonavir resistance; none had high-level resistance. CONCLUSIONS: DRV resistance was negligible in PI-naive children and those with lopinavir PI exposure alone. However resistance increased with increasing time, and with higher levels of viraemia, on PIs. Once-daily DRV/ritonavir would be valuable as a second PI or an alternative first PI, particularly if coformulated with a booster in an appropriate formulation for children.

Donegan KL, Walker AS, Dunn D, Judd A, Pillay D, Menson E, Lyall H, Tudor-Williams G, Gibb DM. 2012. The prevalence of darunavir-associated mutations in HIV-1-infected children in the UK. Antivir Ther, 17 (4), pp. 769.

Gibb DM, Kizito H, Russell EC, Chidziva E, Zalwango E, Nalumenya R, Spyer M, Tumukunde D, Nathoo K, Munderi P et al. 2012. Pregnancy and infant outcomes among HIV-infected women taking long-term ART with and without tenofovir in the DART trial. PLoS Med, 9 (5), pp. e1001217.

BACKGROUND: Few data have described long-term outcomes for infants born to HIV-infected African women taking antiretroviral therapy (ART) in pregnancy. This is particularly true for World Health Organization (WHO)-recommended tenofovir-containing first-line regimens, which are increasingly used and known to cause renal and bone toxicities; concerns have been raised about potential toxicity in babies due to in utero tenofovir exposure. METHODS AND FINDINGS: Pregnancy outcome and maternal/infant ART were collected in Ugandan/Zimbabwean HIV-infected women initiating ART during The Development of AntiRetroviral Therapy in Africa (DART) trial, which compared routine laboratory monitoring (CD4; toxicity) versus clinically driven monitoring. Women were followed 15 January 2003 to 28 September 2009. Infant feeding, clinical status, and biochemistry/haematology results were collected in a separate infant study. Effect of in utero ART exposure on infant growth was analysed using random effects models. 382 pregnancies occurred in 302/1,867 (16%) women (4.4/100 woman-years [95% CI 4.0-4.9]). 226/390 (58%) outcomes were live-births, 27 (7%) stillbirths (≥22 wk), and 137 (35%) terminations/miscarriages (<22 wk). Of 226 live-births, seven (3%) infants died <2 wk from perinatal causes and there were seven (3%) congenital abnormalities, with no effect of in utero tenofovir exposure (p>0.4). Of 219 surviving infants, 182 (83%) enrolled in the follow-up study; median (interquartile range [IQR]) age at last visit was 25 (12-38) months. From mothers' ART, 62, 9, and 111 infants had no, 20%-89%, and ≥90% in utero tenofovir exposure, respectively; most were also zidovudine/lamivudine exposed. All 172 infants tested were HIV-negative (ten untested). Only 73/182 (40%) infants were breast-fed for median 94 (IQR 75-212) days.
Overall, 14 infants died at median (IQR) age 9 (3-23) months, giving 5% 12-month mortality; six of 14 were HIV-uninfected; the eight untested infants died of respiratory infection (three), sepsis (two), burns (one), measles (one), or unknown causes (one). During follow-up, no bone fractures were reported; 12 of 368 creatinine and 7 of 305 phosphate measurements were grade one (16) or grade two (3) in 14 children, with no effect of in utero tenofovir (p>0.1). There was no evidence that in utero tenofovir affected growth after 2 years (p = 0.38). Attained height- and weight-for-age were similar to general (HIV-uninfected) Ugandan populations. Study limitations included relatively small size and lack of randomisation to maternal ART regimens. CONCLUSIONS: Overall 1-year infant mortality of 5% was similar to the 2%-4% post-neonatal mortality observed in this region. No increase in congenital, renal, or growth abnormalities was observed with in utero tenofovir exposure. Although some infants died untested, absence of recorded HIV infection with combination ART in pregnancy is encouraging. Detailed safety of tenofovir for pre-exposure prophylaxis will need confirmation from longer term follow-up of larger numbers of exposed children. TRIAL REGISTRATION: www.controlled-trials.com ISRCTN13968779

Kiwuwa-Muyingo S, Walker AS, Oja H, Levin J, Miiro G, Katabira E, Kityo C, Hakim J, Todd J, DART Trial Team. 2012. The impact of first year adherence to antiretroviral therapy on long-term clinical and immunological outcomes in the DART trial in Uganda and Zimbabwe. Trop Med Int Health, 17 (5), pp. 584-594.

OBJECTIVES: To describe associations between different summaries of adherence in the first year on antiretroviral therapy (ART) and the subsequent risk of mortality, to identify patients at high risk because of early adherence behaviour. METHODS: We previously described an approach where adherence behaviour at successive clinic visits during the first year on ART was seen as a Markov chain (MC), and the individually estimated transition probabilities between 'good', 'poor' and 'non-response' adherence states were used to classify HIV-infected adults in the DART trial into subgroups with similar behaviour. The impact of this classification and classifications based on traditional 'averaged' measures [mean drug possession ratio (DPR) and self-reported adherence] were compared in terms of their impact on longer-term mortality over the 2-5 years on ART using Cox proportional hazards models. RESULTS: Of 2960 participants in follow-up after 1 year on ART, 29% had never missed pills in the last month and 11% had 100% DPR throughout the first year. The poorest adherers by self-reported measures were more likely to have only none/primary education (P < 0.01). Being in the poorest adherence subgroup by MC and DPR was independently associated with increased mortality [HR = 1.57 (95% CI 1.02, 2.42); 1.82 (1.32, 2.51) respectively]. CONCLUSIONS: Classification based on dynamic adherence behaviour is associated with mortality independently of DPR. The classifications could be useful in understanding adherence, targeting focused interventions and improving longer-term adherence to therapy.
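The Markov-chain view of adherence described above amounts to estimating transition probabilities between adherence states from successive clinic visits. A sketch with an invented visit sequence (states abbreviated G = 'good', P = 'poor', N = 'non-response'):

```python
from collections import Counter

# Hypothetical adherence states recorded at successive clinic visits
# for one participant; real analyses estimate these per individual.
visits = ["G", "G", "P", "G", "G", "G", "P", "P", "G"]
STATES = ["G", "P", "N"]

counts = Counter(zip(visits, visits[1:]))       # observed state-to-state moves
totals = Counter(visits[:-1])                   # times each state was departed

# Row-stochastic transition matrix: P(next state | current state).
transition = {
    s: {t: counts[(s, t)] / totals[s] if totals[s] else 0.0 for t in STATES}
    for s in STATES
}
print(transition["G"])   # chance of staying 'good' vs slipping to 'poor'
```

Individually estimated matrices like this can then be clustered into behavioural subgroups, which is the classification the study relates to longer-term mortality.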

Medina Lara A, Kigozi J, Amurwon J, Muchabaiwa L, Nyanzi Wakaholi B, Mujica Mota RE, Walker AS, Kasirye R, Ssali F, Reid A et al. 2012. Cost effectiveness analysis of clinically driven versus routine laboratory monitoring of antiretroviral therapy in Uganda and Zimbabwe. PLoS One, 7 (4), pp. e33672. | Show Abstract | Read more

BACKGROUND: Despite funding constraints for treatment programmes in Africa, the costs and economic consequences of routine laboratory monitoring for efficacy and toxicity of antiretroviral therapy (ART) have rarely been evaluated. METHODS: Cost-effectiveness analysis was conducted in the DART trial (ISRCTN13968779). Adults in Uganda/Zimbabwe starting ART were randomised to clinically-driven monitoring (CDM) or laboratory and clinical monitoring (LCM); individual patient data on healthcare resource utilisation and outcomes were valued with primary economic costs and utilities. Total costs of first/second-line ART, routine 12-weekly CD4 and biochemistry/haematology tests, additional diagnostic investigations, clinic visits, concomitant medications and hospitalisations were considered from the public healthcare sector perspective. A Markov model was used to extrapolate costs and benefits 20 years beyond the trial. RESULTS: 3316 (1660 LCM; 1656 CDM) symptomatic, immunosuppressed ART-naive adults (median (IQR) age 37 (32,42); CD4 86 (31,139) cells/mm(3)) were followed for median 4.9 years. LCM had a mean 0.112 year (41 days) survival benefit at an additional mean cost of $765 [95%CI:685,845], translating into an adjusted incremental cost of $7386 [3277,dominated] per life-year gained and $7793 [4442,39179] per quality-adjusted life-year gained. Routine toxicity tests were prominent cost-drivers and had no benefit. With 12-weekly CD4 monitoring from year 2 on ART, low-cost second-line ART, but without toxicity monitoring, CD4 test costs need to fall below $3.78 to become cost-effective (<3x per-capita GDP, following WHO benchmarks). CD4 monitoring at current costs as undertaken in DART was not cost-effective in the long term. CONCLUSIONS: There is no rationale for routine toxicity monitoring, which did not affect outcomes and was costly. Although beneficial, routine 12-weekly CD4 monitoring of ART is difficult to justify at current test costs in low-income African countries. CD4 monitoring, restricted to the second year on ART onwards, could be cost-effective with lower-cost second-line therapy and the development of a cheaper, ideally point-of-care, CD4 test.
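The headline result above is an incremental cost-effectiveness ratio (ICER): the extra cost of LCM divided by the extra survival it buys, judged against a willingness-to-pay threshold. A minimal sketch of the calculation from the trial's reported means (the published $7386/life-year figure is a covariate-adjusted, model-extrapolated estimate, so the raw ratio below differs slightly; the GDP figures are illustrative):

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: additional cost per unit
    of additional health benefit (here, per life-year gained)."""
    if delta_effect <= 0:
        raise ValueError("no incremental benefit: ICER undefined or dominated")
    return delta_cost / delta_effect

def cost_effective(ratio, gdp_per_capita):
    """WHO-style benchmark used in the paper: an intervention is
    cost-effective if the ICER falls below 3x per-capita GDP."""
    return ratio < 3 * gdp_per_capita

# LCM vs CDM: mean extra cost $765 for a mean 0.112 life-years gained
ratio = icer(765, 0.112)  # ~$6830 per life-year gained, unadjusted
```

Against a hypothetical per-capita GDP of $2500 the unadjusted ratio would pass the 3x benchmark, but against $2000 it would not, illustrating why the conclusion is sensitive to local income levels and test costs.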

Schlackow I, Stoesser N, Walker AS, Crook DW, Peto TEA, Wyllie DH, Infections in Oxfordshire Research Database Team. 2012. Increasing incidence of Escherichia coli bacteraemia is driven by an increase in antibiotic-resistant isolates: electronic database study in Oxfordshire 1999-2011. J Antimicrob Chemother, 67 (6), pp. 1514-1524. | Show Abstract | Read more

OBJECTIVES: To investigate trends in Escherichia coli resistance, bacteraemia rates and post-bacteraemia outcomes over time. METHODS: Trends in E. coli bacteraemia incidence were monitored from January 1999 to June 2011 using an infection surveillance database including microbiological, clinical risk factor, infection severity and outcome data in Oxfordshire, UK, with imported temperature/rainfall data. RESULTS: A total of 2240 E. coli (from 2080 patients) were studied, of which 1728 (77%) were susceptible to co-amoxiclav, cefotaxime, ciprofloxacin and gentamicin. E. coli bacteraemia incidence increased from 3.4/10,000 bedstays in 1999 to 5.7/10,000 bedstays in 2011. The increase was fastest around 2006, and was essentially confined to organisms resistant to ciprofloxacin, co-amoxiclav, cefotaxime and/or aminoglycosides. Resistant E. coli isolation rates increased similarly in those with and without recent hospital contact. The sharp increase also occurred in urinary isolates, with similar timing. In addition to these long-term trends, increases in ambient temperature, but not rainfall, were associated with increased E. coli bacteraemia rates. It is unclear whether resistant E. coli bacteraemia rates are currently still increasing [incidence rate ratio = 1.07 per annum (95% CI = 0.99-1.16), P = 0.07], whereas current susceptible E. coli bacteraemia rates are not changing significantly [incidence rate ratio = 1.01 (95% CI = 0.99-1.02)]. However, neither mortality nor biomarkers associated with mortality (blood creatinine, urea/albumin concentrations, neutrophil counts) changed during the study. CONCLUSIONS: E. coli bacteraemia rates have risen due to rising rates of resistant organisms; little change occurred in susceptible E. coli. Although the severity of resistant infections, and their outcome, appear similar to susceptible E. coli in the setting studied, the increasing burden of highly resistant organisms is alarming and merits on-going surveillance.

Muro EP, Fillekes Q, Kisanga ER, L'homme R, Aitken SC, Mariki G, Van der Ven AJAM, Dolmans W, Schuurman R, Walker AS et al. 2012. Intrapartum single-dose carbamazepine reduces nevirapine levels faster and may decrease resistance after a single dose of nevirapine for perinatal HIV prevention. J Acquir Immune Defic Syndr, 59 (3), pp. 266-273. | Show Abstract | Read more

BACKGROUND: World Health Organization guidelines recommend zidovudine + lamivudine for 7 days from labor onset in HIV-infected women receiving single-dose nevirapine (sdNVP) to cover prolonged subtherapeutic nevirapine concentrations. Although effective, this is complicated and does not eliminate resistance; alternative strategies could add benefit. METHODS: Antiretroviral-naive HIV-infected pregnant women aged 18-40 years, with CD4 >200 cells per cubic millimeter, able to regularly attend the antenatal clinics in Moshi, Tanzania, were enrolled 1:1 by alternate allocation to receive 200 mg sdNVP alone or in combination with open-label 400-mg single-dose carbamazepine (sdNVP/CBZ) at delivery (ClinicalTrials.gov NCT00294892). The coprimary outcomes were nevirapine plasma concentrations 1 week and nevirapine resistance mutations 6 weeks postpartum. Analyses were based on those still eligible at delivery. RESULTS: Ninety-seven women were assigned to sdNVP and 95 to sdNVP/CBZ during pregnancy, of whom 75 sdNVP and 83 sdNVP/CBZ were still eligible at delivery at study sites. The median (interquartile range) nevirapine plasma concentration was 1.55 (0.88-1.84) mg/L in sdNVP (n = 61) and 1.40 (0.93-1.97) mg/L in sdNVP/CBZ (n = 72) at delivery (P = 0.91), but 1 week later was significantly lower in sdNVP/CBZ [n = 63; 0.09 (0.05-0.20) mg/L] than in sdNVP [n = 52; 0.20 (0.09-0.31) mg/L; rank-sum: P = 0.004] (geometric mean ratio: 0.64, 95% confidence interval: 0.43 to 0.96; P = 0.03). Six weeks postpartum, nevirapine mutations were observed in 11 of 52 (21%) in sdNVP and 6 of 55 (11%) in sdNVP/CBZ (odds ratio = 0.46, 95% confidence interval: 0.16 to 1.34; P = 0.15). CONCLUSIONS: Addition of single-dose carbamazepine to sdNVP at labor onset in HIV-infected, pregnant women did not affect nevirapine plasma concentration at delivery, but significantly reduced it 1 week postpartum, with a trend toward fewer nevirapine resistance mutations.

Lewis J, Walker AS, Castro H, De Rossi A, Gibb DM, Giaquinto C, Klein N, Callard R. 2012. Age and CD4 count at initiation of antiretroviral therapy in HIV-infected children: effects on long-term T-cell reconstitution. J Infect Dis, 205 (4), pp. 548-556. | Show Abstract | Read more

BACKGROUND: Effective therapies and reduced AIDS-related morbidity and mortality have shifted the focus in pediatric human immunodeficiency virus (HIV) from minimizing short-term disease progression to maintaining optimal long-term health. We describe the effects of children's age and pre-antiretroviral therapy (ART) CD4 count on long-term CD4 T-cell reconstitution. METHODS: CD4 counts in perinatally HIV-infected, therapy-naive children in the Paediatric European Network for the Treatment of AIDS 5 trial were monitored following initiation of ART for a median 5.7 years. In a substudy, naive and memory CD4 counts were recorded. Age-standardized measurements were analyzed using monophasic, asymptotic nonlinear mixed-effects models. RESULTS: One hundred twenty-seven children were studied. Older children had lower age-adjusted CD4 counts in the long term and at treatment initiation (P < .001). At all ages, lower counts before treatment were associated with impaired recovery (P < .001). Age-adjusted naive CD4 counts increased on a timescale comparable to overall CD4 T-cell reconstitution, whereas age-adjusted memory CD4 counts increased less, albeit on a faster timescale. CONCLUSIONS: It appears the immature immune system can recover well from HIV infection via the naive pool. However, this potential is progressively damaged with age and/or duration of infection. Current guidelines may therefore not optimize long-term immunological health.

Kasirye P, Kendall L, Adkison KK, Tumusiime C, Ssenyonga M, Bakeera-Kitaka S, Nahirya-Ntege P, Mhute T, Kekitiinwa A, Snowden W et al. 2012. Pharmacokinetics of antiretroviral drug varies with formulation in the target population of children with HIV-1. Clin Pharmacol Ther, 91 (2), pp. 272-280. | Show Abstract | Read more

The bioequivalence of formulations is usually evaluated in healthy adult volunteers. In our study in 19 HIV-1-infected Ugandan children (1.8-4 years of age, weight 12 to <15 kg) receiving zidovudine, lamivudine, and abacavir solutions twice a day for ≥24 weeks, the use of scored tablets allowed comparison of plasma pharmacokinetics of oral solutions vs. tablets. Samples were collected 0, 1, 2, 4, 6, 8, and 12 h after each child's last morning dose of oral solution before changing to scored tablets of Combivir (coformulated zidovudine + lamivudine) and abacavir; this was repeated 4 weeks later. Dose-normalized area under curve (AUC)(0-12) and peak concentration (C(max)) for the tablet formulation were bioequivalent with those of the oral solution with respect to zidovudine and abacavir (e.g., dose-normalized geometric mean ratio (dnGMR) (tablet:solution) for zidovudine and abacavir AUC(0-12) were 1.01 (90% confidence interval (CI) 0.87-1.18) and 0.96 (0.83-1.12), respectively). However, lamivudine exposure was ~55% higher with the tablet formulation (AUC(0-12) dnGMR = 1.58 (1.37-1.81), C(max) dnGMR = 1.55 (1.33-1.81)). Although the clinical relevance of this finding is unclear, it highlights the impact of the formulation and the importance of conducting bioequivalence studies in target pediatric populations.
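Bioequivalence conclusions of this kind are drawn from the geometric mean ratio (GMR) of log-transformed PK parameters with a 90% CI, conventionally required to fall entirely within 0.80-1.25. A sketch with made-up paired AUC values (normal approximation for the CI, where a t quantile would be used for samples this small; the study's dose-normalisation step is omitted):

```python
import math
import statistics

def gmr_90ci(test, reference, z=1.645):
    """Geometric mean ratio (test/reference) from paired PK values,
    with an approximate 90% CI computed on the log scale."""
    logs = [math.log(t / r) for t, r in zip(test, reference)]
    m = statistics.mean(logs)
    se = statistics.stdev(logs) / math.sqrt(len(logs))
    return math.exp(m), math.exp(m - z * se), math.exp(m + z * se)

def bioequivalent(lo, hi, limits=(0.80, 1.25)):
    """Average-bioequivalence criterion: the whole 90% CI must lie
    within the conventional 0.80-1.25 limits."""
    return limits[0] <= lo and hi <= limits[1]

# Illustrative paired AUCs (tablet vs solution) for six children
tablet   = [52.1, 48.7, 61.0, 45.2, 55.9, 50.3]
solution = [50.0, 49.5, 58.8, 47.1, 54.0, 51.2]
g, lo, hi = gmr_90ci(tablet, solution)
```

On this logic, the study's lamivudine GMR of 1.58 (1.37-1.81) fails the 0.80-1.25 criterion even though zidovudine and abacavir pass, which is exactly the pattern the abstract reports.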

Walker AS, Eyre DW, Wyllie DH, Dingle KE, Harding RM, O'Connor L, Griffiths D, Vaughan A, Finney J, Wilcox MH et al. 2012. Characterisation of Clostridium difficile hospital ward-based transmission using extensive epidemiological data and molecular typing. PLoS Med, 9 (2), pp. e1001172. | Show Abstract | Read more

BACKGROUND: Clostridium difficile infection (CDI) is a leading cause of antibiotic-associated diarrhoea and is endemic in hospitals, hindering the identification of sources and routes of transmission based on shared time and space alone. This may compromise rational control despite costly prevention strategies. This study aimed to investigate ward-based transmission of C. difficile, by subdividing outbreaks into distinct lineages defined by multi-locus sequence typing (MLST). METHODS AND FINDINGS: All C. difficile toxin enzyme-immunoassay-positive and culture-positive samples over 2.5 y from a geographically defined population of ~600,000 persons underwent MLST. Sequence types (STs) were combined with admission and ward movement data from an integrated comprehensive healthcare system incorporating three hospitals (1,700 beds) providing all acute care for the defined geographical population. Networks of cases and potential transmission events were constructed for each ST. Potential infection sources for each case and transmission timescales were defined by prior ward-based contact with other cases sharing the same ST. From 1 September 2007 to 31 March 2010, there were means of 102 tests and 9.4 CDIs per 10,000 overnight stays in inpatients, and 238 tests and 15.7 CDIs per month in outpatients/primary care. In total, 1,276 C. difficile isolates of 69 STs were studied. From MLST, no more than 25% of cases could be linked to a potential ward-based inpatient source, ranging from 37% in renal/transplant, 29% in haematology/oncology, and 28% in acute/elderly medicine to 6% in specialist surgery. Most of the putative transmissions identified occurred shortly (≤ 1 wk) after the onset of symptoms (141/218, 65%), with few >8 wk (21/218, 10%). Most incubation periods were ≤ 4 wk (132/218, 61%), with few >12 wk (28/218, 13%). 
Allowing for persistent ward contamination after the discharge of a CDI case did not increase the proportion of linked cases beyond that expected from random contact, assessed using matched controls. CONCLUSIONS: In an endemic setting with well-implemented infection control measures, ward-based contact with symptomatic enzyme-immunoassay-positive patients cannot account for most new CDI cases.

Carswell C, Rañopa M, Pal S, Macfarlane R, Siddique D, Thomas D, Webb T, Wroe S, Walker S, Darbyshire J et al. 2012. Video Rating in Neurodegenerative Disease Clinical Trials: The Experience of PRION-1. Dement Geriatr Cogn Dis Extra, 2 (1), pp. 286-297. | Show Abstract | Read more

BACKGROUND/AIMS: Large clinical trials including patients with uncommon diseases involve assessors in different geographical locations, resulting in considerable inter-rater variability in assessment scores. As video recordings of examinations, which can be individually rated, may eliminate such variability, we measured the agreement between a single video rater and multiple examining physicians in the context of PRION-1, a clinical trial of the antimalarial drug quinacrine in human prion diseases. METHODS: We analysed a 43-component neurocognitive assessment battery, on 101 patients with Creutzfeldt-Jakob disease, focusing on the correlation and agreement between examining physicians and a single video rater. RESULTS: In total, 335 videos of examinations of 101 patients who were video-recorded over the 4-year trial period were assessed. For neurocognitive examination, inter-observer concordance was generally excellent. Highly visual neurological examination domains (e.g. finger-nose-finger assessment of ataxia) had good inter-rater correlation, whereas those dependent on non-visual clues (e.g. power or reflexes) correlated poorly. Some non-visual neurological domains were surprisingly concordant, such as limb muscle tone. CONCLUSION: Cognitive assessments and selected neurological domains can be practically and accurately recorded in a clinical trial using video rating. Video recording of examinations is a valuable addition to any trial provided appropriate selection of assessment instruments is used and rigorous training of assessors is undertaken.

Fillekes Q, Natukunda E, Balungi J, Kendall L, Bwakura-Dangarembizi M, Keishanyu R, Ferrier A, Lutakome J, Gibb DM, Burger DM et al. 2011. Pediatric underdosing of efavirenz: a pharmacokinetic study in Uganda. J Acquir Immune Defic Syndr, 58 (4), pp. 392-398. | Show Abstract | Read more

OBJECTIVES: To evaluate international pediatric efavirenz dosing recommendations using full pharmacokinetic (PK) information. DESIGN: Open-label, multicenter, PK study. METHODS: Forty-one HIV-infected Ugandan children (3-12 years) on efavirenz + lamivudine + abacavir were enrolled in a study of twice-daily to once-daily lamivudine + abacavir 36 weeks after antiretroviral therapy initiation in the ARROW trial. Once-daily efavirenz doses were 200, 250, 300, 350 mg for children weighing 10 to <15, 15 to <20, 20 to <25, 25 to <30 kg, respectively, using 200/50 mg capsules or halved 600 mg tablets for the 300 and 350 mg doses. Intensive plasma PK sampling (t = 0, 1, 2, 4, 6, 8, 12 hours postobserved ingestion) was performed at steady state (PK1) and repeated 4 weeks later (PK2, including a further 24-hour sample). RESULTS: Forty-one and 39 children had evaluable efavirenz profiles at PK1 and PK2, respectively. Seventeen (41%) were boys. Five, 16, 17 and 3 children were in the 10 to <15, 15 to <20, 20 to <25 and 25 to <30 kg weight bands, respectively. The geometric mean (%CV) of the area under the concentration-time curve 0-24 hours postdose was 50.8 (90.8%) and 55.5 (82.7%) h·mg·L(-1) at PK1 and PK2, respectively. Six children at PK1 and 7 at PK2 had subtherapeutic C(8h) and/or C(12h) (<1.0 mg/L), 7 of 41 (17%) at either visit. At PK2, 15 of 39 (38%) children had C(24h) <1.0 mg/L (median (interquartile range) [range] 1.1 (0.7-2.9) [0.3-18.4]). Ten children at PK1 and 11 at PK2 had C(8h) and/or C(12h) >4.0 mg/L; 12 of 41 (29%) at either visit. CONCLUSIONS: African children aged 3-12 years, on efavirenz dosed according to 2006 WHO/manufacturer's recommendations, had lower and highly variable efavirenz PK parameters compared with adult data from the manufacturer's leaflet. There were no differences across weight bands, suggesting no major effect of using half tablets. 
Higher pediatric efavirenz doses, as per WHO 2010 recommendations, should be used and investigated further but may risk increasing the proportion of children with potentially toxic levels.

Mercer RM, Olarinde O, Ryan C, Greig J, Yeeles H, Walker A, Walker H, Meardon NC. 2011. Initiation of antiretroviral therapy in patients with a CD4 count of less than 350 cells: a retrospective audit against key indicators from the CQUIN payment framework. Int J STD AIDS, 22 (12), pp. 755-756. | Show Abstract | Read more

A proportion of funding for the South Yorkshire HIV Network is dependent on meeting the targets of the Commissioning for Quality and Innovation (CQUIN) payment framework. This states that 85% of patients with a CD4 count below 350 should be on antiretroviral therapy (ART). We also audited how many patients we started on treatment within six weeks. We found 88% of the 243 patients were on ART at the end of the audit, but significantly fewer had been started on treatment within six weeks of their CD4 count falling below 350. Although the target was achieved, there were patients who should have been excluded, as indicated by other clinical guidelines, for example patients on treatment for tuberculosis. If these patients were excluded and the threshold level increased, it would help emphasize the at-risk patient group and lead to a fairer allocation of funding.

Eyre DW, Walker AS, Griffiths D, Wilcox MH, Wyllie DH, Dingle KE, Crook DW, Peto TEA. 2012. Clostridium difficile mixed infection and reinfection. J Clin Microbiol, 50 (1), pp. 142-144. | Show Abstract | Read more

Isolates from consecutive Clostridium difficile infection (CDI) fecal samples underwent multilocus sequence typing. Potential reinfections with different genotypes were identified in 88/560 (16%) sample pairs taken 1 to 1,414 days (median, 24; interquartile range [IQR], 1 to 52 days) apart; odds of reinfection increased by 58% for every doubling of time between samples. Of 109 sample pairs taken on the same day, 3 (3%) had different genotypes. Considering samples 0 to 7 days apart as the same CDI, 7% of cases had mixed infections with >1 genotype.
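The "58% increase in odds per doubling of time" phrasing above corresponds to a logistic-regression coefficient fitted on a log timescale. A sketch of the conversion (the coefficient below is back-calculated from the reported odds ratio for illustration, not taken from the paper):

```python
import math

def or_per_doubling(beta_ln):
    """Convert a logistic-regression coefficient fitted on ln(days
    between samples) into an odds ratio per doubling of the interval."""
    return math.exp(beta_ln * math.log(2))

# A coefficient of ~0.660 on ln(days) reproduces the reported ~1.58
# odds ratio, i.e. a ~58% increase in reinfection odds per doubling
ratio = or_per_doubling(0.660)
```

Equivalently, fitting the model with log2(days) as the covariate would make exp(beta) the per-doubling odds ratio directly; the two parameterisations describe the same fit.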

Stöhr W, Reid A, Walker AS, Ssali F, Munderi P, Mambule I, Kityo C, Grosskurth H, Gilks CF, Gibb DM et al. 2011. Glomerular dysfunction and associated risk factors over 4-5 years following antiretroviral therapy initiation in Africa. Antivir Ther, 16 (7), pp. 1011-1020. | Show Abstract | Read more

BACKGROUND: The aim of this study was to investigate long-term renal function in HIV-infected adults initiating antiretroviral therapy (ART) with a CD4(+) T-cell count < 200 cells/mm³ in Africa. METHODS: This was an observational analysis within the DART trial randomizing 3,316 adults to routine laboratory and clinical monitoring (LCM) or clinically driven monitoring (CDM). Serum creatinine was measured pre-ART (all ≤ 360 μmol/l), at weeks 4 and 12, then every 12 weeks for 4-5 years; estimated glomerular filtration rate (eGFR) was determined using the Cockcroft-Gault formula. We analysed eGFR changes, and cumulative incidences of eGFR <30 ml/min/1.73 m² and chronic kidney disease (CKD; <60 ml/min/1.73 m², or 25% decrease if <60 ml/min/1.73 m² pre-ART; confirmed >3 months). RESULTS: At ART initiation, median CD4(+) T-cell count was 86 cells/mm³; 1,492 (45%) participants had mild (60-<90 ml/min/1.73 m²), 237 (7%) moderate (30-<60 ml/min/1.73 m²) and 7 (0.2%) severe (15-<30 ml/min/1.73 m²) decreases in eGFR. First-line ART was zidovudine/lamivudine plus tenofovir (74%), abacavir (9%) or nevirapine (17%). By 4 years, cumulative incidence of eGFR <30 ml/min/1.73 m² was 2.8% (n=90) and CKD was 5.0% (n=162). Adjusted eGFR increases to 4 years were 1, 9 and 6 ml/min/1.73 m² with tenofovir, abacavir and nevirapine, respectively (P<0.001), and 4 and 2 ml/min/1.73 m² for LCM and CDM, respectively (P=0.005; 2 and 3 ml/min/1.73 m² to 5 years; P=0.81). CONCLUSIONS: On all regimens and monitoring strategies, severe eGFR impairment was infrequent; differences in eGFR changes were small, suggesting that first-line ART, including tenofovir, can be given safely without routine renal function monitoring.
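The eGFR values in this study come from the Cockcroft-Gault formula. A sketch of the standard formula with the unit conversion the abstract's μmol/l creatinine values would need (the trial's exact implementation, including any body-surface-area normalisation to 1.73 m², is not given in the abstract):

```python
def cockcroft_gault(age_years, weight_kg, creatinine_umol_l, female):
    """Estimated creatinine clearance (ml/min) by Cockcroft-Gault:
    (140 - age) x weight / (72 x SCr[mg/dl]), times 0.85 for women.
    Serum creatinine is converted from umol/l to mg/dl (divide by 88.4)."""
    scr_mg_dl = creatinine_umol_l / 88.4
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

# e.g. a 37-year-old woman (the trial's median age), 60 kg,
# serum creatinine 80 umol/l -- illustrative values
egfr = cockcroft_gault(37, 60, 80, female=True)
```

Note the formula's direct dependence on weight and age, one reason eGFR changes on ART partly track weight gain after treatment initiation.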

Mead S, Ranopa M, Gopalakrishnan GS, Thompson AGB, Rudge P, Wroe S, Kennedy A, Hudson F, MacKay A, Darbyshire JH et al. 2011. PRION-1 scales analysis supports use of functional outcome measures in prion disease. Neurology, 77 (18), pp. 1674-1683. | Show Abstract | Read more

OBJECTIVES: Human prion diseases are heterogeneous but invariably fatal neurodegenerative disorders with no known effective therapy. PRION-1, the largest clinical trial in prion disease to date, showed no effect of the potential therapeutic quinacrine on survival. Although there are several limitations to the usefulness of survival as an outcome measure, there have been no comprehensive studies of alternatives. METHODS: To address this we did comparative analyses of neurocognitive, psychiatric, global, clinician-rated, and functional scales, focusing on validity, variability, and impact on statistical power over 77 person-years follow-up in 101 symptomatic patients in PRION-1. RESULTS: Quinacrine had no demonstrable benefit on any of the 8 scales (p > 0.4). All scales had substantial numbers of patients with the worst possible score at enrollment (Glasgow Coma Scale score being least affected) and were impacted by missing data due to disease progression. These effects were more significant for cognitive/psychiatric scales than global, clinician-rated, or functional scales. The Barthel and Clinical Dementia Rating scales were the most valid and powerful in simulated clinical trials of an effective therapeutic. A combination of selected subcomponents from these 2 scales gave somewhat increased power, compared to use of survival, to detect clinically relevant effects in future clinical trials of feasible size. CONCLUSIONS: Our findings have implications for the choice of primary outcome measure in prion disease clinical trials. Prion disease presents the unusual opportunity to follow patients with a neurodegenerative disease through their entire clinical course, and this provides insights relevant to designing outcome measures in related conditions.

Stoesser N, Crook DW, Fung R, Griffiths D, Harding RM, Kachrimanidou M, Keshav S, Peto TE, Vaughan A, Walker AS, Dingle KE. 2011. Molecular epidemiology of Clostridium difficile strains in children compared with that of strains circulating in adults with Clostridium difficile-associated infection. J Clin Microbiol, 49 (11), pp. 3994-3996. | Show Abstract | Read more

Molecular analysis of Clostridium difficile (28 isolates) from children (n = 128) in Oxfordshire, United Kingdom, identified eight toxigenic genotypes. Six of these were isolated from 27% of concurrent adult C. difficile-associated infections studied (n = 83). No children carried hypervirulent PCR ribotype 027. Children could participate in the transmission of some adult disease-causing genotypes.

Wyllie DH, Walker AS, Miller R, Moore C, Williamson SR, Schlackow I, Finney JM, O'Connor L, Peto TEA, Crook DW. 2011. Decline of meticillin-resistant Staphylococcus aureus in Oxfordshire hospitals is strain-specific and preceded infection-control intensification. BMJ Open, 1 (1), pp. e000160. | Show Abstract | Read more

Background In the past, strains of Staphylococcus aureus have evolved, expanded, made a marked clinical impact and then disappeared over several years. Faced with rising meticillin-resistant S aureus (MRSA) rates, UK government-supported infection control interventions were rolled out in Oxford Radcliffe Hospitals NHS Trust from 2006 onwards. Methods Using an electronic database, the authors identified isolation of MRSA among 611 434 hospital inpatients admitted to acute hospitals in Oxford, UK, 1 April 1998 to 30 June 2010. Isolation rates were modelled using segmented negative binomial regression for three groups of isolates: from blood cultures, from samples suggesting invasion (eg, cerebrospinal fluid, joint fluid, pus samples) and from surface swabs (eg, from wounds). Findings MRSA isolation rates rose rapidly from 1998 to the end of 2003 (annual increase from blood cultures 23%, 95% CI 16% to 30%), and then declined. The decline accelerated from mid-2006 onwards (annual decrease post-2006 38% from blood cultures, 95% CI 29% to 45%, p=0.003 vs previous decline). Rates of meticillin-sensitive S aureus changed little by comparison, with no evidence for declines 2006 onward (p=0.40); by 2010, sensitive S aureus was far more common than MRSA (blood cultures: 2.9 vs 0.25; invasive samples 14.7 vs 2.0 per 10 000 bedstays). Interestingly, trends in isolation of erythromycin-sensitive and erythromycin-resistant MRSA differed. Erythromycin-sensitive strains rose significantly faster (eg, from blood cultures p=0.002), and declined significantly more slowly (p=0.002), than erythromycin-resistant strains (global p<0.0001). Bacterial typing suggests this reflects differential spread of two major UK MRSA strains (ST22/36), ST36 having declined markedly 2006-2010, with ST22 becoming the dominant MRSA strain. Conclusions MRSA isolation rates were falling before the recent intensification of infection-control measures. 
This, together with strain-specific changes in MRSA isolation, strongly suggests that incompletely understood biological factors are responsible for much of the recent variation in MRSA isolation. A major, mainly meticillin-sensitive, S aureus burden remains.

Dingle KE, Griffiths D, Didelot X, Evans J, Vaughan A, Kachrimanidou M, Stoesser N, Jolley KA, Golubchik T, Harding RM et al. 2011. Clinical Clostridium difficile: clonality and pathogenicity locus diversity. PLoS One, 6 (5), pp. e19993. | Show Abstract | Read more

Clostridium difficile infection (CDI) is an important cause of mortality and morbidity in healthcare settings. The major virulence determinants are large clostridial toxins, toxin A (tcdA) and toxin B (tcdB), encoded within the pathogenicity locus (PaLoc). Isolates vary in pathogenicity from hypervirulent PCR-ribotypes 027 and 078 with high mortality, to benign non-toxigenic strains carried asymptomatically. The relative pathogenicity of most toxigenic genotypes is still unclear, but may be influenced by PaLoc genetic variant. This is the largest study of C. difficile molecular epidemiology performed to date, in which a representative collection of recent isolates (n = 1290) from patients with CDI in Oxfordshire, UK, was genotyped by multilocus sequence typing. The population structure was described using NeighborNet and ClonalFrame. Sequence variation within toxin B (tcdB) and its negative regulator (tcdC), was mapped onto the population structure. The 69 Sequence Types (ST) showed evidence for homologous recombination with an effect on genetic diversification four times lower than mutation. Five previously recognised genetic groups or clades persisted, designated 1 to 5, each having a strikingly congruent association with tcdB and tcdC variants. Hypervirulent ST-11 (078) was the only member of clade 5, which was divergent from the other four clades within the MLST loci. However, it was closely related to the other clades within the tcdB and tcdC loci. ST-11 (078) may represent a divergent formerly non-toxigenic strain that acquired the PaLoc (at least) by genetic recombination. This study focused on human clinical isolates collected from a single geographic location, to achieve a uniquely high density of sampling. It sets a baseline of MLST data for future comparative studies investigating genotype virulence potential (using clinical severity data for these isolates), possible reservoirs of human CDI, and the evolutionary origins of hypervirulent strains.

Mbisa JL, Gupta RK, Kabamba D, Mulenga V, Kalumbi M, Chintu C, Parry CM, Gibb DM, Walker SA, Cane PA, Pillay D. 2011. The evolution of HIV-1 reverse transcriptase in route to acquisition of Q151M multi-drug resistance is complex and involves mutations in multiple domains. Retrovirology, 8 (1), pp. 31. | Show Abstract | Read more

BACKGROUND: The Q151M multi-drug resistance (MDR) pathway in HIV-1 reverse transcriptase (RT) confers reduced susceptibility to all nucleoside reverse transcriptase inhibitors (NRTIs) excluding tenofovir (TDF). This pathway emerges after long term failure of therapy, and is increasingly observed in the resource poor world, where antiretroviral therapy is rarely accompanied by intensive virological monitoring. In this study we examined the genotypic, phenotypic and fitness correlates associated with the development of Q151M MDR in the absence of viral load monitoring. RESULTS: Single-genome sequencing (SGS) of full-length RT was carried out on sequential samples from an HIV-infected individual enrolled in ART rollout. The emergence of Q151M MDR occurred in the order A62V, V75I, and finally Q151M on the same genome at 4, 17 and 37 months after initiation of therapy, respectively. This was accompanied by a parallel cumulative acquisition of mutations at 20 other codon positions; seven of which were located in the connection subdomain. We established that fourteen of these mutations are also observed in Q151M-containing sequences submitted to the Stanford University HIV database. Phenotypic drug susceptibility testing demonstrated that the Q151M-containing RT had reduced susceptibility to all NRTIs except for TDF. RT domain-swapping of patient and wild-type RTs showed that patient-derived connection subdomains were not associated with reduced NRTI susceptibility. However, the virus expressing patient-derived Q151M RT at 37 months demonstrated ~44% replicative capacity of that at 4 months. This was further reduced to ~22% when the Q151M-containing DNA pol domain was expressed with wild-type C-terminal domain, but was then fully compensated by coexpression of the coevolved connection subdomain. 
CONCLUSIONS: We demonstrate a complex interplay between drug susceptibility and replicative fitness in the acquisition of Q151M MDR, with serious implications for second-line regimen options. The acquisition of the Q151M pathway occurred sequentially over a long period of failing NRTI therapy and was associated with mutations in multiple RT domains.

Haberer JE, Cook A, Walker AS, Ngambi M, Ferrier A, Mulenga V, Kityo C, Thomason M, Kabamba D, Chintu C et al. 2011. Excellent adherence to antiretrovirals in HIV+ Zambian children is compromised by disrupted routine, HIV nondisclosure, and paradoxical income effects. PLoS One, 6 (4), pp. e18505. | Show Abstract | Read more

INTRODUCTION: A better understanding of pediatric antiretroviral therapy (ART) adherence in sub-Saharan Africa is necessary to develop interventions to sustain high levels of adherence. METHODOLOGY/PRINCIPAL FINDINGS: Adherence among 96 HIV-infected Zambian children (median age 6, interquartile range [IQR] 2,9) initiating fixed-dose combination ART was measured prospectively (median 23 months; IQR 20,26) with caregiver report, clinic and unannounced home-based pill counts, and medication event monitoring systems (MEMS). HIV-1 RNA was determined at 48 weeks. Child and caregiver characteristics, socio-demographic status, and treatment-related factors were assessed as predictors of adherence. Median adherence was 97.4% (IQR 96.1,98.4%) by visual analog scale, 94.8% (IQR 86,100%) by caregiver-reported last missed dose, 96.9% (IQR 94.5,98.2%) by clinic pill count, 93.4% (IQR 90.2,96.7%) by unannounced home-based pill count, and 94.8% (IQR 87.8,97.7%) by MEMS. At 48 weeks, 72.6% of children had HIV-1 RNA <50 copies/ml. Agreement among adherence measures was poor; only MEMS was significantly associated with viral suppression (p = 0.013). Predictors of poor adherence included changing residence, school attendance, lack of HIV disclosure to children aged nine to 15 years, and increasing household income. CONCLUSIONS/SIGNIFICANCE: Adherence among children taking fixed-dose combination ART in sub-Saharan Africa is high and sustained over two years. However, certain groups are at risk for treatment failure, including children with disrupted routines, no knowledge of their HIV diagnosis among older children, and relatively high household income, possibly reflecting greater social support in the setting of greater poverty.

PENPACT-1 (PENTA 9/PACTG 390) Study Team, Babiker A, Castro nee Green H, Compagnucci A, Fiscus S, Giaquinto C, Gibb DM, Harper L, Harrison L, Hughes M et al. 2011. First-line antiretroviral therapy with a protease inhibitor versus non-nucleoside reverse transcriptase inhibitor and switch at higher versus low viral load in HIV-infected children: an open-label, randomised phase 2/3 trial. Lancet Infect Dis, 11 (4), pp. 273-283.

BACKGROUND: Children with HIV will be on antiretroviral therapy (ART) longer than adults, and therefore the durability of first-line ART and timing of switch to second-line are key questions. We assess the long-term outcome of protease inhibitor and non-nucleoside reverse transcriptase inhibitor (NNRTI) first-line ART and viral load switch criteria in children. METHODS: In a randomised open-label factorial trial, we compared effectiveness of two nucleoside reverse transcriptase inhibitors (NRTIs) plus a protease inhibitor versus two NRTIs plus an NNRTI and of switch to second-line ART at a viral load of 1000 copies per mL versus 30,000 copies per mL in previously untreated children infected with HIV from Europe and North and South America. Random assignment was by computer-generated sequentially numbered lists stratified by age, region, and by exposure to perinatal ART. Primary outcome was change in viral load between baseline and 4 years. Analysis was by intention to treat, which we defined as all patients that started treatment. This study is registered with ISRCTN, number ISRCTN73318385. FINDINGS: Between Sept 25, 2002, and Sept 7, 2005, 266 children (median age 6.5 years; IQR 2.8-12.9) were randomly assigned treatment regimens: 66 to receive protease inhibitor and switch to second-line at 1000 copies per mL (PI-low), 65 protease inhibitor and switch at 30,000 copies per mL (PI-higher), 68 NNRTI and switch at 1000 copies per mL (NNRTI-low), and 67 NNRTI and switch at 30,000 copies per mL (NNRTI-higher). Median follow-up was 5.0 years (IQR 4.2-6.0) and 188 (71%) children were on first-line ART at trial end. 
At 4 years, mean reductions in viral load were -3.16 log(10) copies per mL for protease inhibitors versus -3.31 log(10) copies per mL for NNRTIs (difference -0.15 log(10) copies per mL, 95% CI -0.41 to 0.11; p=0.26), and -3.26 log(10) copies per mL for switching at the low versus -3.20 log(10) copies per mL for switching at the higher threshold (difference 0.06 log(10) copies per mL, 95% CI -0.20 to 0.32; p=0.56). Protease inhibitor resistance was uncommon and there was no increase in NRTI resistance in the PI-higher compared with the PI-low group. NNRTI resistance was selected early, and about 10% more children accumulated NRTI mutations in the NNRTI-higher than the NNRTI-low group. Nine children had new CDC stage-C events and 60 had grade 3/4 adverse events; both were balanced across randomised groups. INTERPRETATION: Good long-term outcomes were achieved with all treatment strategies. Delayed switching of protease-inhibitor-based ART might be reasonable where future drug options are limited, because the risk of selecting for NRTI and protease-inhibitor resistance is low. FUNDING: Paediatric European Network for Treatment of AIDS (PENTA) and Pediatric AIDS Clinical Trials Group (PACTG/IMPAACT).

Prendergast A, Walker AS, Mulenga V, Chintu C, Gibb DM. 2011. Improved growth and anemia in HIV-infected African children taking cotrimoxazole prophylaxis. Clin Infect Dis, 52 (7), pp. 953-956.

The impact of cotrimoxazole (CTX) on growth and/or anemia was investigated in 541 human immunodeficiency virus-infected, antiretroviral therapy-naive Zambian children enrolled in the Children with HIV Antibiotic Prophylaxis trial. Compared with children randomized to receive placebo, children randomized to receive CTX had slower decreases in weight-for-age (P=.04) and height-for-age (P=.01), and greater increase in hemoglobin level (P=.01). These findings argue for expanded early CTX use.

Kiwuwa-Muyingo S, Oja H, Walker SA, Ilmonen P, Levin J, Todd J. 2011. Clustering based on adherence data. Epidemiol Perspect Innov, 8 (1), pp. 3.

Adherence to a medical treatment is the extent to which a patient follows the instructions or recommendations of health professionals. There are direct and indirect ways to measure adherence, which have been used for clinical management and research. Typically, adherence measures are monitored over a long follow-up or treatment period, and some measurements may be missing because of death or other reasons. A natural question, then, is how to describe adherence behavior over the whole period in a simple way. In the literature, measurements over a period are usually combined simply by using averages, such as the percentage of compliant days or the percentage of doses taken. In this paper we adapt an approach in which patient adherence measures are viewed as a stochastic process. Repeated measures are then analyzed as a Markov chain with a finite number of states rather than as independent and identically distributed observations, and the transition probabilities between the states are assumed to fully describe the behavior of a patient. The patients can then be clustered or classified using their estimated transition probabilities. These natural clusters can be used to describe the adherence of the patients, to find predictors of adherence, and to predict future events. The new approach is illustrated, and shown to be useful, with a simple analysis of a data set from the DART (Development of AntiRetroviral Therapy in Africa) trial in Uganda and Zimbabwe.
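Purely as an illustration (none of this code or data is from the paper above), the core idea can be sketched in Python: summarize each patient by the estimated transition probabilities of their adherence sequence, then cluster patients on those probabilities. The adherence sequences are simulated, and the tiny deterministic 2-means routine is a stand-in for whatever clustering method is preferred:

```python
import numpy as np

def transition_matrix(states, n_states=2):
    """Estimate a first-order Markov transition matrix from one patient's
    sequence of adherence states (0 = non-adherent, 1 = adherent),
    with add-one smoothing so every row is a valid distribution."""
    counts = np.ones((n_states, n_states))          # Laplace smoothing
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def two_means(X, iters=10):
    """Tiny deterministic 2-cluster k-means, seeded with the first two
    rows; a stand-in for any preferred clustering algorithm."""
    centers = X[:2].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(2):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return labels

# Simulated daily adherence sequences for four hypothetical patients.
patients = [
    [1, 1, 1, 1, 1, 1, 1, 0, 1, 1],   # mostly adherent
    [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],   # alternating
    [0, 0, 0, 1, 0, 0, 0, 0, 1, 0],   # mostly non-adherent
    [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],   # fully adherent
]

# Each patient becomes a feature vector of flattened transition
# probabilities, which is what gets clustered.
features = np.array([transition_matrix(p).ravel() for p in patients])
labels = two_means(features)
```

With these simulated sequences, the two broadly adherent patients fall into one cluster and the alternating and mostly non-adherent patients into the other, purely on the basis of their transition probabilities.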

Thwaites GE, Edgeworth JD, Gkrania-Klotsas E, Kirby A, Tilley R, Török ME, Walker S, Wertheim HF, Wilson P, Llewelyn MJ, UK Clinical Infection Research Group. 2011. Clinical management of Staphylococcus aureus bacteraemia. Lancet Infect Dis, 11 (3), pp. 208-222.

Staphylococcus aureus bacteraemia is one of the most common serious bacterial infections worldwide. In the UK alone, around 12,500 cases each year are reported, with an associated mortality of about 30%, yet the evidence guiding optimum management is poor. To date, fewer than 1500 patients with S aureus bacteraemia have been recruited to 16 controlled trials of antimicrobial therapy. Consequently, clinical practice is driven by the results of observational studies and anecdote. Here, we propose and review ten unanswered clinical questions commonly posed by those managing S aureus bacteraemia. Our findings define the major areas of uncertainty in the management of S aureus bacteraemia and highlight just two key principles. First, all infective foci must be identified and removed as soon as possible. Second, long-term antimicrobial therapy is required for those with persistent bacteraemia or a deep, irremovable focus. Beyond this, the best drugs, dose, mode of delivery, and duration of therapy are uncertain, a situation compounded by emerging S aureus strains that are resistant to old and new antibiotics. We discuss the consequences on clinical practice, and how these findings define the agenda for future clinical research.

Walker AS, Gibb DM. 2011. Monitoring of highly active antiretroviral therapy in HIV infection. Curr Opin Infect Dis, 24 (1), pp. 27-33.

PURPOSE OF REVIEW: Patients on antiretroviral therapy (ART) in high-income countries have routine laboratory tests to monitor ART efficacy/toxicity. We review studies describing the outcomes and costs of different monitoring approaches, predominantly in low-income countries. RECENT FINDINGS: CD4 cell counts, HIV RNA viral load and clinical events are frequently discordant; viral load suppression occurs with WHO-defined CD4 failure and, as expected, viral load failure often occurs before CD4 failure. Routine CD4 monitoring provides small but significant mortality/morbidity benefits over clinical monitoring, but, at current prices, is not yet cost-effective in many sub-Saharan African countries. Viral load monitoring is less cost-effective with modelling studies reporting variable results. More research into point-of-care tests, methods for targeting monitoring and thresholds for defining failure is needed. Most laboratory monitoring for toxicity is neither effective nor cost-effective. In terms of models for delivery of care, task-shifting with nurse-led and decentralized care appear as effective as doctor-led or centralized care. SUMMARY: Recent studies have improved the evidence base for monitoring on ART. Future research to increase cost-effectiveness by better targeting of monitoring and/or evaluating implementation of less costly point-of-care tests will contribute to long-term success of ART while continuing to increase ART coverage.

Munderi P, Snowden WB, Walker AS, Kityo C, Mosteller M, Kabuye G, Thoofer NK, Ssali F, Gilks CF, Hughes AR, DART Trial Team. 2011. Distribution of HLA-B alleles in a Ugandan HIV-infected adult population: NORA pharmacogenetic substudy of DART. Trop Med Int Health, 16 (2), pp. 200-204.

OBJECTIVES: To determine the frequencies of HLA-B alleles in Ugandan patients in the NORA substudy of the DART trial and to compare HLA-B allele frequencies in those with and without clinically diagnosed hypersensitivity reaction (HSR). METHODS: DNA-based HLA-B genotyping was used to determine HLA alleles in 247 participants who received abacavir, including all six participants ('cases') with clinically diagnosed abacavir HSR. RESULTS: The incidence of clinical abacavir HSR in this double-blinded study was 2.0% (6/300) in the abacavir group. As HLA-B*5701 was absent throughout the entire cohort, including the six HSR 'cases', an association could not be established between HLA-B*5701 and clinically diagnosed abacavir HSR. No other HLA-B*57 alleles were present among the six 'cases'. HLA-B*5703 was the most frequent HLA-B*57 allele among the abacavir-tolerant participants. CONCLUSION: The rate of clinical HSR was low, which may reflect the expected 2-3% clinical false-positive rate seen in previous double-blind randomized studies. The presumption that these cases may be false-positive abacavir HSR is supported by the fact that no HLA-B*5701 alleles were found in the abacavir group. Implementation of prospective HLA-B*5701 screening must be based on benefit/risk considerations within local practice. Clinical risk management remains paramount.

Finney JM, Walker AS, Peto TEA, Wyllie DH. 2011. An efficient record linkage scheme using graphical analysis for identifier error detection. BMC Med Inform Decis Mak, 11 (1), pp. 7.

BACKGROUND: Integration of information on individuals (record linkage) is a key problem in healthcare delivery, epidemiology, and "business intelligence" applications. It is now common to be required to link very large numbers of records, often containing various combinations of theoretically unique identifiers, such as NHS numbers, which are both incomplete and error-prone. METHODS: We describe a two-step record linkage algorithm in which identifiers with high cardinality are identified or generated, and used to perform an initial exact match based linkage. Subsequently, the resulting clusters are studied and, if appropriate, partitioned using a graph based algorithm detecting erroneous identifiers. RESULTS: The system was used to cluster over 250 million health records from five data sources within a large UK hospital group. Linkage, which was completed in about 30 minutes, yielded 3.6 million clusters of which about 99.8% contain, with high likelihood, records from one patient. Although computationally efficient, the algorithm's requirement for exact matching of at least one identifier of each record to another for cluster formation may be a limitation in some databases containing records of low identifier quality. CONCLUSIONS: The technique described offers a simple, fast and highly efficient two-step method for large scale initial linkage for records commonly found in the UK's National Health Service.
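As a sketch of the first (exact-match) step only — with hypothetical records and field names, not the system described in the paper, and omitting the paper's second, graph-based step that partitions clusters containing erroneous identifiers — records sharing any non-missing identifier value can be merged with a union-find structure:

```python
from collections import defaultdict

def link_records(records, id_fields):
    """Step one of a two-step linkage: union any two records that share
    an exact value on any high-cardinality identifier; missing
    identifiers are simply skipped."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:          # path halving for efficiency
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Bucket record indices by (field, value), then union within buckets.
    buckets = defaultdict(list)
    for idx, rec in enumerate(records):
        for field in id_fields:
            value = rec.get(field)
            if value:                  # skip missing identifiers
                buckets[(field, value)].append(idx)
    for members in buckets.values():
        for other in members[1:]:
            union(members[0], other)

    clusters = defaultdict(set)
    for idx in range(len(records)):
        clusters[find(idx)].add(idx)
    return list(clusters.values())

# Hypothetical records with an NHS number and a local hospital number.
records = [
    {"nhs": "111", "hosp": "A1"},
    {"nhs": "111", "hosp": "A2"},   # same NHS number as record 0
    {"nhs": None,  "hosp": "A2"},   # links to record 1 via hospital number
    {"nhs": "222", "hosp": "B9"},   # unrelated patient
]
clusters = link_records(records, ["nhs", "hosp"])
```

Here records 0-2 coalesce into one cluster through chained identifier matches, and record 3 stays alone; in the real system a subsequent graph analysis would re-split clusters glued together by an erroneous identifier.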

Serwanga J, Mugaba S, Betty A, Pimego E, Walker S, Munderi P, Gilks C, Gotch F, Grosskurth H, Kaleebu P. 2011. CD8 T-Cell Responses before and after Structured Treatment Interruption in Ugandan Adults Who Initiated ART with CD4 T Cells <200 Cell/μL: The DART Trial STI Substudy. AIDS Res Treat, 2011 pp. 875028.

Objective. To better understand attributes of ART-associated HIV-induced T-cell responses that might be therapeutically harnessed. Methods. CD8(+) T-cell responses were evaluated in some HIV-1 chronically infected participants of the fixed duration STI substudy of the DART trial. Magnitudes, breadths, and functionality of IFN-γ and Perforin responses were compared in STI (n = 42) and continuous treatment (CT) (n = 46) before and after a single STI cycle when the DART STI trial was stopped early due to inferior clinical outcome in STI participants. Results. STI and CT had comparable magnitudes and breadths of monofunctional CD8(+)IFNγ(+) and CD8(+)Perforin(+) responses. However, STI was associated with significant decline in breadth of bi-functional (CD8(+)IFNγ(+)Perforin(+)) responses; P = .02, Mann-Whitney test. Conclusions. STI in individuals initiated onto ART at <200 CD4(+) T-cell counts/μl significantly reduced occurrence of bifunctional CD8(+)IFNγ(+)/Perforin(+) responses. These data add to others that found no evidence to support STI as a strategy to improve HIV-specific immunity during ART.

Mulenga V, Cook A, Walker AS, Kabamba D, Chijoka C, Ferrier A, Kalengo C, Kityo C, Kankasa C, Burger D et al. 2010. Strategies for nevirapine initiation in HIV-infected children taking pediatric fixed-dose combination "baby pills" in Zambia: a randomized controlled trial. Clin Infect Dis, 51 (9), pp. 1081-1089.

BACKGROUND: Fixed-dose combination scored dispersible stavudine, lamivudine, and nevirapine minitablets (Triomune Baby and Junior; Cipla Ltd) are simpler and cheaper than liquid formulations and have correct dose ratios for human immunodeficiency virus-infected children. However, they cannot be used for dose escalation (DE) of nevirapine. METHODS: Children were randomized to initiate antiretroviral therapy with full-dose (FD) nevirapine (Triomune Baby or Junior in the morning and evening) versus DE (half-dose nevirapine for 14 days [Triomune in the morning and stavudine-lamivudine {Lamivir-S} in the evening], then FD), in accordance with World Health Organization weight-band dosing tables. The primary end point was nevirapine-related clinical or laboratory grade 3 or 4 adverse events (AEs). RESULTS: In total, 211 children (median [interquartile range {IQR}] age, 5 [2-9] years; median [IQR] CD4 cell percentage, 13% [8%-18%]) were enrolled and followed up for a median (IQR) of 92 (68-116) weeks. There were 31 grade 3 or 4 AEs that were definitely/probably or uncertainly related to nevirapine in the FD group (18.0 per 100 child-years), compared with 29 in the DE group (16.5 per 100 child-years) (incidence rate ratio, 1.09; 95% confidence interval, 0.63-1.87; P = .74). All were asymptomatic; 11 versus 3 were single grade 3 or 4 elevations in alanine aminotransferase (ALT) or aspartate aminotransferase (AST) levels, all of which resolved without a change in nevirapine dose or interruption. Thirteen (12%) FD versus 2 (2%) DE children had grade 1 (2 in FD) or grade 2 (11 in FD and 2 in DE) rashes. Three (2 in FD and 1 in DE) substituted efavirenz, 3 (FD) continued FD nevirapine, and 9 (8 in FD and 1 in DE) temporarily interrupted nevirapine, followed by successful DE. Predictors of nevirapine rash were older age (P = .003) and higher CD4 cell count for age (P = .03). 
Twenty-two children died (12 in FD and 10 in DE), 1 FD and 5 DE children at <4 weeks; none were considered to be drug related by independent review. CONCLUSIONS: Rash was more frequent with FD nevirapine, but 88% had no clinical toxicity; elevated AST or ALT levels were transient and resolved spontaneously, suggesting that routine laboratory monitoring has limited value. Dual pediatric stavudine-lamivudine minitablets are preferred for safe and simple DE; if unavailable, initiating FD Triomune requires timely review for rash, which could be managed by temporary reduction to half-dose Triomune or efavirenz substitution. TRIAL REGISTRATION: Current Controlled Trials identifier: ISRCTN31084535.

Siddique D, Hyare H, Wroe S, Webb T, Macfarlane R, Rudge P, Collinge J, Powell C, Brandner S, So P-W et al. 2010. Magnetization transfer ratio may be a surrogate of spongiform change in human prion diseases. Brain, 133 (10), pp. 3058-3068.

Human prion diseases are fatal neurodegenerative disorders caused by misfolding of the prion protein. There are no useful biomarkers of disease progression. Cerebral cortex spongiform change, one of the classical pathological features of prion disease, resolves in prion-infected transgenic mice following prion protein gene knockout. We investigated the cross-sectional, longitudinal and post-mortem cerebral magnetization transfer ratios as a surrogate for prion disease pathology. Twenty-three prion disease patients with various prion protein gene mutations and 16 controls underwent magnetization transfer ratio and conventional magnetic resonance imaging at 1.5 T. For each subject, whole-brain, white and grey matter magnetization transfer ratio histogram mean, peak height, peak location, and magnetization transfer ratio at 25th, 50th and 75th percentile were computed and correlated with several cognitive, functional and neuropsychological scales. Highly significant associations were found between whole brain magnetization transfer ratio and prion disease (P < 0.01). Additionally, highly significant correlations were found between magnetization transfer ratio histogram parameters and clinical, functional and neuropsychological scores (P < 0.01). Longitudinally, decline in the Clinician's Dementia Rating scale was correlated with decline in magnetization transfer ratio. To investigate the histological correlates of magnetization transfer ratio, formalin-fixed cerebral and cerebellar hemispheres from 19 patients and six controls underwent magnetization transfer ratio imaging at 1.5 T, with mean magnetization transfer ratio calculated from six regions of interest, and findings were followed up in six variant Creutzfeldt-Jakob disease cases with 9.4 T high-resolution magnetization transfer imaging on frontal cortex blocks, with semi-quantitative histopathological scoring of spongiosis, astrocytosis and prion protein deposition. 
Post-mortem magnetization transfer ratios were significantly lower in patients than controls in multiple cortical and subcortical regions, but not frontal white matter. Measurements (9.4 T) revealed a significant and specific negative correlation between cortical magnetization transfer ratios and spongiosis (P = 0.02), but not prion protein deposition or gliosis. The magnetic resonance imaging measurement of magnetization transfer ratios may be an in vivo surrogate for spongiform change and has potential utility as a therapeutic biomarker in human prion disease.

Foster D, Walker AS, Paul J, Griffiths D, Knox K, Peto TE, Crook DW, Oxford Invasive Pneumococcal Surveillance Group. 2011. Reduction in invasive pneumococcal disease following implementation of the conjugate vaccine in the Oxfordshire region, England. J Med Microbiol, 60 (Pt 1), pp. 91-97.

Pneumococcal conjugate vaccine to seven capsular types has been highly effective in the US since its introduction in 2000. The same vaccine was adopted by the UK in 2006. Ongoing surveillance since 1995 of invasive pneumococcal disease (IPD) in Oxfordshire, UK, allowed assessment of the impact of vaccine intervention. The vaccine significantly reduced IPD among the target group, children under 2 years of age; incidence rate ratio (IRR)=0.62 (95% CI 0.43-0.90) (P=0.008) comparing the 3 years pre- and post-implementation with a residual incidence of 22.4/100 000 children. The reduction was even greater when comparing 11 years pre- with the 3 years post-implementation of vaccine; IRR=0.53 (0.39-0.70) (P<0.0001). There was a marked direct effect of the vaccine evidenced by substantial reductions in the seven serotypes contained in the vaccine. There was also a clear reduction in IPD for those serotypes contained in the vaccine among those older than 2 years when comparing both the 3 and 11 year pre-PCV7 time periods, with IRR=0.57 (0.47-0.69) (P<0.0001) and IRR=0.50 (0.43-0.58) (P<0.0001), respectively, indicating a strong herd effect. There was a significant, though moderate, rise in the serotypes not contained in the vaccine, with clear evidence for replacement in some serotypes.

Mwenya DM, Charalambous BM, Phillips PPJ, Mwansa JCL, Batt SL, Nunn AJ, Walker S, Gibb DM, Gillespie SH. 2010. Impact of cotrimoxazole on carriage and antibiotic resistance of Streptococcus pneumoniae and Haemophilus influenzae in HIV-infected children in Zambia. Antimicrob Agents Chemother, 54 (9), pp. 3756-3762.

This is a substudy of a larger randomized controlled trial on HIV-infected Zambian children, which revealed that cotrimoxazole prophylaxis reduced morbidity and mortality despite a background of high cotrimoxazole resistance. Because these effects are unclear, the impact of cotrimoxazole on the carriage and antibiotic resistance of Streptococcus pneumoniae and Haemophilus influenzae, both major causes of childhood mortality in HIV-infected children, was investigated. Representative nasopharyngeal swabs were taken prior to randomization for 181 of 534 children (92 on cotrimoxazole and 89 on placebo). Bacterial identification and antibiotic susceptibility were performed by routine methods. Due to reduced mortality, prophylactic cotrimoxazole increased the median time from randomization to the last specimen from 48 to 56 months (P = 0.001). The carriage of H. influenzae was unaltered by cotrimoxazole. Carriage of S. pneumoniae increased slightly in both arms but was not statistically significant in the placebo arm. In S. pneumoniae, switching between carriage and no carriage in consecutive pairs of samples was unaffected by cotrimoxazole (P = 0.18), with a suggestion that the probability of remaining carriage-free was lower (P = 0.10). In H. influenzae, cotrimoxazole decreased switching from carriage to no carriage (P = 0.02). Cotrimoxazole resistance levels were higher in postbaseline samples in the cotrimoxazole arm than in the placebo arm (S. pneumoniae, P < 0.0001; H. influenzae, P = 0.005). Cotrimoxazole decreased switching from cotrimoxazole resistance to cotrimoxazole sensitivity in S. pneumoniae (P = 0.002) and reduced the chance of H. influenzae remaining cotrimoxazole sensitive (P = 0.05). No associations were observed between carriage of either pathogen and the percentage of CD4 cells (CD4%), the change in CD4% from baseline, child age at date of specimen, child gender, or sampling month.

Gupta RK, Ford D, Mulenga V, Walker AS, Kabamba D, Kalumbi M, Grant PR, Ferrier A, Pillay D, Gibb DM, Chintu C. 2010. Drug resistance in human immunodeficiency virus type-1 infected Zambian children using adult fixed dose combination stavudine, lamivudine, and nevirapine. Pediatr Infect Dis J, 29 (8), pp. e57-e62.

BACKGROUND: There are few medium-term virologic data in children from resource-limited settings taking adult fixed-dose-combination antiretroviral therapy (cART) without viral load monitoring. METHODS: CHAP2 (Children with HIV Antibiotic Prophylaxis 2) is a prospective cohort of Zambian children using d4T/3TC/NVP adult Triomune30 dosed according to WHO guidelines. RESULTS: A total of 103 children (19 with previous antiretroviral therapy) had follow-up >6 months. Median age at cART initiation was 8 years (IQR, 6-12) and CD4 8% (4-12). At 24 months, CD4% had increased by a median of 15% (7-25). For 74 children viral load was known/inferred: 51 of 74 (69%) had viral load <50 copies/mL (45 of 63 [71%] with no previous cART, 6 of 11 [55%] with previous cART; difference P = 0.30); 22 of 74 (30%) had viral load >1000 copies/mL. Of 26 children with resistance data, 25 (96%) had NNRTI resistance; 22 (84%) had M184V; 2 (8%) had Q151M; and 1 (4%) each had K65R, L74V, or K70E. Eight (31%) had ≥1 TAM. Those failing virologically with a genotypic sensitivity score of 0 for first-line therapy had a somewhat smaller increase in CD4% from baseline compared with those failing therapy with a genotypic sensitivity score >0 (+3 vs. +8, P = 0.13), and had somewhat lower CD4% at initiation of cART (2 vs. 11, P = 0.09). In 6 children with >1 resistance test, the estimated rate of accumulation of TAMs was 0.59/yr (95% confidence interval: 0.22-1.29). CONCLUSIONS: Twenty-four month virologic responses to cART were good. However, the rate of TAM accumulation in those with rebound was higher than reported in Western adult cohorts, and there was some indication of a detrimental effect of high level resistance on CD4% change from baseline.

Kityo C, Walker AS, Dickinson L, Lutwama F, Kayiwa J, Ssali F, Nalumenya R, Tumukunde D, Munderi P, Reid A et al. 2010. Pharmacokinetics of lopinavir-ritonavir with and without nonnucleoside reverse transcriptase inhibitors in Ugandan HIV-infected adults. Antimicrob Agents Chemother, 54 (7), pp. 2965-2973.

We evaluated the pharmacokinetics of lopinavir-ritonavir with and without nonnucleoside reverse transcriptase inhibitors (NNRTIs) in Ugandan adults. The study design was a three-period crossover study (3 tablets [600 mg of lopinavir/150 mg of ritonavir {600/150 mg}], 4 capsules [533/133 mg], and 2 tablets [400/100 mg] twice a day [BD]; n = 40) of lopinavir-ritonavir with NNRTIs and a parallel one-period study (2 tablets BD; n = 20) without NNRTIs. Six-point pharmacokinetic sampling (0, 2, 4, 6, 8, and 12 h) was undertaken after observed intake with a standardized breakfast. Ugandan DART trial participants receiving efavirenz (n = 20), nevirapine (n = 18), and no NNRTI (n = 20) had median ages of 41, 35, and 37 years, respectively, and median weights of 60, 64, and 63 kg, respectively. For the no-NNRTI group, the geometric mean (percent coefficient of variation [%CV]) lopinavir area under the concentration-time curve from 0 to 12 h (AUC(0-12)) was 110.1 (34%) microg x h/liter. For efavirenz, the geometric mean lopinavir AUC(0-12) (%CV) values were 91.8 microg x h/liter (58%), 65.7 microg x h/liter (39%), and 54.0 microg x h/liter (65%) with 3 tablets, 4 capsules, and 2 tablets BD, respectively, with corresponding (within-individual) geometric mean ratios (GMR) for 3 and 2 tablets versus 4 capsules of 1.40 (90% confidence interval [CI], 1.18 to 1.65; P = 0.002) and 0.82 (90% CI, 0.68 to 0.99; P = 0.09), respectively, and the apparent oral clearance (CL/F) values were reduced by 58% and 1%, respectively. For nevirapine, the geometric mean lopinavir AUC(0-12) (%CV) values were 112.9 microg x h/liter (30%), 68.1 microg x h/liter (53%), and 61.5 microg x h/liter (52%), respectively, with corresponding GMR values of 1.66 (90% CI, 1.46 to 1.88; P < 0.001) and 0.90 (90% CI, 0.77 to 1.06; P = 0.27), respectively, and the CL/F was reduced by 57% and 7%, respectively. 
Higher values for the lopinavir concentration at 12 h (C(12)) were observed with 3 tablets and efavirenz-nevirapine (P = 0.04 and P = 0.0005, respectively), and marginally lower C(12) values were observed with 2 tablets and efavirenz-nevirapine (P = 0.08 and P = 0.26, respectively). These data suggest that 2 tablets of lopinavir-ritonavir BD may be inadequate when dosed with NNRTIs in Ugandan adults, and the dosage should be increased by the addition of an additional adult tablet or a half-dose tablet (100/25 mg), where available.

Burger D, Ewings F, Kabamba D, L'homme R, Mulenga V, Kankasa C, Thomason M, Gibb DM, Chintu C, Walker AS. 2010. Limited sampling models to predict the pharmacokinetics of nevirapine, stavudine, and lamivudine in HIV-infected children treated with pediatric fixed-dose combination tablets. Ther Drug Monit, 32 (3), pp. 369-372.

Full 12-hour pharmacokinetic profiles of nevirapine, stavudine, and lamivudine in HIV-infected children taking fixed-dose combination antiretroviral tablets have been reported previously by us. Further studies with these formulations could benefit from less-intensive pharmacokinetic sampling. Data from 65 African children were used to relate area under the plasma concentration versus time curve over 12 hours (AUC) to plasma concentrations of nevirapine, stavudine, or lamivudine at times t = 0, 1, 2, 4, 6, 8, and 12 hours after intake using linear regression. Limited sampling models were developed using leave-one-out cross-validation. The predictive performance of each model was evaluated using the mean relative prediction error (mpe%) as an indicator of bias and the root mean squared relative prediction error (rmse%) as a measure of precision. A priori set criteria to accept a limited sampling model were: 95% confidence limit of the mpe% should include 0, rmse% less than 10%, a high correlation coefficient, and as few (convenient) samples as possible. Using only one sample did not lead to acceptable AUC predictions for stavudine or lamivudine, although the 6-hour sample was acceptable for nevirapine (mpe%: -0.8%, 95% confidence interval: -2.2 to +0.6; rmse%: 5.8%; r: 0.98). Using two samples, AUC predictions for stavudine and lamivudine improved considerably but did not meet the predefined acceptance criteria. Using three samples (1, 2, 6 hours), an accurate and precise limited sampling model for stavudine AUC (mpe%: -0.6%, 95% confidence interval: -2.2 to +1.0; rmse%: 6.5%; r: 0.98) and lamivudine AUC (mpe%: -0.3%, 95% confidence interval: -1.7 to +1.1; rmse%: 5.6%; r: 0.99) was found; this model was also highly accurate and precise for nevirapine AUC (mpe%: -0.2%, 95% confidence interval: -1.0 to +0.7; rmse%: 3.4%; r: 0.99). 
A limited sampling model using three time points (1, 2, 6 hours) can be used to predict nevirapine, stavudine, and lamivudine AUC accurately and precisely in HIV-infected African children.
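For illustration only (simulated data, not the study's concentrations or its exact model), the evaluation loop described above — fit a linear model on all children but one, predict the held-out child's AUC, and summarize the relative errors as mpe% (bias) and rmse% (precision) — can be sketched as:

```python
import numpy as np

def loo_metrics(X, y):
    """Leave-one-out cross-validation of a linear model predicting AUC
    from a few concentration samples; returns the mean relative
    prediction error (mpe%, bias) and the root mean squared relative
    prediction error (rmse%, precision)."""
    n = len(y)
    errors = []
    for i in range(n):
        keep = np.arange(n) != i
        # Design matrix with intercept, fitted by ordinary least squares.
        A = np.column_stack([np.ones(keep.sum()), X[keep]])
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        pred = coef[0] + X[i] @ coef[1:]
        errors.append((pred - y[i]) / y[i] * 100.0)
    errors = np.asarray(errors)
    mpe = errors.mean()                       # bias
    rmse = np.sqrt((errors ** 2).mean())      # precision
    return mpe, rmse

# Simulated concentrations at 1, 2 and 6 h and a "true" 12-hour AUC
# for 30 hypothetical children (an assumed near-linear relationship).
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.3, size=(30, 3))
y = X @ np.array([2.0, 3.0, 5.0]) + rng.normal(0, 0.5, 30)

mpe, rmse = loo_metrics(X, y)
```

Against the paper's acceptance criteria, a model would pass if the 95% confidence limit of mpe% included 0 and rmse% stayed below 10%.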

McCormick AL, Goodall RL, Joyce A, Ndembi N, Chirara M, Katundu P, Walker S, Yirrell D, Gilks CF, Pillay D, DART Virology Group and Trial Team. 2010. Lack of minority K65R-resistant viral populations detected after repeated treatment interruptions of tenofovir/zidovudine and lamivudine in a resource-limited setting. J Acquir Immune Defic Syndr, 54 (2), pp. 215-216.

Miller R, Walker AS, Knox K, Wyllie D, Paul J, Haworth E, Mant D, Peto T, Crook DW. 2010. 'Feral' and 'wild'-type methicillin-resistant Staphylococcus aureus in the United Kingdom. Epidemiol Infect, 138 (5), pp. 655-665.

Circulation of methicillin-resistant Staphylococcus aureus (MRSA) outside hospitals could alter the impact of hospital-based control strategies. We investigated two groups of cases (each matched to controls with MRSA): 61 'community cases' not in acute hospital in the year before MRSA isolation; and 21 cases with ciprofloxacin-sensitive (CipS) MRSA. Multi-locus sequence typing, spa-typing and Panton-Valentine leukocidin gene testing were performed and demographics obtained. Additional questionnaires were completed by community case GPs. Community cases comprised 6% of Oxfordshire MRSA. Three community cases had received no regular healthcare or antibiotics: one was infected with CipS. Ninety-one percent of community cases had healthcare-associated sequence type (ST)22/36; CipS MRSA cases had heterogeneous STs but many had recent healthcare exposure. A substantial minority of UK MRSA transmission may occur outside hospitals. Hospital strains are becoming 'feral' or persisting in long-term carriers in the community with regular healthcare contacts; those with recent healthcare exposure may nevertheless acquire non-hospital epidemic MRSA strains in the community.

Munderi P, Walker AS, Kityo C, Babiker AG, Ssali F, Reid A, Darbyshire JH, Grosskurth H, Mugyenyi P, Gibb DM et al. 2010. Nevirapine/zidovudine/lamivudine has superior immunological and virological responses not reflected in clinical outcomes in a 48-week randomized comparison with abacavir/zidovudine/lamivudine in HIV-infected Ugandan adults with low CD4 cell counts. HIV Med, 11 (5), pp. 334-344.

BACKGROUND: Triple nucleoside reverse transcriptase inhibitor regimens have advantages as first-line antiretroviral therapy (ART), avoiding hepatotoxicity and interactions with anti-tuberculosis therapy, and sparing two drug classes for second-line ART. Concerns exist about virological potency; efficacy has not been assessed in Africa. METHODS: A safety trial comparing nevirapine with abacavir was conducted in two Ugandan Development of Antiretroviral Therapy in Africa (DART) centres: 600 symptomatic antiretroviral-naïve HIV-infected adults with CD4 counts <200 cells/microL were randomized to zidovudine/lamivudine plus abacavir or nevirapine (placebo-controlled to 24-week primary toxicity endpoint, and then open-label). Documented World Health Organization (WHO) stage 4 events were independently reviewed and plasma HIV-1 RNA assayed retrospectively. Exploratory efficacy analyses are intention-to-treat. RESULTS: The median pre-ART CD4 count was 99 cells/microL, and the median pre-ART viral load was 284 600 HIV-1 RNA copies/mL. A total of 563 participants (94%) completed 48 weeks of follow-up, 25 (4%) died and 12 (2%) were lost to follow-up. The randomized drug was substituted in 21 participants (7%) receiving abacavir vs. 34 (11%) receiving nevirapine (P=0.09). At 48 weeks, 62% of participants receiving abacavir vs. 77% of those receiving nevirapine had viral loads <50 copies/mL (P<0.001), and mean CD4 count increases from baseline were +147 vs. +173 cells/microL, respectively (P=0.006). Nine participants (3%) receiving abacavir vs. 16 (5%) receiving nevirapine died [hazard ratio (HR) 0.55; 95% confidence interval (CI) 0.24-1.25; P=0.15]; 20 receiving abacavir vs. 32 receiving nevirapine developed new or recurrent WHO 4 events or died (HR=0.60; 95% CI 0.34-1.05; P=0.07) and 48 receiving abacavir vs. 68 receiving nevirapine developed new or recurrent WHO 3 or 4 events or died (HR=0.67; 95% CI 0.46-0.96; P=0.03). Seventy-one participants (24%) receiving abacavir experienced 91 grade 4 adverse events compared with 130 events in 109 participants (36%) on nevirapine (P<0.001). CONCLUSIONS: The clear virological/immunological superiority of nevirapine over abacavir was not reflected in clinical outcomes over 48 weeks. The inability of CD4 cell count/viral load to predict initial clinical treatment efficacy is unexplained and requires further evaluation.

Walker AS, Ford D, Gilks CF, Munderi P, Ssali F, Reid A, Katabira E, Grosskurth H, Mugyenyi P, Hakim J et al. 2010. Daily co-trimoxazole prophylaxis in severely immunosuppressed HIV-infected adults in Africa started on combination antiretroviral therapy: an observational analysis of the DART cohort. Lancet, 375 (9722), pp. 1278-1286.

BACKGROUND: Co-trimoxazole prophylaxis can reduce mortality from untreated HIV infection in Africa; whether benefits occur alongside combination antiretroviral therapy (ART) is unclear. We estimated the effect of prophylaxis after ART initiation in adults. METHODS: Participants in our observational analysis were from the DART randomised trial of management strategies in HIV-infected, symptomatic, previously untreated African adults starting triple-drug ART with CD4 counts lower than 200 cells per muL. Co-trimoxazole prophylaxis was not routinely used or randomly allocated, but was variably prescribed by clinicians. We estimated effects on clinical outcomes, CD4 cell count, and body-mass index (BMI) using marginal structural models to adjust for time-dependent confounding by indication. DART was registered, number ISRCTN13968779. FINDINGS: 3179 participants contributed 14 214 years of follow-up (8128 [57%] person-years on co-trimoxazole). Time-dependent predictors of co-trimoxazole use were current CD4 cell count, haemoglobin concentration, BMI, and previous WHO stage 3 or 4 events on ART. Current prophylaxis significantly reduced mortality (odds ratio 0.65, 95% CI 0.50-0.85; p=0.001). Mortality risk reduction on ART was substantial to 12 weeks (0.41, 0.27-0.65), sustained from 12-72 weeks (0.56, 0.37-0.86), but not evident subsequently (0.96, 0.63-1.45; heterogeneity p=0.02). Variation in mortality reduction was not accounted for by time on co-trimoxazole or current CD4 cell count. Prophylaxis reduced frequency of malaria (0.74, 0.63-0.88; p=0.0005), an effect that was maintained with time, but we observed no effect on new WHO stage 4 events (0.86, 0.69-1.07; p=0.17), CD4 cell count (difference vs non-users, -3 cells per muL [-12 to 6]; p=0.50), or BMI (difference vs non-users, -0.04 kg/m(2) [-0.20 to 0.13]; p=0.68). INTERPRETATION: Our results reinforce WHO guidelines and provide strong motivation for provision of co-trimoxazole prophylaxis for at least 72 weeks for all adults starting combination ART in Africa. FUNDING: UK Medical Research Council, the UK Department for International Development, the Rockefeller Foundation, GlaxoSmithKline, Gilead Sciences, Boehringer-Ingelheim, and Abbott Laboratories.
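The marginal structural models cited above handle time-dependent confounding by indication by reweighting person-visits with inverse-probability-of-treatment weights. As a minimal, illustrative sketch (the probabilities below are hypothetical, not trial estimates), a stabilized weight for one person-visit can be computed like this:

```python
def stabilized_ipw(p_treat_marginal, p_treat_given_covariates, treated):
    """Stabilized inverse-probability-of-treatment weight for one
    person-visit: the numerator uses the marginal probability of
    treatment, the denominator the probability given time-updated
    covariates (e.g. current CD4, haemoglobin, BMI)."""
    if treated:
        return p_treat_marginal / p_treat_given_covariates
    return (1 - p_treat_marginal) / (1 - p_treat_given_covariates)

# A treated visit where covariates made treatment very likely is
# down-weighted (hypothetical probabilities):
w = stabilized_ipw(0.5, 0.8, treated=True)
print(w)  # 0.625
```

The product of such weights over a participant's visit history is then applied in a weighted outcome regression; this is only the weighting step, not the full model fitted in the paper.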

Sayana S, Manchanda R, Khanlou H, Saavedra J, Reis P, Weinstein M. 2010. DART and laboratory monitoring of HIV treatment. Lancet, 375 (9719), pp. 979.

Griffiths D, Fawley W, Kachrimanidou M, Bowden R, Crook DW, Fung R, Golubchik T, Harding RM, Jeffery KJM, Jolley KA et al. 2010. Multilocus sequence typing of Clostridium difficile. J Clin Microbiol, 48 (3), pp. 770-778.

A robust high-throughput multilocus sequence typing (MLST) scheme for Clostridium difficile was developed and validated using a diverse collection of 50 reference isolates representing 45 different PCR ribotypes and 102 isolates from recent clinical samples. A total of 49 PCR ribotypes were represented overall. All isolates were typed by MLST and yielded 40 sequence types (STs). A web-accessible database was set up (http://pubmlst.org/cdifficile/) to facilitate the dissemination and comparison of C. difficile MLST genotyping data among laboratories. MLST and PCR ribotyping were similar in discriminatory abilities, having indices of discrimination of 0.90 and 0.92, respectively. Some STs corresponded to a single PCR ribotype (32/40), other STs corresponded to multiple PCR ribotypes (8/40), and, conversely, the PCR ribotype was not always predictive of the ST. The total number of variable nucleotide sites in the concatenated MLST sequences was 103/3,501 (2.9%). Concatenated MLST sequences were used to construct a neighbor-joining tree which identified four phylogenetic groups of STs and one outlier (ST-11; PCR ribotype 078). These groups apparently correlate with clades identified previously by comparative genomics. The MLST scheme was sufficiently robust to allow direct genotyping of C. difficile in total stool DNA extracts without isolate culture. The direct (nonculture) MLST approach may prove useful as a rapid genotyping method, potentially benefiting individual patients and informing hospital infection control.
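The indices of discrimination quoted above (0.90 for MLST, 0.92 for PCR ribotyping) are conventionally calculated with Simpson's index of diversity in the Hunter-Gaston formulation: the probability that two isolates drawn at random belong to different types. A minimal sketch on hypothetical typing results (not the study data):

```python
from collections import Counter

def discrimination_index(type_assignments):
    """Hunter-Gaston index of discrimination:
    D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
    where n_j is the number of isolates of type j and N the total."""
    counts = Counter(type_assignments).values()
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical set of 8 isolates across 5 sequence types:
sts = ["ST1"] * 3 + ["ST2"] * 2 + ["ST3", "ST4", "ST5"]
print(round(discrimination_index(sts), 3))  # 0.857
```

A scheme that assigns every isolate a unique type gives D = 1; one that lumps all isolates together gives D = 0, which is why the near-identical values above indicate comparable discriminatory ability.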

DART Trial Team, Mugyenyi P, Walker AS, Hakim J, Munderi P, Gibb DM, Kityo C, Reid A, Grosskurth H, Darbyshire JH et al. 2010. Routine versus clinically driven laboratory monitoring of HIV antiretroviral therapy in Africa (DART): a randomised non-inferiority trial. Lancet, 375 (9709), pp. 123-131.

BACKGROUND: HIV antiretroviral therapy (ART) is often managed without routine laboratory monitoring in Africa; however, the effect of this approach is unknown. This trial investigated whether routine toxicity and efficacy monitoring of HIV-infected patients receiving ART had an important long-term effect on clinical outcomes in Africa. METHODS: In this open, non-inferiority trial in three centres in Uganda and one in Zimbabwe, 3321 symptomatic, ART-naive, HIV-infected adults with CD4 counts less than 200 cells per microL starting ART were randomly assigned to laboratory and clinical monitoring (LCM; n=1659) or clinically driven monitoring (CDM; n=1662) by a computer-generated list. Haematology, biochemistry, and CD4-cell counts were done every 12 weeks. In the LCM group, results were available to clinicians; in the CDM group, results (apart from CD4-cell count) could be requested if clinically indicated and grade 4 toxicities were available. Participants switched to second-line ART after new or recurrent WHO stage 4 events in both groups, or CD4 count less than 100 cells per microL (LCM only). Co-primary endpoints were new WHO stage 4 HIV events or death, and serious adverse events. Non-inferiority was defined as the upper 95% confidence limit for the hazard ratio (HR) for new WHO stage 4 events or death being no greater than 1.18. Analyses were by intention to treat. This study is registered, number ISRCTN13968779. FINDINGS: Two participants assigned to CDM and three to LCM were excluded from analyses. 5-year survival was 87% (95% CI 85-88) in the CDM group and 90% (88-91) in the LCM group, and 122 (7%) and 112 (7%) participants, respectively, were lost to follow-up over median 4.9 years' follow-up. 459 (28%) participants receiving CDM versus 356 (21%) LCM had a new WHO stage 4 event or died (6.94 [95% CI 6.33-7.60] vs 5.24 [4.72-5.81] per 100 person-years; absolute difference 1.70 per 100 person-years [0.87-2.54]; HR 1.31 [1.14-1.51]; p=0.0001). Differences in disease progression occurred from the third year on ART, whereas higher rates of switch to second-line treatment occurred in LCM from the second year. 283 (17%) participants receiving CDM versus 260 (16%) LCM had a new serious adverse event (HR 1.12 [0.94-1.32]; p=0.19), with anaemia the most common (76 vs 61 cases). INTERPRETATION: ART can be delivered safely without routine laboratory monitoring for toxic effects, but differences in disease progression suggest a role for monitoring of CD4-cell count from the second year of ART to guide the switch to second-line treatment. FUNDING: UK Medical Research Council, the UK Department for International Development, the Rockefeller Foundation, GlaxoSmithKline, Gilead Sciences, Boehringer-Ingelheim, and Abbott Laboratories.

Musiime V, Kendall L, Bakeera-Kitaka S, Snowden WB, Odongo F, Thomason M, Musoke P, Adkison K, Burger D, Mugyenyi P et al. 2010. Pharmacokinetics and acceptability of once- versus twice-daily lamivudine and abacavir in HIV type-1-infected Ugandan children in the ARROW Trial. Antivir Ther, 15 (8), pp. 1115-1124.

BACKGROUND: No data on once-daily dosing of nucleoside analogues in African children currently exist. We compared the pharmacokinetics (PK) of once- versus twice-daily lamivudine and abacavir treatment using the World Health Organization recommended weight band dosing of scored adult tablets. METHODS: HIV type-1 (HIV-1)-infected Ugandan children aged 3-12 years receiving antiretroviral therapy that included lamivudine and abacavir twice daily (total 150+300 mg, 225+450 mg and 300+600 mg daily for 12-<20, 20-<25 and ≥25 kg, respectively) were enrolled in a crossover study. Plasma PK sampling (at 0, 1, 2, 4, 6, 8 and 12 h after observed morning intake) was performed for the twice-daily regimen at steady-state. Children were then switched to once-daily treatment with PK sampling repeated 4 weeks later (with an additional 24 h sample). Acceptability questionnaires were completed at both time points. Daily area under the curve (AUC(0-24)) and maximum concentrations (C(max)) were compared by geometric mean ratios (GMRs). RESULTS: A total of 41 HIV-1-infected children (median age 7 years) were enrolled: n=23, n=14 and n=4 in the 12-<20, 20-<25 and ≥25 kg weight bands, respectively. Mean AUC(0-24) was 13.0 and 12.0 mg•h/l for once- and twice-daily lamivudine (GMR 1.09, 90% confidence intervals [CI] 0.98-1.20) and 15.3 and 15.6 mg•h/l for once- and twice-daily abacavir (GMR 0.98, 90% CI 0.89-1.08), respectively, with no difference in 3-6 versus 7-12 year olds. C(max) was 76% (lamivudine) and 64% (abacavir) higher on once-daily regimens. For both children and caregivers, once-daily dosing of lamivudine plus abacavir was highly acceptable and strongly preferred over twice-daily. CONCLUSIONS: In children aged 3-12 years, AUC(0-24) of lamivudine and abacavir were bioequivalent on once- and twice-daily regimens. Once-daily dosing of abacavir and lamivudine could provide an alternative dosing strategy for HIV-1-infected children, with high acceptability and strong preference suggesting the potential for improved adherence.
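The AUC(0-24) and GMR comparisons above rest on two standard PK calculations: the trapezoidal rule over the concentration-time profile, and a geometric mean ratio computed on the log scale. A minimal sketch with hypothetical numbers (not the trial's data or its full bioequivalence analysis, which uses 90% CIs around the GMR):

```python
import math

def auc_trapezoid(times, concs):
    """Linear trapezoidal AUC over the sampled interval
    (times in h, concentrations in mg/l -> AUC in mg*h/l)."""
    return sum((times[i + 1] - times[i]) * (concs[i] + concs[i + 1]) / 2
               for i in range(len(times) - 1))

def geometric_mean_ratio(auc_test, auc_ref):
    """GMR = exp(mean(log test) - mean(log ref)); bioequivalence is
    typically judged against a 0.80-1.25 window on the 90% CI."""
    log_diff = (sum(map(math.log, auc_test)) / len(auc_test)
                - sum(map(math.log, auc_ref)) / len(auc_ref))
    return math.exp(log_diff)

# Hypothetical once-daily profile sampled at 0-24 h:
auc = auc_trapezoid([0, 1, 2, 4, 6, 8, 12, 24],
                    [0.0, 1.8, 2.1, 1.6, 1.1, 0.7, 0.3, 0.05])
gmr = geometric_mean_ratio([13.2, 12.8], [12.1, 12.3])  # toy AUC pairs
```

With a GMR near 1 and its CI inside 0.80-1.25, the two regimens are declared bioequivalent on total exposure, which is the conclusion the abstract reports despite the higher once-daily C(max).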

Ndembi N, Goodall RL, Dunn DT, McCormick A, Burke A, Lyagoba F, Munderi P, Katundu P, Kityo C, Robertson V et al. 2010. Viral rebound and emergence of drug resistance in the absence of viral load testing: a randomized comparison between zidovudine-lamivudine plus Nevirapine and zidovudine-lamivudine plus Abacavir. J Infect Dis, 201 (1), pp. 106-113.

BACKGROUND: We investigated virological response and the emergence of resistance in the Nevirapine or Abacavir (NORA) substudy of the Development of Antiretroviral Treatment in Africa (DART) trial. METHODS: Six hundred symptomatic antiretroviral-naive human immunodeficiency virus (HIV)-infected adults (CD4 cell count, <200 cells/mm(3)) from 2 Ugandan centers were randomized to receive zidovudine-lamivudine plus abacavir or nevirapine. Virology was performed retrospectively on stored plasma samples at selected time points. In patients with HIV RNA levels >1000 copies/mL, the residual activity of therapy was calculated as the reduction in HIV RNA level, compared with baseline. RESULTS: Overall, HIV RNA levels were lower in the nevirapine group than in the abacavir group at 24 and 48 weeks (P < .001), although no differences were observed at weeks 4 and 12. Virological responses were similar in the 2 treatment groups for baseline HIV RNA level <100,000 copies/mL. The mean residual activity at week 48 was higher for abacavir in the presence of the typically observed resistance pattern of thymidine analogue mutations (TAMs) and M184V (1.47 log(10) copies/mL) than for nevirapine with M184V and nonnucleoside reverse-transcriptase inhibitor mutations, whether accompanied by TAMs (0.96 log(10) copies/mL) or not (1.18 log(10) copies/mL). CONCLUSIONS: There was more extensive genotypic resistance in both treatment groups than is generally seen in resource-rich settings. However, significant residual activity was observed among patients with virological failure, particularly those receiving zidovudine-lamivudine plus abacavir.

Thwaites GE, United Kingdom Clinical Infection Research Group (UKCIRG). 2010. The management of Staphylococcus aureus bacteremia in the United Kingdom and Vietnam: a multi-centre evaluation. PLoS One, 5 (12), pp. e14170.

BACKGROUND: Staphylococcus aureus bacteremia is a common and serious infection worldwide and although treatment guidelines exist, there is little consensus on optimal management. In this study we assessed the variation in management and adherence to treatment guidelines of S. aureus bacteremia. METHODOLOGY/PRINCIPAL FINDINGS: We prospectively recorded baseline clinical characteristics, management, and in-hospital outcome of all adults with S. aureus bacteremia treated consecutively over one year in eight centres in the United Kingdom, three in Vietnam and one in Nepal. 630 adults were treated for S. aureus bacteremia: 549 in the UK (21% methicillin-resistant), 80 in Vietnam (19% methicillin-resistant) and 1 in Nepal. In the UK, 41% had a removable infection focus (50% intravenous catheter-related), compared to 12% in Vietnam. Significantly (p<0.001) higher proportions of UK than Vietnamese patients had an echocardiogram (50% versus 28%), received more than 14 days antibiotic therapy (84% versus 44%), and received >50% of treatment with oral antibiotics alone (25% versus 4%). UK centres varied significantly (p<0.01) in the proportions given oral treatment alone for >50% of treatment (range 12-40%), in those treated for longer than 28 days (range 13-54%), and in those given combination therapy (range 14-94%). 24% died during admission: older age, time in hospital before bacteremia, and an unidentified infection focus were independent predictors of in-hospital death (p<0.001). CONCLUSIONS/SIGNIFICANCE: The management of S. aureus bacteremia varies widely between the UK and Vietnam and between centres in the UK with little adherence to published guidelines. Controlled trials defining optimal therapy are urgently required.

Pollock L, Else L, Poerksen G, Molyneux E, Moons P, Walker S, Fraser W, Back D, Khoo S. 2009. Pharmacokinetics of nevirapine in HIV-infected children with and without malnutrition receiving divided adult fixed-dose combination tablets. J Antimicrob Chemother, 64 (6), pp. 1251-1259.

OBJECTIVES: To determine the relationship between nutritional status and nevirapine exposure by comparing the pharmacokinetics of nevirapine in HIV-infected children of different ages with and without malnutrition receiving divided tablets of Triomune 30 (stavudine + lamivudine + nevirapine) in accordance with Malawi National Guidelines. METHODS: Children were recruited in weight-based dosage bands and nutritional status classified according to weight for height. Total and unbound plasma nevirapine concentrations were measured over a full dosing interval. Multivariate linear and logistic regression analyses were performed to determine the effects of malnutrition, age, dose and other factors on nevirapine exposure and likelihood of achieving therapeutic nevirapine trough concentrations. RESULTS: Forty-three children were recruited (37 included for analysis). Mild to moderate malnutrition was present in 12 (32%) children; 25 (68%) were of normal nutritional status. There was no effect of malnutrition on any measure of total drug exposure or on the unbound fraction of nevirapine. Nevirapine exposure was strongly related to dose administered (P = 0.039) and to age (for every yearly increase in age there was an approximately 88% increase in the odds of achieving a therapeutic nevirapine concentration; P = 0.056, 95% confidence interval 0.983-3.585). CONCLUSIONS: Use of divided adult Triomune 30 tablets in treating young children results in significant underdosing. No independent effect of malnutrition on total and unbound nevirapine exposures was observed. These data support the use of bespoke paediatric antiretroviral formulations.

Puopolo M, Pocchiari M, Petrini C. 2009. Clinical trials and methodological problems in prion diseases. Lancet Neurol, 8 (9), pp. 782.

Marin B, Thiébaut R, Bucher HC, Rondeau V, Costagliola D, Dorrucci M, Hamouda O, Prins M, Walker S, Porter K et al. 2009. Non-AIDS-defining deaths and immunodeficiency in the era of combination antiretroviral therapy. AIDS, 23 (13), pp. 1743-1753.

OBJECTIVE: To assess whether immunodeficiency is associated with the most frequent non-AIDS-defining causes of death in the era of combination antiretroviral therapy (cART). DESIGN: Observational multicentre cohorts. METHODS: Twenty-three cohorts of adults with estimated dates of human immunodeficiency virus (HIV) seroconversion were considered. Patients were seroconverters followed within the cART era. Measurements were latest CD4, nadir CD4 and time spent with CD4 cell count less than 350 cells/microl. Outcomes were specific causes of death using a standardized classification. RESULTS: Among 9858 patients (71 230 person-years follow-up), 597 died, 333 (55.7%) from non-AIDS-defining causes. Non-AIDS-defining infection, liver disease, non-AIDS-defining malignancy and cardiovascular disease accounted for 53% of non-AIDS deaths. For each 100 cells/microl increment in the latest CD4 cell count, we found a 64% (95% confidence interval 58-69%) reduction in risk of death from AIDS-defining causes and significant reductions in death from non-AIDS infections (32%, 18-44%), end-stage liver disease (33%, 18-46%) and non-AIDS malignancies (34%, 21-45%). Non-AIDS-defining causes of death were also associated with nadir CD4 while being cART-naive or duration of exposure to immunosuppression. No relationship between risk of death from cardiovascular disease and CD4 cell count was found, though there was a raised risk associated with elevated HIV RNA. CONCLUSION: In the cART era, the most frequent non-AIDS-defining causes of death are associated with immunodeficiency; only cardiovascular disease was associated with high viral replication. Avoiding profound and mild immunodeficiency, through earlier initiation of cART, may impact on morbidity and mortality of HIV-infected patients.

Collinge J, Gorham M, Hudson F, Kennedy A, Keogh G, Pal S, Rossor M, Rudge P, Siddique D, Spyer M et al. 2009. Safety and efficacy of quinacrine in human prion disease (PRION-1 study): a patient-preference trial. Lancet Neurol, 8 (4), pp. 334-344.

BACKGROUND: The propagation of prions, the causative agents of Creutzfeldt-Jakob disease and other human prion diseases, requires post-translational conversion of normal cellular prion protein to disease-associated forms. The antimalarial drug quinacrine (mepacrine) prevents this conversion in vitro, and was given to patients with various prion diseases to assess its safety and efficacy in changing the course of these invariably fatal and untreatable diseases. METHODS: Patients with prion disease were recruited via the UK national referral system and were offered a choice between quinacrine (300 mg daily), no quinacrine, or randomisation to immediate quinacrine or deferred quinacrine in an open-label, patient-preference trial. The primary endpoints were death and serious adverse events possibly or probably related to the study drug. This study is registered, ISRCTN 06722585. FINDINGS: 107 patients with prion disease (45 sporadic, two iatrogenic, 18 variant, and 42 inherited) were enrolled, 23 in a pilot study and 84 in the main study. Only two patients chose randomisation; 40 took quinacrine during follow-up (37 who chose it at enrollment). Choice of treatment was associated with disease severity, with those least and most severely affected more likely to choose not to receive quinacrine. 78 (73%) patients died: one randomly assigned to deferred treatment, 26 of 38 who chose immediate quinacrine, and 51 of 68 who chose no quinacrine. Although adjusted mortality was lower in those who chose to take quinacrine than in those who did not, this was due to confounding with disease severity, and there was no difference in mortality between groups after adjustment. Four of 40 patients who took quinacrine had a transient response on neurological rating scales. Only two of 14 reported serious adverse events were judged quinacrine-related. INTERPRETATION: Quinacrine at a dose of 300 mg per day was reasonably tolerated but did not significantly affect the clinical course of prion diseases in this observational study.

Walker AS, Ford D, Mulenga V, Thomason MJ, Nunn A, Chintu C, Gibb DM, Bangsberg DR. 2009. Adherence to both cotrimoxazole and placebo is associated with improved survival among HIV-infected Zambian children. AIDS Behav, 13 (1), pp. 33-41.

In the CHAP randomized placebo-controlled trial of cotrimoxazole prophylaxis in HIV-infected Zambian children conducted between 2001 and 2003, cotrimoxazole was associated with significant mortality reductions. In a secondary analysis we used Cox regression models to estimate the association between adherence measured by bottle weights and caregiver report and subsequent mortality in children surviving >28 days (n = 496, 153 deaths). Adherence was high and similar in both cotrimoxazole and placebo groups; adherence from bottle weights was 100% at 71% of visits, while caregivers reported 100% adherence at 79% of visits. Every 10% lower adherence to cotrimoxazole or placebo measured by bottle weights was associated with a 10-11% increase in mortality risk. Effects remained after adjustment for baseline predictors of survival and for current and recent change in primary caregiver. Caregiver-reported adherence was not associated with survival. The association between bottle-weight adherence to placebo and survival is likely capturing unmeasured caregiver effects, whose identification will be essential for quantifying the impact of antiretroviral therapy (ART) adherence on clinical outcomes in children.

Touloumi G, Pantazis N, Stirnadel HA, Walker AS, Boufassa F, Vanhems P, Porter K, CASCADE Collaboration. 2008. Rates and determinants of virologic and immunological response to HAART resumption after treatment interruption in HIV-1 clinical practice. J Acquir Immune Defic Syndr, 49 (5), pp. 492-498.

OBJECTIVE: To describe CD4 and HIV RNA changes during treatment resumption (TR) after treatment interruption (TI) compared with response to first highly active antiretroviral therapy (HAART) and to investigate predictors. METHODS: Using Concerted Action on SeroConversion to AIDS and Death in Europe (CASCADE) data, we identified subjects who interrupted first HAART, not initiated during primary infection. We estimated rate of CD4 change during TR and time from TR to HIV RNA<500 copies per milliliter and subsequent rebound and factors associated with these outcomes. RESULTS: Of 281 persons treated for median 18.4 months before interrupting, 259 resumed HAART. CD4 increases in the first 3 months on HAART were similar pre-TI and post-TI but after 3 months were significantly higher during pre-TI HAART, with median +106 and +172 cells per microliter at 3 and 18 months, respectively, during initial HAART compared with +99 and +142 cells per microliter during post-TI HAART, respectively. Subjects with lower CD4 counts at TI, aged older than 40 years, and those resuming the same HAART as their pre-TI regimen had lower CD4 increases during the first 3 months of TR. The majority (86%) of individuals reinitiating therapy achieved HIV RNA<500 copies per milliliter. CONCLUSIONS: Immune reconstitution after TI is generally poorer than after first HAART, particularly for patients aged older than 40 years at TI and those with poorer immunological responses to pre-TI HAART. Reinitiation of the same HAART regimen as pre-TI also seems to have unfavorable outcomes.

Kekitiinwa A, Lee KJ, Walker AS, Maganda A, Doerholt K, Kitaka SB, Asiimwe A, Judd A, Musoke P, Gibb DM et al. 2008. Differences in factors associated with initial growth, CD4, and viral load responses to ART in HIV-infected children in Kampala, Uganda, and the United Kingdom/Ireland. J Acquir Immune Defic Syndr, 49 (4), pp. 384-392.

BACKGROUND: Few studies have directly compared response to antiretroviral therapy (ART) between children living in well-resourced and resource-limited settings. In resource-limited settings non-HIV contributors could reduce the beneficial effects of ART. We compare predictors of short-term immunological, virological, and growth response to ART in HIV-infected children in the United Kingdom/Ireland and Kampala. METHODS: We analyzed prospective cohort data from 54 UK/Irish hospitals (the Collaborative HIV Paediatric Study) and Mulago Hospital, Kampala, Uganda. Six- and 12-month responses are described among children initiating combination ART (> or = 3 drugs, > or = 2 classes). Six months post-ART, predictors of viral load (VL) suppression <400 copies/mL, CD4% increases > 10%, and height- and weight-for-age z-score increases > or = +0.5 were investigated using logistic regression. RESULTS: In all, 582 UK/Irish children (76% black African) were younger than 876 Kampala children at ART initiation (median 5.0 vs 7.6 years), with higher CD4% (14%, 8%), lower VL (172,491 and 346,809 copies/mL), and less stunting (-0.8, -2.8) and wasting (-0.6, -2.8). Post-ART, median 12-month changes in the United Kingdom/Ireland and Kampala in CD4% (+12%, +13%) and weight (+0.4, +0.5) were similar, but growth was less in Kampala (+0.20, +0.06, P < 0.001). Younger children in both cohorts had better immunological, weight, and growth responses (all P < 0.001). However, lower pre-ART CD4% predicted better immunological response in the United Kingdom/Ireland but poorer response in Kampala (heterogeneity P = 0.004). Although 70% of children in both cohorts had suppressed < 400 copies/mL at 6 months, adolescents starting ART in the United Kingdom/Ireland had somewhat poorer VL responses than those in Kampala (P = 0.15). CONCLUSIONS: Overall immunological and virologic ART responses were similar in children in both cohorts. Poorer CD4 recovery in more immunosuppressed Kampala children and blunted growth responses likely reflect higher background malnutrition and infection rates in Uganda, suggesting the need for earlier HIV diagnosis, nutritional support, cotrimoxazole prophylaxis, and ART.

Muyingo SK, Walker AS, Reid A, Munderi P, Gibb DM, Ssali F, Levin J, Katabira E, Gilks C, Todd J, DART Trial Team. 2008. Patterns of individual and population-level adherence to antiretroviral therapy and risk factors for poor adherence in the first year of the DART trial in Uganda and Zimbabwe. J Acquir Immune Defic Syndr, 48 (4), pp. 468-475.

BACKGROUND: Good adherence is essential for successful antiretroviral therapy (ART) provision, but simple measures have rarely been validated in Africa. METHODS: This was an observational analysis of an open multicenter randomized HIV/AIDS management trial in Uganda and Zimbabwe. At 4-weekly clinic visits, ART drugs were provided and adherence measured through pill usage and questionnaire. Viral load response was assessed in a subset of patients. Drug possession ratio (percentage of drugs taken between visits) defined complete (100%) and good (>or=95%) adherence. RESULTS: In 2,957 patients, 90% had pill counts at every visit. Good adherence increased from 87%, 4 weeks after ART initiation, to 94% at 48 weeks, but only 1,454 (49%) patients achieved good adherence at every visit in the first year. Complete adherence was associated with 0.32 greater reduction in log10 viral load (95% confidence interval 0.05, 0.60; P = 0.02) and was independently associated with higher baseline CD4 count, starting ART later in the trial, reporting a single regular sexual partner, clinical center, and time on ART. CONCLUSIONS: Population level adherence improved over time, suggesting an association with clinical experience. Most patients had at least one visit in the year on which they reported not having good adherence, showing the need for continued adherence interventions.
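The drug possession ratio defined above is a simple pill-count calculation over each between-visit interval. A minimal sketch with hypothetical numbers (the exact handling of over-100% ratios and missed visits in the trial is not specified here):

```python
def drug_possession_ratio(pills_dispensed, pills_returned, expected_doses):
    """Percentage of expected doses covered by pills actually used
    between two clinic visits, from dispensed and returned counts."""
    pills_taken = pills_dispensed - pills_returned
    return 100.0 * pills_taken / expected_doses

# Hypothetical 28-day interval, twice-daily dosing (56 expected doses):
dpr = drug_possession_ratio(60, 4, 56)
print(dpr)  # 100.0 -> 'complete' adherence; >= 95 would count as 'good'
```

Classifying each interval against the 95% and 100% thresholds then yields the per-visit adherence categories summarised in the abstract.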

Burman WJ, Cotton MF, Gibb DM, Walker AS, Vernon AA, Donald PR. 2008. Ensuring the involvement of children in the evaluation of new tuberculosis treatment regimens. PLoS Med, 5 (8), pp. e176.

We are on the threshold of revolutionary improvements in the treatment of tuberculosis. Within five to ten years, it is likely that highly effective three-month regimens will be available to treat both active and latent drug-susceptible tuberculosis. New drug classes that have the potential to dramatically improve the treatment of multidrug-resistant tuberculosis are entering clinical trials. Children have the same right as adults to benefit from research with these new treatments. By making a deliberate choice to avoid the path of least resistance, we can ensure that both adults and children benefit from these advances in tuberculosis treatment.




Collaboration of Observational HIV Epidemiological Research Europe (COHERE) Study Group, Sabin CA, Smith CJ, d'Arminio Monforte A, Battegay M, Gabiano C, Galli L, Geelen S, Gibb D, Guiguet M et al. 2008. Response to combination antiretroviral therapy: variation by age. AIDS, 22 (12), pp. 1463-1473. | Show Abstract | Read more

OBJECTIVE: To provide information on responses to combination antiretroviral therapy in children, adolescents and older HIV-infected persons. DESIGN AND SETTING: Multicohort collaboration of 33 European cohorts. SUBJECTS: Forty-nine thousand nine hundred and twenty-one antiretroviral-naive individuals starting combination antiretroviral therapy from 1998 to 2006. OUTCOME MEASURES: Time from combination antiretroviral therapy initiation to HIV RNA less than 50 copies/ml (virological response), CD4 increase of more than 100 cells/microl (immunological response) and new AIDS/death were analysed using survival methods. Ten age strata were chosen: less than 2, 2-5, 6-12, 13-17, 18-29, 30-39 (reference group), 40-49, 50-54, 55-59 and 60 years or older; those aged 6 years or more were included in multivariable analyses. RESULTS: The four youngest age groups had 223, 184, 219 and 201 individuals and the three oldest age groups had 2693, 1656 and 1613 individuals. Precombination antiretroviral therapy CD4 cell counts were highest in young children and declined with age. By 12 months, 53.7% (95% confidence interval: 53.2-54.1%) and 59.2% (58.7-59.6%) had experienced a virological and immunological response, respectively. The probability of virological response was lower in those aged 6-12 (adjusted hazard ratio: 0.87) and 13-17 (0.78) years, but was higher in those aged 50-54 (1.24), 55-59 (1.24) and at least 60 (1.18) years. The probability of immunological response was higher in children and younger adults and reduced in those 60 years or older. Those aged 55-59 and 60 years or older had poorer clinical outcomes after adjusting for the latest CD4 cell count. CONCLUSION: Better virological responses but poorer immunological responses in older individuals, together with low precombination antiretroviral therapy CD4 cell counts, may place this group at increased clinical risk. The poorer virological responses in children may increase the likelihood of emergence of resistance.

Bone I, Belton L, Walker AS, Darbyshire J. 2008. Intraventricular pentosan polysulphate in human prion diseases: an observational study in the UK. Eur J Neurol, 15 (5), pp. 458-464. | Show Abstract | Read more

BACKGROUND, PURPOSE AND METHODS: This observational study assessed the effect of continuous intraventricular infusion of pentosan polysulphate (PPS) in seven patients at different clinical centres in the UK. RESULTS: Complications of intraventricular catheterization were frequent. PPS was well-tolerated over a wide dose range (11-110 microg/kg/day) during the 6-month study. Four patients were assessed for the entire study period: one remained stable, two showed minimal deterioration and one progressed significantly. CONCLUSION: Mean survival of all patients was longer than reported values for natural history of specific prion disorders. Possible reasons for these findings are explored.

Reid A, Stöhr W, Walker AS, Williams IG, Kityo C, Hughes P, Kambugu A, Gilks CF, Mugyenyi P, Munderi P et al. 2008. Severe renal dysfunction and risk factors associated with renal impairment in HIV-infected adults in Africa initiating antiretroviral therapy. Clin Infect Dis, 46 (8), pp. 1271-1281. | Show Abstract | Read more

BACKGROUND: We sought to investigate renal function in previously untreated symptomatic human immunodeficiency virus (HIV)-infected adults with CD4+ cell counts of <200 cells/mm3 who were undergoing antiretroviral therapy (ART) in Africa. METHODS: The study was an observational analysis within a randomized trial of ART management strategies that included 3316 participants with baseline serum creatinine levels of ≤360 micromol/L. Creatinine levels were measured before ART initiation, at weeks 4 and 12 of therapy, and every 12 weeks thereafter. We calculated estimated glomerular filtration rate (eGFR) using the Cockcroft-Gault formula. We analyzed the incidence of severely decreased eGFR (<30 mL/min/1.73 m2) and changes in eGFR to 96 weeks, considering demographic data, type of ART, and baseline biochemical and hematological characteristics as predictors, using random-effects models. RESULTS: Sixty-five percent of the participants were women. Median values at baseline were as follows: age, 37 years; weight, 57 kg; CD4+ cell count, 86 cells/mm3; and eGFR, 89 mL/min/1.73 m2. Of the participants, 1492 (45%) had mild (≥60 but <90 mL/min/1.73 m2) and 237 (7%) had moderate (≥30 but <60 mL/min/1.73 m2) impairments in eGFR. First-line ART regimens included zidovudine-lamivudine plus tenofovir disoproxil fumarate (for 74% of patients), nevirapine (16%), and abacavir (9%) (mostly nonrandomized allocation). After ART initiation, the median eGFR was 89-91 mL/min/1.73 m2 for the period from week 4 through week 96. Fifty-two participants (1.6%) developed severe reductions in eGFR by week 96; there was no statistically significant difference between these patients and others with respect to first-line ART regimen received (P = 0.94).
Lower baseline eGFR or hemoglobin level, lower body mass index, younger age, higher baseline CD4+ cell count, and female sex were associated with greater increases in eGFR over baseline, with small but statistically significant differences between regimens (P < 0.001 for all). CONCLUSIONS: Despite screening, mild-to-moderate baseline renal impairment was relatively common, but these participants had the greatest increases in eGFR after starting ART. Severe eGFR impairment was infrequent regardless of ART regimen and was generally related to intercurrent disease. Differences between ART regimens with respect to changes in eGFR through 96 weeks were of marginal clinical relevance, but investigating longer-term nephrotoxicity remains important.

Foster D, Knox K, Walker AS, Griffiths DT, Moore H, Haworth E, Peto T, Brueggemann AB, Crook DW, Oxford Invasive Pneumococcal Surveillance Group. 2008. Invasive pneumococcal disease: epidemiology in children and adults prior to implementation of the conjugate vaccine in the Oxfordshire region, England. J Med Microbiol, 57 (Pt 4), pp. 480-487. | Show Abstract | Read more

A 10-year invasive pneumococcal disease (IPD) enhanced surveillance project in the Oxfordshire region of the UK between 1996 and 2005 identified a total of 2691 Streptococcus pneumoniae isolates from all ages that provided a comprehensive description of pneumococcal epidemiology. All isolates were serotyped and those from children under 5 years of age were genotyped and a matched case-control study using adults hospitalized between 1995 and 2000 was performed to estimate the effectiveness of the pneumococcal polysaccharide vaccine in the local population. Fifty-one serotypes were isolated, with different age distributions. The overall incidence of IPD was 9.2 cases per 100,000 population per annum [95% confidence interval (CI) 8.6-9.9] and that of meningitis was 0.7 per 100,000 population per annum (95% CI 0.5-0.9). After adjusting for age, serotype 1 was found to be less likely to be associated with meningitis versus other IPD, compared with the most common serotype 14, whereas serotype 12F was more likely to cause meningitis than other IPD. There were significant temporal changes in IPD incidence of four serotypes, with decreases in serotypes 1, 12F and 14 and increases in serotype 8. A possible novel variant (from serotype 6A to 6B) was found using multilocus sequence typing analysis. From the matched case-control study of adults, the pneumococcal polysaccharide vaccine effectiveness was estimated to be 43% (2-68%), which did not change significantly after adjustment for pre-existing co-morbidities. The data provide a baseline against which the impact of the pneumococcal conjugate vaccine introduced in the UK in 2006 could be measured.

Ryan M, Griffin S, Chitah B, Walker AS, Mulenga V, Kalolo D, Hawkins N, Merry C, Barry MG, Chintu C et al. 2008. The cost-effectiveness of cotrimoxazole prophylaxis in HIV-infected children in Zambia. AIDS, 22 (6), pp. 749-757. | Show Abstract | Read more

OBJECTIVE: To assess the cost-effectiveness of cotrimoxazole prophylaxis in HIV-infected children in Zambia, as implementation at the local health centre level has yet to be undertaken in many resource-limited countries despite recommendations in recent updated World Health Organization (WHO) guidelines. DESIGN: A probabilistic decision analytical model of HIV/AIDS progression in children based on the CD4 cell percentage (CD4%) was populated with data from the placebo-controlled Children with HIV Antibiotic Prophylaxis trial that had reported a 43% reduction in mortality with cotrimoxazole prophylaxis in HIV-infected children aged 1-14 years. METHODS: Unit costs (US$ in 2006) were measured at University Teaching Hospital, Lusaka. Cost-effectiveness, expressed as cost per life-year saved, cost per quality-adjusted life-year (QALY) saved, and cost per disability-adjusted life-year (DALY) averted, was calculated across a number of different scenarios at tertiary and primary healthcare centres. RESULTS: Cotrimoxazole prophylaxis was associated with incremental cost-effectiveness ratios (ICERs) of US$72 per life-year saved, US$94 per QALY saved and US$53 per DALY averted, i.e. substantially less than a cost-effectiveness threshold of US$1019 per outcome (gross domestic product per capita, Zambia 2006). ICERs of US$5 or less per outcome demonstrate that cotrimoxazole prophylaxis is even more cost-effective at the local healthcare level. The intervention remained cost-effective in all sensitivity analyses including routine haematological and CD4% monitoring, varying starting age, AIDS status, cotrimoxazole formulation, efficacy duration and discount rates. CONCLUSION: Cotrimoxazole prophylaxis in HIV-infected children is an inexpensive low technology intervention that is highly cost-effective in Zambia, strongly supporting the adoption of WHO guidelines into essential healthcare packages in low-income countries.
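An ICER is simply the extra cost of the intervention divided by its extra benefit relative to the comparator, judged against a willingness-to-pay threshold. A minimal sketch; the costs and life-years below are hypothetical illustrations, not the trial's figures (only the US$1019 threshold is taken from the abstract):

```python
def icer(cost_intervention, cost_comparator, effect_intervention, effect_comparator):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect
    (e.g. US$ per life-year saved)."""
    return (cost_intervention - cost_comparator) / (effect_intervention - effect_comparator)

# hypothetical: prophylaxis costs $500 vs $200 standard care, yields 10.0 vs 5.8 life-years
ratio = icer(500.0, 200.0, 10.0, 5.8)
threshold = 1019.0  # GDP per capita, Zambia 2006 (US$), the threshold used in the paper
cost_effective = ratio <= threshold
```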

L'homme RFA, Kabamba D, Ewings FM, Mulenga V, Kankasa C, Thomason MJ, Walker AS, Chintu C, Burger DM, Gibb DM. 2008. Nevirapine, stavudine and lamivudine pharmacokinetics in African children on paediatric fixed-dose combination tablets. AIDS, 22 (5), pp. 557-565. | Show Abstract | Read more

OBJECTIVE: Triomune Baby and Junior have been developed in response to the urgent need for appropriate paediatric fixed-dose combination antiretroviral tablets, with higher nevirapine to stavudine and lamivudine ratios than adult tablets, in accordance with paediatric recommendations. We determined whether this ratio results in optimal exposure in the target population. METHODS: Seventy-one Zambian children were treated with Triomune Baby or Junior dosed according to weight bands. After 4 weeks or more, a 12-h pharmacokinetic curve was recorded. Antiretroviral plasma concentrations were assayed by high-performance liquid chromatography. RESULTS: Six children were excluded because of poor adherence. Of the remaining 65, 24 (37%) were female, 24 (37%) weighed less than 15 kg and most were malnourished. Mean (range) nevirapine C12h, Cmax and AUC12h of 6.0 (1.4, 16.9) mg/l, 10.0 (3.8, 22.5) mg/l and 94.4 (32.1, 232) mg/l per hour were higher than those reported in adults. Nevirapine C12h was subtherapeutic (< 3.0 mg/l) in four children (6%). Mean stavudine and lamivudine C12h, Cmax, AUC12h (< 0.015 mg/l, 0.45 mg/l, 1.05 mg/l per hour and 0.09 mg/l, 1.33 mg/l, 5.42 mg/l per hour) were comparable to adults. There was no evidence of a difference in nevirapine AUC12h across weight bands (P = 0.2), whereas the difference in stavudine (P = 0.0003) and lamivudine AUC12h (P = 0.01) was driven by the single weight band with unequal dosing. CONCLUSION: Nevirapine concentrations were higher but more variable than in adults; the pharmacokinetic parameters of stavudine and lamivudine were comparable to adults. As nevirapine underdosing is of greater concern than overdosing, the Triomune Baby and Junior ratio appears to be appropriate for children weighing 6 kg and over. Further research is required for children under 6 kg.

Miller R, Esmail H, Peto T, Walker S, Crook D, Wyllie D. 2008. Is MRSA admission bacteraemia community-acquired? A case control study. J Infect, 56 (3), pp. 163-170. | Show Abstract | Read more

OBJECTIVES: To compare characteristics of methicillin resistant Staphylococcus aureus (MRSA) and methicillin susceptible S. aureus (MSSA) bacteraemia detected on admission to a UK hospital and to determine whether these organisms are community-acquired. METHODS: Consecutive cases of MRSA bacteraemia admitted to general medicine between 2003 and 2006 were identified and compared to MSSA age-matched and unmatched controls (35, 35 and 34 patients, respectively). Demographics, MRSA risk factors, previous health-care contact and clinical presentation were compared using patient notes. Multi-locus sequence typing was performed. RESULTS: 34/35 strains of admission MRSA bacteraemia were the health-care associated Sequence Types (ST)-22 (77%) or ST-36 (21%), whereas 20 different MSSA strains were identified. No MRSA cases fitted the CDC definition of community-acquired MRSA. Compatible with health-care associated acquisition, after matching for age MRSA cases had significantly higher levels of previous hospital exposure than MSSA controls, and more co-morbidities. Notably, 63% of MRSA cases were admitted from their own home, as opposed to secondary care facilities. Clinical presentation of MRSA and MSSA bacteraemias was similar. CONCLUSIONS: MRSA strains associated with health-care were responsible for almost all cases of MRSA bacteraemia on admission to hospital during the period studied. Despite this the majority of cases with MRSA admission bacteraemia were admitted from their own homes. Further research is needed into the determinants of MRSA bacteraemia among patients outside hospital.

Porter K, Walker S, Hill T, Anderson J, Leen C, Johnson M, Gazzard B, Walsh J, Fisher M, Orkin C et al. 2008. Changes in outcome of persons initiating highly active antiretroviral therapy at a CD4 count less than 50 Cells/mm3. J Acquir Immune Defic Syndr, 47 (2), pp. 202-205. | Show Abstract | Read more

BACKGROUND: Although HIV treatment guidelines recommend highly active antiretroviral therapy (HAART) initiation before reaching a CD4 count of 200 cells/mm3, many people in resource-rich settings, and a substantial proportion in resource-limited settings, present at levels <50 cells/mm3. METHODS: Using UK Collaborative HIV Cohort data, we assessed virologic response to HAART for antiretroviral-naive persons initiating therapy at a CD4 count <50 cells/mm3. We also investigated changes in the probability of having a viral level <400 copies/mL at 48 weeks over calendar time adjusting for gender, age, exposure category, ethnicity, baseline CD4 count and viral load, and whether the regimen contained a protease inhibitor. RESULTS: At 12, 24, 36, and 48 weeks, 80%, 83%, 85%, and 83% of participants, respectively, had a viral level <400 copies/mL. This proportion rose from 1997 to 1998, falling slightly in the most recent calendar period. By far the most important predictor of virologic suppression was calendar year of starting HAART (odds ratio [OR] = 2.49, 4.28, and 3.28 for 1999 to 2000, 2001 to 2002, and 2003 to 2005, respectively, compared with 1997 to 1998). Women were more likely to have a viral level <400 copies/mL at week 48 compared with men (OR = 1.74, 95% confidence interval [CI]: 1.07 to 3.02), as were older individuals (OR = 1.46, 95% CI: 1.11 to 1.96 for every 10 years older). There was marginal or no evidence that other factors were associated with outcome. The estimated corresponding probabilities of achieving a viral level <50 copies/mL at week 48 were 71%, 75%, and 79% for a woman aged 25, 35, and 45 years, respectively, initiating HAART in the most recent calendar period. The respective probabilities for a man at those ages were 68%, 73%, and 78%. 
CONCLUSIONS: These data, albeit under conditions of good infrastructure for care delivery, are a useful comparator for other populations starting therapy at similar levels of immunodeficiency and may be valuable for evaluating the success of antiretroviral therapy rollout programs.

Bhaskaran K, Mussini C, Antinori A, Walker AS, Dorrucci M, Sabin C, Phillips A, Porter K, CASCADE Collaboration. 2008. Changes in the incidence and predictors of human immunodeficiency virus-associated dementia in the era of highly active antiretroviral therapy. Ann Neurol, 63 (2), pp. 213-221. | Show Abstract | Read more

OBJECTIVE: Though effective anti-human immunodeficiency virus (HIV) therapies are now available, they have variable penetration into the brain. We therefore aimed to assess changes over calendar time in the risk for HIV-associated dementia (HIV-D), and factors associated with HIV-D risk. METHODS: Using Concerted Action on Seroconversion to AIDS and Death in Europe (CASCADE) data, we analyzed factors associated with time from HIV seroconversion to HIV-D using Cox models with time-updated covariates. The effect of duration of infection was explored using flexible parametric survival models. RESULTS: 222 of 15,380 seroconverters developed HIV-D. The incidence per 1,000 person-years was 6.49 pre-1997 (before highly active antiretroviral therapy was available), declining to 0.66 by 2003 to 2006. Compared with most recent CD4 count ≥350 cells/mm3, the adjusted relative risk (95% confidence interval) of HIV-D was 3.47 (1.91-6.28), 10.19 (5.72-18.15), and 39.03 (22.96-66.36) at 200 to 349, 100 to 199, and 0 to 99 cells/mm3, respectively. In 2003 to 2006, older age at seroconversion (relative risk = 3.24 per 10-year increase [95% confidence interval, 2.00-5.24]) and previous acquired immune deficiency syndrome diagnosis (relative risk = 4.92 [95% confidence interval, 1.43-16.92]) were associated with HIV-D risk, independently of current CD4 count. HIV-D risk appeared to increase during chronic infection, by 48% at 10 years after seroconversion compared with the lowest risk at 1.8 years. INTERPRETATION: HIV-D incidence has reduced markedly since 1997. However, patients with low (<200 cells/mm3) or even intermediate (200-349 cells/mm3) CD4 counts, previous acquired immune deficiency syndrome diagnosis, longer HIV infection duration, and older age at seroconversion are at increased risk and should be closely monitored for neurocognitive disorders.

DART Trial Team. 2008. Fixed duration interruptions are inferior to continuous treatment in African adults starting therapy with CD4 cell counts < 200 cells/microl. AIDS, 22 (2), pp. 237-247. | Show Abstract | Read more

BACKGROUND: Structured treatment interruption (STI) of antiretroviral therapy (ART) could potentially reduce cost and toxicity, but clinical efficacy requires evaluation. METHODS: An assessment of fixed-duration STI was nested in DART, a multicentre trial comparing strategies for monitoring ART in Uganda and Zimbabwe (ISRCTN 13968779). Of 3316 ART-naive symptomatic adults with CD4 cell count < 200 cells/microl at ART initiation, 813 with ≥300 cells/microl after 48 or 72 weeks underwent a second randomization to either STI, cycles of 12 weeks on/off (408), or continuous ART (CT; 405). RESULTS: Median age at STI/CT randomization was 37 years (range, 19-67) and CD4 cell count 358 cells/microl (range, 300-1054). A second review terminated the STI/CT randomisation on 15 March 2006, and participants changed to CT. Median follow-up was 51 weeks (range, 0-85): 99% and 50% of time was spent on ART in CT and STI, respectively. First new World Health Organization (WHO) stage 4 events or death occurred more frequently in STI (24; 6.4/100 person-years) than CT (9; 2.4/100 person-years) (hazard ratio, 2.73; 95% confidence interval, 1.27-5.88; P = 0.007); oesophageal candidiasis being the most frequent event (STI, 13; CT, 3). Nine (1%) participants died (STI, 5; CT, 4). There was no difference in time to first serious adverse event (P = 0.78), although ART change owing to toxicity occurred more with CT (10; 2.6/100 person-years) than with STI (2; 0.5/100 person-years) (P = 0.02). CONCLUSIONS: Although absolute rates of WHO stage 4 events/death were low, 12-week STIs initiated at a CD4 cell count ≥300 cells/microl resulted in a greater than twofold increased relative rate of disease progression compared with continuous therapy in adult Africans initiating ART with advanced disease, and cannot be recommended.

Stöhr W, Walker AS, Munderi P, Tugume S, Gilks CF, Darbyshire JH, Hakim J, DART Trial. 2008. Estimating glomerular filtration rate in HIV-infected adults in Africa: comparison of Cockcroft-Gault and Modification of Diet in Renal Disease formulae. Antivir Ther, 13 (6), pp. 761-770. | Show Abstract

BACKGROUND: Cockcroft-Gault (CG) and Modification of Diet in Renal Disease (MDRD) formulae are recommended for glomerular filtration rate (GFR) estimation, but neither has been validated or directly compared longitudinally in HIV-infected patients or in Africa. METHODS: We investigated differences between formulae in baseline GFR, GFR changes and incidence of impaired GFR after initiation of antiretroviral therapy (ART) in 3,316 HIV-infected adults in Africa, considering sex, age, body mass index and baseline laboratory parameters as predictors. RESULTS: Participants were 65% women, median age 36.8 years, median weight 56.7 kg. Baseline GFR was lower using CG (median 89 ml/min/1.73 m2, 7.4% <60 ml/min/1.73 m2) versus MDRD (103 ml/min/1.73 m2, 3.1% <60 ml/min/1.73 m2). At 36 weeks, median CG-GFR increased (92 ml/min/1.73 m2), whereas MDRD-GFR decreased (96 ml/min/1.73 m2). Weight (explicitly a factor in CG only) concurrently increased to 62.0 kg. GFR changes from weeks 36-96 (after weight stabilization) were similar across formulae. By 96 weeks, 56 patients developed severe GFR impairment (<30 ml/min/1.73 m2) using one or both formulae (both n=45, CG n=7, MDRD n=4) compared with only 24 by serum creatinine alone. Multivariate models identified different sets of predictors for each formula. CONCLUSIONS: Although severe GFR impairments are similarly classified by different formulae, moderate impairments were more frequently identified using CG-GFR versus MDRD-GFR (with Black ethnicity correction factor 1.21), and creatinine alone had low sensitivity. Given overestimation in underweight patients and sensitivity to weight changes, this MDRD formula might not necessarily be superior for monitoring ART in African HIV-infected adults.
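Both formulae are standard and easily computed from routine laboratory values. A sketch of the unadjusted versions, assuming serum creatinine in mg/dL; note the study additionally normalised Cockcroft-Gault to 1.73 m2 body surface area, which this sketch omits, and the constant 186 is the original 4-variable MDRD coefficient (the later IDMS-traceable version uses 175). The example patient values in the test are assumptions for illustration:

```python
def cockcroft_gault(age_years, weight_kg, scr_mg_dl, female):
    """Cockcroft-Gault creatinine clearance in mL/min (not normalised to 1.73 m2)."""
    crcl = (140.0 - age_years) * weight_kg / (72.0 * scr_mg_dl)
    return crcl * 0.85 if female else crcl

def mdrd_4v(age_years, scr_mg_dl, female, black):
    """4-variable MDRD eGFR in mL/min/1.73 m2, with the Black-ethnicity
    correction factor 1.21 quoted in the abstract."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.21
    return egfr
```

For a woman with the cohort's median age and weight (36.8 years, 56.7 kg) and an assumed creatinine of 0.8 mg/dL, Cockcroft-Gault gives a lower estimate than MDRD, consistent with the baseline medians reported above (89 vs 103 mL/min/1.73 m2).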

Thiébaut R, Walker S. 2008. When it is better to estimate a slope with only one point. QJM, 101 (10), pp. 821-824. | Show Abstract | Read more

When investigating the change in a biomarker, it is often believed that at least two measurements are needed from each participant, and that those with only one measurement should be excluded. In this short note, we explain why this could lead to imprecise and biased estimates. Furthermore, we discuss a standard statistical method that handles such issues.
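The intuition can be shown in a few lines: when the target is the population-average slope, a participant with a single measurement still contributes information, and pooling all observations recovers the slope without discarding anyone (a mixed model, as discussed in papers of this kind, would additionally account for within-subject correlation; here a plain pooled least-squares fit suffices to make the point). A minimal simulation sketch, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
times, ys = [], []
for _ in range(500):                    # 500 simulated participants
    b_i = rng.normal(0.0, 1.0)          # subject-specific random intercept
    n_obs = rng.integers(1, 4)          # 1-3 visits; many contribute only one point
    for t in rng.uniform(0.0, 2.0, size=n_obs):
        # biomarker: intercept 10, true population slope 0.5, measurement noise
        ys.append(10.0 + b_i + 0.5 * t + rng.normal(0.0, 0.5))
        times.append(t)

# pooled fit using every measurement, including single-measurement participants
slope, intercept = np.polyfit(times, ys, 1)
```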

Walker S, Peto TEA, O'Connor L, Crook DW, Wyllie D. 2008. Are there better methods of monitoring MRSA control than bacteraemia surveillance? An observational database study. PLoS One, 3 (6), pp. e2378. | Show Abstract | Read more

BACKGROUND: Despite a substantial burden of non-bacteraemic methicillin resistant Staphylococcus aureus (MRSA) disease, most MRSA surveillance schemes are based on bacteraemias. Using bacteraemia as an outcome, trends at hospital level are difficult to discern, due to random variation. We investigated rates of nosocomial bacteraemic and non-bacteraemic MRSA infection as surveillance outcomes. METHODS AND FINDINGS: We used microbiology and patient administration system data from an Oxford hospital to estimate monthly rates of first nosocomial MRSA bacteraemia, and nosocomial MRSA isolation from blood/respiratory/sterile site specimens ("sterile sites") or all clinical samples (screens excluded) in all patients admitted from the community for at least 2 days between April 1998 and June 2006. During this period there were 441 nosocomial MRSA bacteraemias, 1464 MRSA isolations from sterile sites, and 3450 isolations from clinical specimens (8% blood, 15% sterile site, 10% respiratory, 59% surface swabs, 8% urine) in over 2.6 million patient-days. The ratio of bacteraemias to sterile site and all clinical isolations was stable over this period (bacteraemia rates being around 3- and 8-fold lower, respectively), during which rates of nosocomial MRSA bacteraemia increased by 27% per year to July 2003 before decreasing by 18% per year thereafter (heterogeneity p<0.001). Trends in sterile site and all clinical isolations were similar. Notably, a change in rate of all clinical MRSA isolations in December 2002 could first be detected with conventional statistical significance by August 2003 (p = 0.03). In contrast, when monitoring MRSA bacteraemia, identification of probable changes in trend took longer, first achieving p<0.05 in July 2004. CONCLUSIONS: MRSA isolation from all sites of suspected infection, including bacteraemic and non-bacteraemic isolation, is a potential new surveillance method for MRSA control.
It occurs about 8 times more frequently than bacteraemia, allowing robust statistical determination of changing rates over substantially shorter times or smaller areas than using bacteraemia as an outcome.

Walker AS, Spiegelhalter D, Crook DW, Wyllie D, Morris J, Peto TEA. 2008. Fairness of financial penalties to improve control of Clostridium difficile. BMJ, 337 (nov20 2), pp. a2097. | Read more

DART Trial Team. 2008. Twenty-four-week safety and tolerability of nevirapine vs. abacavir in combination with zidovudine/lamivudine as first-line antiretroviral therapy: a randomized double-blind trial (NORA). Trop Med Int Health, 13 (1), pp. 6-16. | Show Abstract | Read more

OBJECTIVE: To compare the safety/tolerability of abacavir and nevirapine in HIV-infected adults starting antiretroviral (ARV) therapy in Uganda. METHODS: Twenty-four-week randomized double-blind trial conducted with 600 symptomatic ARV-naive adults with CD4 <200 cells/mm(3) allocated to zidovudine/lamivudine plus 300 mg abacavir (A) and nevirapine placebo (n = 300) or 200 mg nevirapine (N) and abacavir placebo (n = 300) twice daily. The primary endpoint was any serious adverse event (SAE) definitely/probably or uncertain whether related to blinded nevirapine/abacavir. Secondary endpoints were adverse events leading to permanent discontinuation of blinded nevirapine/abacavir, and grade 4 events. RESULTS: Seventy-two per cent of participants were women; 19% had WHO stage 4 disease; the median age was 37 years (range 18-66); the median baseline CD4 count was 99 cells/mm(3) (1-199). Ninety-five per cent completed 24 weeks: 4% died and 1% were lost to follow-up. Thirty-seven SAEs occurred on blinded drug in 36 participants. Twenty events [6 (2.0%) abacavir, 14 (4.7%) nevirapine participants] were considered serious adverse reactions definitely/probably/uncertain whether related to blinded abacavir/nevirapine [HR = 0.42 (95% CI 0.16-1.09); P = 0.06]. Only 2.0% of abacavir participants [six patients (0.7-4.3%)] experienced a suspected hypersensitivity reaction (HSR). In total 14 (4.7%) abacavir and 30 (10.0%) nevirapine participants discontinued blinded abacavir/nevirapine (P = 0.02): because of toxicity (6A, 15N; P = 0.07, all rash/possible HSR and/or hepatotoxicity), anti-tuberculosis therapy (6A, 13N), or for other reasons (2A, 2N). CONCLUSIONS: There was a trend towards a lower rate of serious adverse reactions in Ugandan adults with low CD4 starting ARV regimens with abacavir than with nevirapine. This suggests that abacavir could be used more widely in resource-limited settings without major safety concerns.

Wyllie DH, Walker AS, Peto TEA, Crook DW. 2007. Hospital exposure in a UK population, and its association with bacteraemia. J Hosp Infect, 67 (4), pp. 301-307. | Show Abstract | Read more

Despite the importance of healthcare-associated infection, few studies have quantified the association between severe infection and hospital exposure in UK populations. Our aim was to estimate the proportion of the population with recent hospital admission, together with rates of infection in hospital-exposed and hospital-naïve populations. We studied bacteraemia as a marker of severe infection in a population of 550,000, served by two hospitals, between 1 April 2000 and 31 March 2005. Hospital-exposed persons accounted for 8.3% of the population, defined as having been resident in a hospital in the last year. The hospital-exposed population accounted for 55% of all admissions, and 42% of emergency admissions to medical, paediatric or surgery departments. After adjustment for age, the hospital-exposed group had much higher rates of admission bacteraemia. Age-standardised incidence rate ratios relative to hospital-naïve patients were 43 [95% confidence interval (CI): 22-85] for meticillin-resistant Staphylococcus aureus (MRSA), 20 (15-27) for S. aureus other than MRSA, 7.3 (5.2-10) for Streptococcus pneumoniae, and 14 (11-18) for Escherichia coli. MRSA was common among hospital-exposed admissions, including emergencies; in hospital-exposed men, rates of admission MRSA bacteraemia (31 per 100,000 per annum) and S. pneumoniae bacteraemia (33 per 100,000 per annum) were similar. This quantitative analysis confirms that prior hospital admission is a major risk factor for bacteraemia on hospital admission; it is unclear whether acquisition of pathogens in hospital, co-morbidity or other factors explain this.
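Age-standardised incidence rate ratios of this kind come from direct standardisation: compute each group's stratum-specific rates, average them with a common set of standard-population weights, and take the ratio of the two weighted sums. A sketch with hypothetical stratum counts and weights, not the study's data:

```python
import numpy as np

# hypothetical age strata: standard-population weights, plus cases and
# person-years in the hospital-exposed and hospital-naive groups
std_weights   = np.array([0.3, 0.4, 0.3])          # must sum to 1
cases_exposed = np.array([2, 10, 40])
py_exposed    = np.array([5e3, 8e3, 6e3])
cases_naive   = np.array([1, 4, 10])
py_naive      = np.array([6e4, 9e4, 5e4])

def std_rate(cases, person_years, weights):
    """Directly age-standardised rate: weighted average of stratum rates."""
    return float(np.sum(weights * cases / person_years))

# age-standardised incidence rate ratio, exposed vs naive
irr = std_rate(cases_exposed, py_exposed, std_weights) / std_rate(cases_naive, py_naive, std_weights)
```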

Walker AS, Mulenga V, Ford D, Kabamba D, Sinyinza F, Kankasa C, Chintu C, Gibb DM, CHAP Team. 2007. The impact of daily cotrimoxazole prophylaxis and antiretroviral therapy on mortality and hospital admissions in HIV-infected Zambian children. Clin Infect Dis, 44 (10), pp. 1361-1367. | Show Abstract | Read more

BACKGROUND: Data on the population effectiveness of cotrimoxazole prophylaxis and antiretroviral therapy (ART) in human immunodeficiency virus (HIV)-infected African children are few. METHODS: A total of 534 Zambian children with HIV infection were randomized to receive daily cotrimoxazole prophylaxis or placebo in the Children with HIV Antibiotic Prophylaxis trial. Following trial closure, children who received the placebo initiated cotrimoxazole prophylaxis, and all children were observed in a closed cohort. Mortality and hospital admission rates were compared, over calendar time, in 9-month periods: trial recruitment (March 2001 to April 2002, May 2002 to January 2003), trial follow-up to closure (February 2003 to October 2003), initial follow-up posttrial (November 2003 to July 2004), and early and later ART availability (August 2004 to April 2005, and May 2005 to May 2006, respectively). RESULTS: A total of 546 child-years of follow-up, 40 deaths, and 80 hospital admissions were observed between the time of trial closure and June 2006. A total of 117 of 283 children who were alive at trial closure received ART in the posttrial period (median child age at first use of ART, 8.8 years). Rates decreased in both groups during the trial period, suggesting a survivorship effect. Mortality and hospital admission rates before trial closure were 14 (95% confidence interval [CI], 9-21) deaths per 100 child-years and 24 (95% CI, 15-39) hospital admissions per 100 child-years, respectively, for children who were receiving cotrimoxazole, and were 23 (95% CI, 16-34) deaths per 100 child-years and 35 (95% CI, 23-53) hospital admissions per 100 child-years, respectively, for children who were receiving the placebo. 
After trial closure, rates remained stable in the cotrimoxazole group, but decreased to 15 (95% CI, 8-26) deaths per 100 child-years and 19 (95% CI, 10-41) hospital admissions per 100 child-years, respectively, for the group of children who received placebo and then initiated cotrimoxazole prophylaxis. In both groups combined, mortality rates decreased to 6 (95% CI, 3-11) deaths per 100 child-years and then 2 (95% CI, 0.8-6) deaths per 100 child-years during periods of ART availability; hospital admission rates decreased to 17 (95% CI, 11-27) hospital admissions per 100 child-years and 8 (95% CI, 4-15) hospital admissions per 100 child-years, respectively. CONCLUSION: The benefits of once-daily cotrimoxazole prophylaxis continued throughout the trial and after trial closure. Mortality and hospital admissions decreased (by approximately 6-fold and approximately 3-fold, respectively) following ART availability, similar to findings observed in resource-rich countries.

Green H, Gibb DM, Walker AS, Pillay D, Butler K, Candeias F, Castelli-Gattinara G, Compagnucci A, Della Negra M, de Rossi A et al. 2007. Lamivudine/abacavir maintains virological superiority over zidovudine/lamivudine and zidovudine/abacavir beyond 5 years in children. AIDS, 21 (8), pp. 947-955. | Show Abstract | Read more

OBJECTIVE: To describe the long-term efficacy over 5 years of regimens including combinations of abacavir, lamivudine and/or zidovudine in previously untreated children in the PENTA 5 trial. DESIGN: PENTA 5 was a 48-week randomised controlled trial comparing three dual nucleoside reverse transcriptase inhibitor (NRTI) combinations as part of first triple antiretroviral therapy (ART). METHODS: 128 ART-naïve children were randomised to zidovudine/lamivudine (n = 36), zidovudine/abacavir (45) or lamivudine/abacavir (47). Asymptomatic children (n = 55) were also randomised to nelfinavir or placebo; all other children received open-label nelfinavir. Analyses are intent-to-treat and adjusted for minor baseline imbalances and receipt of nelfinavir/placebo. RESULTS: Median follow-up was 5.8 years. By 5 years, 17 (47%), 28 (64%) and 18 (39%) children had changed their randomised NRTIs in the zidovudine/lamivudine, zidovudine/abacavir and lamivudine/abacavir groups respectively, but 18%, 50% and 50% of these changes were either early single drug substitutions for toxicity or switches with viral suppression (HIV-1 RNA < 400 copies/ml; e.g. to simplify regimen delivery). At 5 years, 55%/32% zidovudine/lamivudine, 50%/25% zidovudine/abacavir and 79%/63% lamivudine/abacavir had HIV-1 RNA < 400/< 50 copies/ml respectively (p = 0.03/p = 0.003). Mean increase in height-for-age was 0.42, 0.68 and 1.05 respectively (p = 0.02); in weight-for-age, 0.03, 0.13 and 0.75 (p = 0.02). Reverse transcriptase resistance mutations emerging on therapy differed between the groups: zidovudine/lamivudine (M41L, D67N, K70R, M184V, L210W, T215Y); zidovudine/abacavir (M41L, D67N, K70R, L210W, T215F/Y, K219Q); lamivudine/abacavir (K65R, L74V, Y115F, M184V). CONCLUSIONS: Five-year data demonstrate that lamivudine/abacavir is more effective in terms of HIV-1 RNA suppression and growth changes, with lower rates of switching with detectable HIV-1 RNA than zidovudine/lamivudine or zidovudine/abacavir, and should be preferred as first-line NRTI backbone.

Bond SJ, White IR, Sarah Walker A. 2007. Instrumental variables and interactions in the causal analysis of a complex clinical trial. Stat Med, 26 (7), pp. 1473-1496. | Show Abstract | Read more

We consider the application of instrumental variable techniques in a longitudinal clinical trial in paediatric HIV/AIDS, with a substantial degree of non-compliance to randomized treatment (nelfinavir versus placebo) and with left censoring of the outcome variable (HIV RNA concentration). We consider in detail the assumptions and implications behind the inclusion and exclusion of interactions between randomized arm and baseline covariates in modelling actual treatment received, and between treatment and baseline covariates in modelling outcome. Estimated treatment effects were sensitive to inclusion of interactions, and we show how such sensitivity can be explored and explained.

Kikaire B, Khoo S, Walker AS, Ssali F, Munderi P, Namale L, Reid A, Gibb DM, Mugyenyi P, Grosskurth H, DART Trial Team. 2007. Nevirapine clearance from plasma in African adults stopping therapy: a pharmacokinetic substudy. AIDS, 21 (6), pp. 733-737. | Show Abstract | Read more

OBJECTIVE: To measure nevirapine elimination in African adults undertaking a structured treatment interruption (STI) in the DART trial. DESIGN: Cohort (16 women, 5 men; median weight 61 kg) within a randomized trial of management strategies. METHODS: Plasma nevirapine was measured by validated high performance liquid chromatography at 0, 1, 2, 3 and 4 weeks after stopping the drug in a subset of patients undertaking an STI. All patients continued lamivudine plus zidovudine/stavudine for a further 7 days. RESULTS: Two patients with no or low plasma nevirapine concentration at baseline were excluded. Geometric mean plasma concentration when nevirapine was stopped in the remaining 19 patients was 6421 ng/ml (range, 3724-9473). Nevirapine was detected in 15/18 (83%) patients at 1 week, and 5/19 (26%) patients at 2 weeks, but was not found in any samples collected after 2 weeks. Only one patient had > 100 ng/ml (limit of quantification) at 2 weeks (415 ng/ml, female). The median times to reach thresholds of 200, 100 and 20 ng/ml (limit of detection) were estimated to be 7.6 [interquartile range (IQR), 7.0-10.1], 9.3 (IQR, 8.7-13.0) and 13.2 (IQR, 12.3-18.4) days, respectively, with 3/19 (16%) and 14/19 (74%) estimated to have reached < 20 ng/ml by 7 and 14 days, respectively. CONCLUSION: Although elimination of nevirapine was faster than previously published after a single dose, the data suggest that an additional staggered period of 7-10 days with dual nucleoside reverse transcriptase inhibitor cover is necessary for African patients discontinuing nevirapine.

Lee KJ, Shingadia D, Pillay D, Walker AS, Riordan A, Menson E, Duong T, Tudor-Williams G, Gibb DM, Collaborative HIV Paediatric Study Steering Committee. 2007. Transient viral load increases in HIV-infected children in the U.K. and Ireland: what do they mean? Antivir Ther, 12 (6), pp. 949-956. | Show Abstract

OBJECTIVES: To investigate transient increases in viral load during sustained suppression in children in the UK and Ireland Collaborative HIV Paediatric Study (CHIPS). DESIGN: Cohort of HIV-infected children from 39 centres. METHODS: Transient viraemia was defined as > or =1 detectable viral loads (> or =50 copies/ml) between two undetectable values (<50 copies/ml) <280 days apart, during a period of sustained viral suppression (from a confirmed level of <50 copies/ml until the last undetectable measurement before antiretroviral therapy change or until a confirmed level of >50 copies/ml). RESULTS: Of 595 children initiating HAART without previous treatment, 347 (58%) achieved sustained suppression. Of these, 78 (23%) experienced 109 episodes of transient viraemia (median 134 copies/ml); 92 (84%) had levels of <1000 copies/ml (maximum 39,839). Transient viraemia was more common during second-line therapy (25/100 child-years [CY]) and following a previous episode (19/100 CY) compared with first-line therapy without a previous episode (11/100 CY). Rates decreased with age at HAART initiation (incidence rate ratio [IRR] 0.95 per year older; P = 0.05), but were higher in those suppressed for longer (IRR 1.63 in those suppressed for > or =1 year versus <1 year; P = 0.03). CD4+ and CD8+ T-cell counts were similar before and after transient viraemia. Of detectable viral loads during periods of suppression, 44% were transient increases rather than virological failure: experiencing transient viraemia did not increase subsequent virological failure (P = 0.20). CONCLUSIONS: Transient viraemia is relatively common among children on HAART, occurring more frequently in those starting HAART at younger ages, on second-line therapy and after longer suppression. It does not appear to affect CD4+ or CD8+ T-cell counts or the risk of subsequent virological failure. Natural variation, assay effects and adherence might all have a role.

Ellis JC, L'homme RFA, Ewings FM, Mulenga V, Bell F, Chileshe R, Molyneux E, Abernethy J, van Oosterhout JJG, Chintu C et al. 2007. Nevirapine concentrations in HIV-infected children treated with divided fixed-dose combination antiretroviral tablets in Malawi and Zambia. Antivir Ther, 12 (2), pp. 253-260. | Show Abstract

OBJECTIVE: To investigate nevirapine concentrations in African HIV-infected children receiving divided Triomune tablets (stavudine+lamivudine+nevirapine). DESIGN: Cross-sectional study. METHODS: Steady-state plasma nevirapine concentrations were determined in Malawian and Zambian children aged 8 months to 18 years receiving Triomune in routine outpatient settings. Height-for-age, body mass index (BMI)-for-age, age, sex, post-dose sampling time and dose/m2/day were investigated as predictors using centre-stratified regression with backwards elimination (P<0.1). RESULTS: Of the 71 Malawian and 56 Zambian children (median age 8.4 vs 8.5 years, height-for-age -3.15 vs -1.84, respectively), only 1 (3%) of those prescribed > or =300 mg/m2/day nevirapine had subtherapeutic concentrations (<3 mg/l) compared with 22 (23%) of those prescribed <300 mg/m2/day; most children with subtherapeutic nevirapine concentrations were taking half or quarter Triomune tablets. Lower nevirapine concentrations were independently associated with lower height-for-age (indicating stunting) (0.37 mg/l per unit higher [95% confidence interval (CI): -0.003, +0.74]; P=0.05), lower prescribed dose/m2 (+0.89 mg/l per 50 mg/m2 higher [95% CI: 0.32, 1.46]; P=0.002) and higher BMI-for-age (indicating lack of wasting) (-0.42 mg/l per unit higher [95% CI: -0.80, -0.04]; P=0.03). CONCLUSIONS: Currently available adult fixed-dose combination tablets are not well suited to children, particularly at younger ages: Triomune 30 is preferable to Triomune 40 because of the higher dose of nevirapine relative to stavudine. Further research is required to confirm that concentrations are reduced in stunted children but increased in wasted children. Development of appropriate paediatric fixed-dose combination tablets is essential if antiretroviral therapy is to be made widely available to children in resource-limited settings.

Mulenga V, Ford D, Walker AS, Mwenya D, Mwansa J, Sinyinza F, Lishimpi K, Nunn A, Gillespie S, Zumla A et al. 2007. Effect of cotrimoxazole on causes of death, hospital admissions and antibiotic use in HIV-infected children. AIDS, 21 (1), pp. 77-84. | Show Abstract | Read more

BACKGROUND: Cotrimoxazole prophylaxis reduces morbidity and mortality in HIV-1-infected children, but mechanisms for these benefits are unclear. METHODS: CHAP was a randomized trial comparing cotrimoxazole prophylaxis with placebo in HIV-infected children in Zambia where background bacterial resistance to cotrimoxazole is high. We compared causes of mortality and hospital admissions, and antibiotic use between randomized groups. RESULTS: Of 534 children (median age, 4.4 years; 32% 1-2 years), 186 died and 166 had one or more hospital admissions not ending in death. Cotrimoxazole prophylaxis was associated with lower mortality, both outside hospital (P = 0.01) and following hospital admission (P = 0.005). The largest excess of hospital deaths in the placebo group was from respiratory infections [22/56 (39%) placebo versus 10/35 (29%) cotrimoxazole]. By 2 years, the cumulative probability of dying in hospital from a serious bacterial infection (predominantly pneumonia) was 7% on cotrimoxazole and 12% on placebo (P = 0.08). There was a trend towards lower admission rates for serious bacterial infections in the cotrimoxazole group (19.1 per 100 child-years at risk versus 28.5 in the placebo group, P = 0.09). Despite less total follow-up due to higher mortality, more antibiotics (particularly penicillin) were prescribed in the placebo group in year one [6083 compared to 4972 days in the cotrimoxazole group (P = 0.05)]. CONCLUSIONS: Cotrimoxazole prophylaxis appears to mainly reduce death and hospital admissions from respiratory infections, supported further by lower rates of antibiotic prescribing. As such infections occur at high CD4 cell counts and are common in Africa, the role of continuing cotrimoxazole prophylaxis after starting antiretroviral therapy requires investigation.

Walker AS, Mulenga V, Sinyinza F, Lishimpi K, Nunn A, Chintu C, Gibb DM, CHAP Trial Team. 2006. Determinants of survival without antiretroviral therapy after infancy in HIV-1-infected Zambian children in the CHAP Trial. J Acquir Immune Defic Syndr, 42 (5), pp. 637-645. | Show Abstract | Read more

BACKGROUND: There are few data on predictors of HIV progression in untreated children in resource-limited settings. METHODS: Children with HIV Antibiotic Prophylaxis (CHAP) was a randomized trial comparing cotrimoxazole prophylaxis with placebo in HIV-infected Zambian children. The prognostic value of baseline characteristics was investigated using Cox models. RESULTS: Five hundred fourteen children aged 1 to 14 (median 5.5) years contributed 607 years of follow-up (maximum 2.6 years). Half were boys, and in 67%, the mother was the primary carer; at baseline, median CD4 percentage was 11% and weight was less than third percentile in 67%. One hundred sixty-five children died (27.2 per 100 years at risk; 95% confidence interval 23.3-31.6). Low weight-for-age, CD4 percentage, hemoglobin, mother as primary carer, current malnutrition, and previous hospital admissions for respiratory tract infections or recurrent severe bacterial infections were independent predictors of poorer survival, whereas oral candidiasis predicted poorer survival only when baseline CD4 percentage was not considered. Mortality rates per 100 child-years of 44.5 (37.2-53.2), 14.7 (10.9-19.8), and 2.3 (0.3-16.7) were associated with new World Health Organization stages 4, 3, and 1/2, respectively, applied retrospectively; very low weight-for-age was the only staging feature for 42% of stage 4 children. CONCLUSIONS: Malnutrition and hospitalizations for respiratory/bacterial infections predict mortality independent of immunosuppression, suggesting that they capture HIV- and non-HIV-related mortality, whereas oral candidiasis is a proxy for immunosuppression.

Touloumi G, Pantazis N, Antoniou A, Stirnadel HA, Walker SA, Porter K, CASCADE Collaboration. 2006. Highly active antiretroviral therapy interruption: predictors and virological and immunologic consequences. J Acquir Immune Defic Syndr, 42 (5), pp. 554-561. | Show Abstract | Read more

OBJECTIVE: To characterize the magnitude and the predictors of highly active antiretroviral therapy (HAART) interruption (TI) and to investigate its immunologic and virological consequences. METHODS: Using Concerted Action on Seroconversion to AIDS and Death in Europe data from 8,300 persons with well-documented seroconversion dates, we identified subjects with stable first HAART (for at least 90 days) not initiated during primary infection. A TI was defined as an interruption of all antiretroviral therapy drugs for at least 14 days. RESULTS: Of 1,551 subjects starting HAART, 299 (19.3%) interrupted treatment. Median (interquartile range) duration of the TI was 189 (101-382) days. The cumulative probability (95% confidence interval) of TI at 2 years was 15.9% (14.0%-18.1%). Women were more likely to have a TI than men in the same exposure group (35.8% vs 24.2% among drug users, 22.1% vs 13.3% among heterosexuals; P < 0.05). Higher baseline viremia and poor immunologic response to HAART were associated with higher probabilities of TI. Median (interquartile range) individual CD4 cell loss during TI was 94 (1-220) cells/microL. Older age at HAART (>40 yr), lower pre-HAART nadir (<200 cells/microL), and lower CD4 at start of TI (<350 cells/microL) were significantly associated with greater relative CD4 loss during TI. CONCLUSIONS: We estimate that almost 1 in 6 subjects on HAART interrupts treatment by 2 years. Further research is needed to investigate the reasons why TI is higher in women. We have identified characteristics of subjects with the greatest risk for CD4 loss in whom TI may have greater risks.

DART Virology Group and Trial Team. 2006. Virological response to a triple nucleoside/nucleotide analogue regimen over 48 weeks in HIV-1-infected adults in Africa. AIDS, 20 (10), pp. 1391-1399. | Show Abstract | Read more

OBJECTIVES: To evaluate virologic response up to 48 weeks, and emergence of HIV-1 resistance mutations at 24 weeks, in therapy-naive adults initiating zidovudine/lamivudine/tenofovir DF. DESIGN: A cohort within the DART trial. METHODS: Plasma HIV-1 RNA was assayed in 300 adults with baseline CD4