Journal: PLoS Med

Abstract

Financial incentives and deposit contracts to promote HIV retesting in Uganda: A randomized trial.

Chamie G, Kwarisiima D, Ndyabakira A, Marson K, ... Kamya MR, Thirumurthy H
Background
Frequent retesting for HIV among persons at increased risk of HIV infection is critical to early diagnosis and to the delivery of combination HIV prevention services. There are few evidence-based interventions for promoting frequent HIV retesting. We sought to determine the effectiveness of financial incentives and deposit contracts in promoting quarterly HIV retesting among adults at increased risk of HIV.
Methods and findings
In peri-urban Ugandan communities from October to December 2018, we randomized HIV-negative adults with self-reported risk to 1 of 3 strategies to promote HIV retesting: (1) no incentive; (2) cash incentives (US$7) for retesting at 3 and 6 months (total US$14); or (3) deposit contracts: participants could voluntarily deposit US$6 at baseline and at 3 months that would be returned with interest (total US$7) upon retesting at 3 and 6 months (total US$14) or lost if participants failed to retest. The primary outcome was retesting for HIV at both 3 and 6 months. Of 1,482 persons screened for study eligibility following community-based recruitment, 524 participants were randomized to no incentive (N = 180), incentives (N = 172), or deposit contracts (N = 172): median age was 25 years (IQR: 22 to 30), 44% were women, and median weekly income was US$13.60 (IQR: US$8.16 to US$21.76). Among participants randomized to deposit contracts, 24/172 (14%) made a baseline deposit, and 2/172 (1%) made a 3-month deposit. In intent-to-treat analyses, HIV retesting at both 3 and 6 months was significantly higher in the incentive arm (89/172 [52%]) than in either the control arm (33/180 [18%]; odds ratio [OR] 4.8, 95% CI: 3.0 to 7.7, p < 0.001) or the deposit contract arm (28/172 [16%]; OR 5.5, 95% CI: 3.3 to 9.1, p < 0.001). Among those in the deposit contract arm who made a baseline deposit, 20/24 (83%) retested at 3 months and 11/24 (46%) retested at both 3 and 6 months. Among 282 participants who retested for HIV during the trial, 3 (1%; 95% CI: 0.2 to 3%) seroconverted: 1 in the incentive group and 2 in the control group. Study limitations include measuring retesting only at the clinic where baseline enrollment occurred, offering only clinic-based (rather than community-based) HIV retesting, and not measuring retesting after trial completion to evaluate whether retesting behavior was sustained.
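As a check on the arithmetic, the incentive-versus-control comparison can be reproduced from the counts reported above (89/172 retested versus 33/180). The sketch below computes a crude unadjusted odds ratio with a Wald confidence interval on the log scale; this illustrates the calculation rather than the trial's actual analysis, though here it matches the published figures to the precision reported.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, n1, b, n2, z=1.96):
    """Unadjusted odds ratio of event a/n1 versus b/n2,
    with a Wald 95% CI computed on the log-odds scale."""
    or_ = (a / (n1 - a)) / (b / (n2 - b))
    se = sqrt(1/a + 1/(n1 - a) + 1/b + 1/(n2 - b))
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

# Incentive arm (89/172 retested) versus control (33/180 retested)
or_, lo, hi = odds_ratio_ci(89, 172, 33, 180)  # ~4.8 (95% CI 3.0 to 7.7)
```

The same function applied to the incentive arm versus the deposit contract arm (89/172 versus 28/172) likewise reproduces the reported 5.5 (3.3 to 9.1).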
Conclusions
Offering financial incentives to high-risk adults in Uganda resulted in significantly higher HIV retesting. Deposit contracts had low uptake and overall did not increase retesting. As part of efforts to increase early diagnosis of HIV among high-risk populations, strategic use of incentives to promote retesting should receive greater consideration by HIV programs.
Trial registration
ClinicalTrials.gov: NCT02890459.



PLoS Med: 03 May 2021; 18:e1003630 | PMID: 33945526
Abstract

Tranexamic acid and bleeding in patients treated with non-vitamin K oral anticoagulants undergoing dental extraction: The EXTRACT-NOAC randomized clinical trial.

Ockerman A, Miclotte I, Vanhaverbeke M, Vanassche T, ... Politis C, Verhamme P
Background
Oral bleeding after dental extraction in patients on non-vitamin K oral anticoagulants (NOACs) is a frequent problem. We investigated whether 10% tranexamic acid (TXA) mouthwash decreases post-extraction bleeding in patients treated with NOACs.
Methods and findings
The EXTRACT-NOAC study is a randomized, double-blind, placebo-controlled, multicenter clinical trial. Patients were randomly assigned to 10% TXA or placebo mouthwash and were instructed to use the mouthwash prior to dental extraction and 3 times a day for 3 days thereafter. The primary outcome was the number of patients with any post-extraction oral bleeding up to day 7. Secondary outcomes included periprocedural, early, and delayed bleeding, and safety outcomes included all thrombotic events. The first patient was randomized on February 9, 2018 and the last on March 12, 2020. Of 222 randomized patients, 218 were included in the full analysis set, of whom 106 were assigned to TXA (74.8 ± 8.8 years; 81 men) and 112 to placebo (72.7 ± 10.7 years; 64 men). Post-extraction bleeding occurred in 28 (26.4%) patients in the TXA group and in 32 (28.6%) patients in the placebo group (relative risk, 0.92; 95% confidence interval [CI], 0.60 to 1.42; P = 0.72). There were 46 bleeds in the TXA group and 85 in the placebo group (rate ratio, 0.57; 95% CI, 0.31 to 1.05; P = 0.07). TXA did not reduce the rate of periprocedural bleeding (bleeding score 4 ± 1.78 versus 4 ± 1.82, P = 0.80) or early bleeding (rate ratio, 0.76; 95% CI, 0.42 to 1.37). Delayed bleeding (rate ratio, 0.32; 95% CI, 0.12 to 0.89) and bleeding after multiple extractions (rate ratio, 0.40; 95% CI, 0.20 to 0.78) were lower in the TXA group. One patient in the placebo group had a transient ischemic attack while interrupting NOAC therapy in preparation for the dental extraction. Study limitations included the premature interruption of the trial following a futility analysis and the reliance on patients' self-reported compliance during follow-up.
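The headline comparison can be checked from the counts given above (28/106 versus 32/112). The snippet below computes an unadjusted relative risk with a log-scale Wald confidence interval; this is an illustrative sketch rather than the trial's prespecified analysis, though it reproduces the reported 0.92 (0.60 to 1.42).

```python
from math import exp, log, sqrt

def relative_risk_ci(a, n1, b, n2, z=1.96):
    """Unadjusted relative risk of a/n1 versus b/n2,
    with a Wald 95% CI computed on the log-risk scale."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# TXA group (28/106 with post-extraction bleeding) versus placebo (32/112)
rr, lo, hi = relative_risk_ci(28, 106, 32, 112)  # ~0.92 (95% CI 0.60 to 1.42)
```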
Conclusions
In patients on NOACs undergoing dental extraction, TXA does not seem to reduce the rate of periprocedural or early postoperative oral bleeding compared to placebo. TXA appears to reduce delayed bleeds and postoperative oral bleeding if multiple teeth are extracted.
Trial registration
ClinicalTrials.gov: NCT03413891; EudraCT: 2017-001426-17 (eudract.ema.europa.eu).



PLoS Med: 02 May 2021; 18:e1003601 | PMID: 33939696
Abstract

Circulating tumor DNA dynamics and recurrence risk in patients undergoing curative intent resection of colorectal cancer liver metastases: A prospective cohort study.

Tie J, Wang Y, Cohen J, Li L, ... Vogelstein B, Gibbs P
Background
In patients with resectable colorectal liver metastases (CRLM), the role of pre- and postoperative systemic therapy continues to be debated. Previous studies have shown that circulating tumor DNA (ctDNA) analysis, as a marker of minimal residual disease, is a powerful prognostic factor in patients with nonmetastatic colorectal cancer (CRC). Serial analysis of ctDNA in patients with resectable CRLM could inform the optimal use of perioperative chemotherapy. Here, we performed a validation study to confirm the prognostic impact of postoperative ctDNA in resectable CRLM observed in a previous discovery study.
Methods and findings
We prospectively collected plasma samples from patients with resectable CRLM, including presurgical and postsurgical samples, serial samples during any pre- or postoperative chemotherapy, and serial samples during follow-up. Via targeted sequencing of 15 genes commonly mutated in CRC, we identified at least 1 somatic mutation in each patient's tumor. We then designed a personalized assay to assess 1 mutation in plasma samples using the Safe-SeqS assay. A total of 380 plasma samples from 54 patients recruited from July 2011 to December 2014 were included in our analysis. Twenty-three (43%) patients received neoadjuvant chemotherapy, and 42 (78%) received adjuvant chemotherapy after surgery. Median follow-up was 51 months (interquartile range, 31 to 60 months). At least 1 somatic mutation was identified in all patients' tumor tissue. ctDNA was detectable in 46/54 (85%) patients prior to any treatment and in 12/49 (24%) patients after surgery. There was a median 40.93-fold (19.10 to 87.73, P < 0.001) decrease in ctDNA mutant allele fraction with neoadjuvant chemotherapy, but ctDNA clearance during neoadjuvant chemotherapy was not associated with better recurrence-free survival (RFS). Patients with detectable postoperative ctDNA experienced significantly lower RFS (HR 6.3; 95% CI 2.58 to 15.2; P < 0.001) and overall survival (HR 4.2; 95% CI 1.5 to 11.8; P < 0.001) compared to patients with undetectable ctDNA. For the 11 patients with detectable postoperative ctDNA who had serial ctDNA sampling during adjuvant chemotherapy, ctDNA clearance was observed in 3 patients, 2 of whom remained disease-free. All 8 patients with persistently detectable ctDNA after adjuvant chemotherapy recurred. End-of-treatment (surgery +/- adjuvant chemotherapy) ctDNA detection was associated with a 5-year RFS of 0%, compared to 75.6% for patients with undetectable end-of-treatment ctDNA (HR 14.9; 95% CI 4.94 to 44.7; P < 0.001).
Key limitations of the study include the small sample size and the potential for false-positive findings with multiple hypothesis testing.
Conclusions
We confirmed the prognostic impact of postsurgery and posttreatment ctDNA in patients with resected CRLM. The potential utility of serial ctDNA analysis during adjuvant chemotherapy as an early marker of treatment efficacy was also demonstrated. Further studies are required to define how to optimally integrate ctDNA analyses into decision-making regarding the use and timing of adjuvant therapy for resectable CRLM.
Trial registration
ACTRN12612000345886.



PLoS Med: 02 May 2021; 18:e1003620 | PMID: 33939694
Abstract

Maximizing and evaluating the impact of test-trace-isolate programs: A modeling study.

Grantz KH, Lee EC, D'Agostino McGowan L, Lee KH, ... Gurley ES, Lessler J
Background
Test-trace-isolate programs are an essential part of Coronavirus Disease 2019 (COVID-19) control that offer a more targeted approach than many other nonpharmaceutical interventions. Effective use of such programs requires methods to estimate their current and anticipated impact.
Methods and findings
We present a mathematical modeling framework to evaluate the expected reductions in the reproductive number, R, from test-trace-isolate programs. This framework is implemented in a publicly available R package and an online application. We evaluated the effects of the completeness of case detection and contact tracing and of the speed of isolation and quarantine, using parameters consistent with COVID-19 transmission (R0: 2.5, generation time: 6.5 days). We show that R is most sensitive to changes in the proportion of cases detected in almost all scenarios, and that other metrics have a reduced impact when case detection levels are low (<30%). Although test-trace-isolate programs can contribute substantially to reducing R, exceptional performance across all metrics is needed to bring R below 1 through test-trace-isolate alone, highlighting the need for comprehensive control strategies. Results from this model also indicate that metrics used to evaluate the performance of test-trace-isolate programs, such as the proportion of identified infections among traced contacts, may be misleading. While estimates of the impact of test-trace-isolate are sensitive to assumptions about COVID-19 natural history and adherence to isolation and quarantine, our qualitative findings are robust across numerous sensitivity analyses.
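The study's framework lives in the authors' R package; the fragment below is only a toy multiplicative model (the functional form and all parameter names are our assumptions, not the authors') that illustrates why the proportion of cases detected dominates: detection multiplies both the isolation term and the tracing term, so improving it pays off twice.

```python
# Toy sketch (NOT the paper's model). Assumptions: a fraction `detect` of
# cases is found and isolated, cutting their onward transmission by
# `iso_effect`; a fraction `trace` of detected cases' contacts is
# quarantined, cutting transmission by `quar_effect`.
R0 = 2.5  # basic reproductive number used in the abstract

def r_eff(detect, trace, iso_effect=0.8, quar_effect=0.8):
    """Effective R under the toy multiplicative model above."""
    return R0 * (1 - detect * iso_effect) * (1 - detect * trace * quar_effect)

base = r_eff(0.3, 0.3)
better_detection = r_eff(0.4, 0.3)  # +10 pp detection
better_tracing = r_eff(0.3, 0.4)    # +10 pp tracing
# `detect` appears in both factors, so the detection gain lowers R more.
```

In this toy, an equal improvement in detection beats one in tracing, and only near-perfect values of every parameter push R below 1 from R0 = 2.5, consistent with the qualitative findings reported above.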
Conclusions
Effective test-trace-isolate programs first need to be strong in the "test" component, as case detection underlies all other program activities. Even moderately effective test-trace-isolate programs are an important tool for controlling the COVID-19 pandemic and can alleviate the need for more restrictive social distancing measures.



PLoS Med: 29 Apr 2021; 18:e1003585 | PMID: 33930019
Abstract

Sexual and reproductive health information and referrals for resettled refugee women: A survey of resettlement agencies in the United States.

Katcher T, Thimmesch R, Spitz A, Kulkarni L, ... Weiner A, Woodford Martin M
Background
Refugee resettlement offices are the first point of contact for newly arrived refugees and play a significant role in helping refugees acclimate and settle into life in the United States. Available literature suggests that refugee women are vulnerable to poor sexual and reproductive health (SRH) outcomes, including sexually transmitted infections and HIV infections as well as adverse pregnancy outcomes, but little is known about the role that refugee resettlement offices play in supporting refugee women's SRH. This study examines the capacity and interest of resettlement offices in providing SRH information and referrals to newly arrived refugees.
Methods and findings
The research team conducted an online survey of staff members at refugee resettlement offices throughout the US in 2018 to determine (1) available SRH resources and workshops; (2) referrals to and assistance with making SRH and primary care appointments; (3) barriers to addressing the SRH needs of clients; and (4) interest in building the capacity of office staff to address SRH issues. The survey was created for this study and had not been previously used or validated. Survey data underwent descriptive analysis. A total of 236 resettlement offices were contacted, and 100 responded, for a response rate of 42%. Fifteen percent (N = 15) of the refugee resettlement agencies (RRAs) that responded to the survey provide materials about SRH to clients, and 49% (N = 49) incorporate sexual health into the classes they provide to newly arrived refugee clients. Moreover, 12% (N = 12) of responding RRAs screen clients for pregnancy intention, and 20% (N = 20) refer directly to contraceptive care and services. This study is limited by the survey's response rate; no conclusions can be drawn about the offices that did not respond. In addition, the survey instrument was not validated against other sources of information about the practices of refugee resettlement offices.
Conclusions
In this study, we observed that many resettlement offices do not routinely provide information or referrals for SRH needs. Responding offices cite lack of time and competing priorities as major barriers to providing SRH education and referrals to clients.



PLoS Med: 29 Apr 2021; 18:e1003579 | PMID: 33939705
Abstract

Efficacy of live attenuated and inactivated influenza vaccines among children in rural India: A 2-year, randomized, triple-blind, placebo-controlled trial.

Krishnan A, Dar L, Saha S, Narayan VV, ... Widdowson MA, Jain S
Background
Influenza is a cause of febrile acute respiratory infection (FARI) in India; however, few influenza vaccine trials have been conducted in India. We assessed absolute and relative efficacy of live attenuated influenza vaccine (LAIV) and inactivated influenza vaccine (IIV) among children aged 2 to 10 years in rural India through a randomized, triple-blind, placebo-controlled trial conducted over 2 years.
Methods and findings
In June 2015, children were randomly allocated to LAIV, IIV, intranasal placebo, or inactivated polio vaccine (IPV) in a 2:2:1:1 ratio. In June 2016, vaccination was repeated per the original allocation. Overall, 3,041 children received LAIV (n = 1,015), IIV (n = 1,010), nasal placebo (n = 507), or IPV (n = 509). Mean age was 6.5 years, with 20% aged 9 to 10 years. Through weekly home visits, nasal and throat swabs were collected from children with FARI and tested for influenza virus by polymerase chain reaction. The primary outcome was laboratory-confirmed influenza-associated FARI; vaccine efficacy (VE) was calculated for each year using a modified intention-to-treat (mITT) analysis with a Cox proportional hazards (PH) model. In Year 1, VE was 40.0% (95% confidence interval (CI) 25.2 to 51.9) for LAIV and 59.0% (95% CI 47.8 to 67.9) for IIV compared with controls; the relative efficacy of LAIV compared with IIV was -46.2% (95% CI -88.9 to -13.1). In Year 2, VE was 51.9% (95% CI 42.0 to 60.1) for LAIV and 49.9% (95% CI 39.2 to 58.7) for IIV; the relative efficacy of LAIV compared with IIV was 4.2% (95% CI -19.9 to 23.5). No serious vaccine-attributable adverse events were reported. Study limitations include differing dosage requirements between the nasal and injectable vaccines in Year 1 (a single dose of LAIV versus 2 doses of IIV) and the absence of immunogenicity studies.
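The relative efficacy figures follow from the absolute ones under the usual Cox-based convention VE = 1 - hazard ratio, giving relative VE = 1 - (1 - VE_LAIV)/(1 - VE_IIV). The check below uses the rounded published efficacies, so it reproduces the reported -46.2% and 4.2% only approximately.

```python
def relative_ve(ve_a, ve_b):
    """Relative vaccine efficacy of A versus B, assuming VE = 1 - hazard
    ratio, so the hazard ratio of A versus B is (1 - ve_a) / (1 - ve_b)."""
    return 1 - (1 - ve_a) / (1 - ve_b)

# Year 1: LAIV 40.0% versus IIV 59.0% -> roughly -46% (reported: -46.2%)
year1 = relative_ve(0.400, 0.590)
# Year 2: LAIV 51.9% versus IIV 49.9% -> roughly 4% (reported: 4.2%)
year2 = relative_ve(0.519, 0.499)
```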
Conclusions
In this study, we found that LAIV and IIV vaccines were safe and moderately efficacious against influenza virus infection among Indian children.
Trial registration
Clinical Trials Registry of India CTRI/2015/06/005902.



PLoS Med: 28 Apr 2021; 18:e1003609 | PMID: 33914729
Abstract

Effectiveness of a primary care-based integrated mobile health intervention for stroke management in rural China (SINEMA): A cluster-randomized controlled trial.

Yan LL, Gong E, Gu W, Turner EL, ... Wang Y, Oldenburg B
Background
Managing noncommunicable diseases through primary healthcare has been identified as the key strategy to achieve universal health coverage but is challenging in most low- and middle-income countries. Stroke is the leading cause of death and disability in rural China. This study aims to determine whether a primary care-based integrated mobile health intervention (SINEMA intervention) could improve stroke management in rural China.
Methods and findings
Based on extensive barrier analyses, contextual research, and feasibility studies, we conducted a community-based, two-arm, cluster-randomized controlled trial with blinded outcome assessment in Hebei Province, rural Northern China, including 1,299 stroke patients (mean age 65.7 years [SD 8.2], 42.6% female, 71.2% with education below primary school) recruited from 50 villages between June 23 and July 21, 2017. Villages were randomly assigned (1:1) to either the intervention or the control arm (usual care). In the intervention arm, village doctors, who are government-sponsored primary healthcare providers, received training, conducted monthly follow-up visits supported by an Android-based mobile application, and received performance-based payments. Participants received monthly doctor visits and automatically dispatched daily voice messages. The primary outcome was the 12-month change in systolic blood pressure (BP). Secondary outcomes were predefined, including diastolic BP, health-related quality of life, physical activity level, self-reported medication adherence (antiplatelet, statin, and antihypertensive), and performance in the "timed up and go" test. Analyses were conducted in the intention-to-treat framework at the individual level, with clusters and the stratified design accounted for, following the prepublished statistical analysis plan. All villages completed the 12-month follow-up, and 611 (intervention) and 615 (control) patients were successfully followed (3.4% lost to follow-up among survivors). The program was implemented with high fidelity, and the annual program delivery cost per capita was US$24.3. There was a significant reduction in systolic BP in the intervention group as compared with the control group, with an adjusted mean difference of -2.8 mm Hg (95% CI -4.8 to -0.9; p = 0.005).
The intervention was significantly associated with improvements in 6 of 7 secondary outcomes: diastolic BP reduction (p < 0.001), health-related quality of life (p = 0.008), physical activity level (p < 0.001), adherence to statins (p = 0.003) and antihypertensive medicines (p = 0.039), and performance in the "timed up and go" test (p = 0.022). We observed reductions in all exploratory outcomes, including stroke recurrence (4.4% versus 9.3%; risk ratio [RR] = 0.46, 95% CI 0.32 to 0.66; risk difference [RD] = 4.9 percentage points [pp]), hospitalization (4.4% versus 9.3%; RR = 0.45, 95% CI 0.32 to 0.62; RD = 4.9 pp), disability (20.9% versus 30.2%; RR = 0.65, 95% CI 0.53 to 0.79; RD = 9.3 pp), and death (1.8% versus 3.1%; RR = 0.52, 95% CI 0.28 to 0.96; RD = 1.3 pp). Limitations include the relatively short study duration of 1 year and uncertain generalizability of our findings beyond the study setting.
Conclusions
In this study, a primary care-based mobile health intervention integrating provider-centered and patient-facing technology was effective in reducing BP and improving stroke secondary prevention in a resource-limited rural setting in China.
Trial registration
The SINEMA trial is registered at ClinicalTrials.gov NCT03185858.



PLoS Med: 27 Apr 2021; 18:e1003582 | PMID: 33909607
Abstract

International gestational age-specific centiles for blood pressure in pregnancy from the INTERGROWTH-21st Project in 8 countries: A longitudinal cohort study.

Green LJ, Kennedy SH, Mackillop L, Gerry S, ... Watkinson P, International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st)
Background
Gestational hypertensive and acute hypotensive disorders are associated with maternal morbidity and mortality worldwide. However, physiological blood pressure changes in pregnancy are insufficiently defined. We describe blood pressure changes across healthy pregnancies from the International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st) Fetal Growth Longitudinal Study (FGLS) to produce international, gestational age-specific, smoothed centiles (third, 10th, 50th, 90th, and 97th) for blood pressure.
Methods and findings
Secondary analysis of a prospective, longitudinal, observational cohort study (2009 to 2016) was conducted across 8 diverse urban areas in Brazil, China, India, Italy, Kenya, Oman, the United Kingdom, and the United States of America. We enrolled healthy women at low risk of pregnancy complications. We measured blood pressure using standardised methodology and validated equipment at enrolment at <14 weeks, then every 5 ± 1 weeks until delivery. We enrolled 4,607 (35%) of 13,108 women screened. The mean maternal age was 28.4 (standard deviation [SD] 3.9) years; 97% (4,204/4,321) of women were married or living with a partner, and 68% (2,955/4,321) were nulliparous. Their mean body mass index (BMI) was 23.3 (SD 3.0) kg/m2. Systolic blood pressure was lowest at 12 weeks: the median was 111.5 (95% CI 111.3 to 111.8) mmHg, rising to a median maximum of 119.6 (95% CI 118.9 to 120.3) mmHg at 40 weeks' gestation, a difference of 8.1 (95% CI 7.4 to 8.8) mmHg. Median diastolic blood pressure decreased from 69.1 (95% CI 68.9 to 69.3) mmHg at 12 weeks to a minimum of 68.5 (95% CI 68.3 to 68.7) mmHg at 19+5 weeks' gestation, a change of -0.6 (95% CI -0.8 to -0.4) mmHg. Diastolic blood pressure subsequently increased to a maximum of 76.3 (95% CI 75.9 to 76.8) mmHg at 40 weeks' gestation. Systolic blood pressure fell by >14 mmHg, or diastolic blood pressure by >11 mmHg, in fewer than 10% of women at any gestational age. Fewer than 10% of women increased their systolic blood pressure by >24 mmHg or their diastolic blood pressure by >18 mmHg at any gestational age. The study's main limitations were the unavailability of prepregnancy blood pressure values and the inability to explore circadian effects, because the time of day was not recorded for the blood pressure measurements.
Conclusions
Our findings provide international, gestational age-specific centiles and limits of acceptable change to facilitate earlier recognition of deteriorating health in pregnant women. These centiles challenge the idea of a clinically significant midpregnancy drop in blood pressure.



PLoS Med: 26 Apr 2021; 18:e1003611 | PMID: 33905424
Abstract

Anomalously warm weather and acute care visits in patients with multiple sclerosis: A retrospective study of privately insured individuals in the US.

Elser H, Parks RM, Moghavem N, Kiang MV, ... Rehkopf DH, Casey JA
Background
As the global climate changes in response to anthropogenic greenhouse gas emissions, weather and temperature are expected to become increasingly variable. Although heat sensitivity is a recognized clinical feature of multiple sclerosis (MS), a chronic demyelinating disorder of the central nervous system, few studies have examined the implications of climate change for patients with this disease.
Methods and findings
We conducted a retrospective cohort study of individuals with MS aged 18-64 years in a nationwide United States patient-level commercial and Medicare Advantage claims database from 2003 to 2017. We defined anomalously warm weather as any month in which local average temperatures exceeded the long-term average by ≥1.5°C. We estimated the association between anomalously warm weather and MS-related inpatient, outpatient, and emergency department visits using generalized log-linear models. From 75,395,334 individuals, we identified 106,225 with MS. The majority were women (76.6%) aged 36-55 years (59.0%). Anomalously warm weather was associated with increased risk of emergency department visits (risk ratio [RR] = 1.043, 95% CI: 1.025-1.063) and inpatient visits (RR = 1.032, 95% CI: 1.010-1.054). There was limited evidence of an association between anomalously warm weather and MS-related outpatient visits (RR = 1.010, 95% CI: 1.005-1.015). Estimates were similar for men and women, strongest among older individuals, and exhibited substantial variation by season, region, and climate zone. Limitations of the present study include the absence of key individual-level measures of socioeconomic position (i.e., race/ethnicity, occupational status, and housing quality) that may determine where individuals live, and therefore the extent of their exposure to anomalously warm weather, as well as their propensity to seek treatment for neurologic symptoms.
Conclusions
Our findings suggest that as global temperatures rise, individuals with MS may represent a particularly susceptible subpopulation, a finding with implications for both healthcare providers and systems.



PLoS Med: 25 Apr 2021; 18:e1003580 | PMID: 33901187
Abstract

Dynamics of sputum conversion during effective tuberculosis treatment: A systematic review and meta-analysis.

Calderwood CJ, Wilson JP, Fielding KL, Harris RC, ... Theodorou H, Moore DAJ
Background
Two weeks' isolation is widely recommended for people commencing treatment for pulmonary tuberculosis (TB). The evidence that this corresponds to clearance of potentially infectious tuberculous mycobacteria in sputum is not well established. This World Health Organization-commissioned review investigated sputum sterilisation dynamics during TB treatment.
Methods and findings
For the main analysis, 2 systematic literature searches of OvidSP MEDLINE, Embase, and Global Health, and EBSCO CINAHL Plus were conducted to identify studies with data on TB infectiousness (all studies to search date, 1 December 2017) and all randomised controlled trials (RCTs) for drug-susceptible TB (from 1 January 1990 to search date, 20 February 2018). Included articles reported on patients receiving effective treatment for culture-confirmed drug-susceptible pulmonary TB. The outcome of interest was sputum bacteriological conversion: the proportion of patients having converted by a defined time point or a summary measure of time to conversion, assessed by smear or culture. Any study design where more than 10 participants were included was considered. Record sifting and data extraction were performed in duplicate. Random effects meta-analyses were performed. A narrative summary additionally describes the results of a systematic search for data evaluating infectiousness from humans to experimental animals (PubMed, all studies to 27 March 2018). Other evidence on duration of infectiousness-including studies reporting on cough dynamics, human tuberculin skin test conversion, or early bactericidal activity of TB treatments-was outside the scope of this review. The literature search was repeated on 22 November 2020, at the request of the editors, to identify studies published after the previous censor date. Four small studies reporting 3 different outcome measures were identified, which included no data that would alter the findings of the review; they are not included in the meta-analyses. Of 5,290 identified records, 44 were included. Twenty-seven (61%) were RCTs and 17 (39%) were cohort studies. Thirteen studies (30%) reported data from Africa, 12 (27%) from Asia, 6 (14%) from South America, 5 (11%) from North America, and 4 (9%) from Europe. Four studies reported data from multiple continents. 
Summary estimates suggested smear conversion in 9% of patients at 2 weeks (95% CI 3%-24%; N = 1 study) and in 82% of patients at 2 months of treatment (95% CI 78%-86%; N = 10). Among baseline smear-positive patients, solid culture conversion occurred by 2 weeks in 5% (95% CI 0%-14%; N = 2), increasing to 88% at 2 months (95% CI 84%-92%; N = 20). At the equivalent time points, liquid culture conversion was achieved in 3% (95% CI 1%-16%; N = 1) and 59% (95% CI 47%-70%; N = 8). Significant heterogeneity was observed. Further interrogation of the data to explain this heterogeneity was limited by the lack of disaggregation of results by factors such as HIV status, baseline smear status, and the presence or absence of lung cavitation.
Conclusions
This systematic review found that most patients remained culture positive at 2 weeks of TB treatment, challenging the view that individuals are not infectious after this interval. Culture positivity is, however, only 1 component of infectiousness, with reduced cough frequency and aerosol generation after TB treatment initiation likely to also be important. Studies that integrate our findings with data on cough dynamics could provide a more complete perspective on potential transmission of Mycobacterium tuberculosis by individuals on treatment.
Trial registration
Systematic review registration: PROSPERO 85226.



PLoS Med: 25 Apr 2021; 18:e1003566 | PMID: 33901173
Abstract

Estimating and characterizing the burden of multimorbidity in the community: A comprehensive multistep analysis of two large nationwide representative surveys in France.

Coste J, Valderas JM, Carcaillon-Bentata L
Background
Given the increasing burden of chronic conditions, multimorbidity is now a priority for healthcare and public health systems worldwide. Appropriate methodological approaches for assessing the phenomenon have not yet been established, resulting in inconsistent and incomplete descriptions. We aimed to estimate and characterize the burden of multimorbidity in the adult population in France in terms of number and type of conditions, type of underlying mechanisms, and analysis of the joint effects for identifying combinations with the most deleterious interaction effects on health status.
Methods and findings
We used a multistep approach to analyze cross-sectional and longitudinal data from 2 large nationwide representative surveys, the 2010/2014 waves of the Health, Health Care, and Insurance Survey (ESPS 2010-2014) and the Disability Healthcare Household Survey 2008 (HSM 2008), which collected similar data on 61 chronic or recurrent conditions. Adults aged ≥25 years in either ESPS 2010 (14,875) or HSM 2008 (23,348) were considered (participation rates were 65% and 62%, respectively). Longitudinal analyses included 7,438 participants of ESPS 2010 with follow-up for mortality (97%), of whom 3,798 were reinterviewed in 2014 (52%). Mortality, activity limitation, self-reported health, difficulties in activities/instrumental activities of daily living, and the Medical Outcomes Study Short-Form 12-Item Health Survey were the health status measures. Multiple regression models were used to estimate the impact of chronic or recurrent conditions and multimorbid associations (dyads, triads, and tetrads) on health status. Etiological pathways explaining associations were investigated, and joint effects and interactions between conditions on health status measures were evaluated on both additive and multiplicative scales. Forty-eight chronic or recurrent conditions had an independent impact on mortality, activity limitations, or perceived health. Multimorbidity prevalence varied between 30% (1-year time frame) and 39% (lifetime frame) and varied markedly by sex (higher in women), age (with the greatest increases in middle age), and socioeconomic status (higher in less educated and low-income individuals and manual workers). We identified various multimorbid combinations, mostly involving vasculometabolic and musculoskeletal conditions and mental disorders, which could be explained by direct causation, shared or associated risk factors, or, less frequently, confounding or chance.
Combinations with the highest health impacts included diseases with complications but also associations of conditions affecting systems involved in locomotion and sensorial functions (impact on activity limitations), and associations including mental disorders (impact on perceived health). The interaction effects of the associated conditions varied on a continuum from subadditive and additive (associations involving cardiometabolic conditions, low back pain, osteoporosis, injury sequelae, depression, and anxiety) to multiplicative and supermultiplicative (associations involving obesity, chronic obstructive pulmonary disease, migraine, and certain osteoarticular pathologies). Study limitations included self-reported information on chronic conditions and the insufficient power of some analyses.
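The additive versus multiplicative interaction scales described above can be made concrete with a short sketch; the absolute risks below are invented for illustration only, not the study's estimates:

```python
# Hypothetical absolute risks of an outcome by exposure to two conditions A
# and B: neither, A only, B only, both (numbers illustrative).
r00, r10, r01, r11 = 0.05, 0.12, 0.10, 0.30

# Relative risks versus the doubly unexposed group
rr10, rr01, rr11 = r10 / r00, r01 / r00, r11 / r00

# Additive scale: RERI > 0 indicates a superadditive joint effect
reri = rr11 - rr10 - rr01 + 1
# Multiplicative scale: ratio > 1 indicates a supermultiplicative joint effect
mult = rr11 / (rr10 * rr01)
print(f"RERI = {reri:.2f}, multiplicative interaction = {mult:.2f}")
```

With these illustrative numbers the joint effect is both superadditive (RERI = 2.6) and supermultiplicative (ratio = 1.25); a pair of conditions can sit anywhere on the continuum from subadditive to supermultiplicative, which is the gradient the abstract describes.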
Conclusions
Multimorbidity assessments should move beyond simply counting conditions and take into account the variable impacts on health status, etiological pathways, and joint effects of associated conditions. In particular, the multimorbid combinations with substantial health impacts or shared risk factors deserve closer attention. Our findings also suggest that multimorbidity assessment and management may be beneficial already in midlife and probably earlier in disadvantaged groups.



PLoS Med: 25 Apr 2021; 18:e1003584 | PMID: 33901171
Abstract

Housing environment and early childhood development in sub-Saharan Africa: A cross-sectional analysis.

Gao Y, Zhang L, Kc A, Wang Y, ... Mi X, Zhou H
Background
The influence of the safety and security of environments on early childhood development (ECD) has been under-explored. Although housing might be linked to ECD by affecting a child's health and a parent's ability to provide adequate care, only a few studies have examined this factor. We hypothesized that housing environment is associated with ECD in sub-Saharan Africa (SSA).
Methods and findings
From 92,433 children aged 36 to 59 months who participated in the Multiple Indicator Cluster Survey (MICS) in 20 SSA countries, 88,271 were tested for cognitive and social-emotional development using the Early Childhood Development Index (ECDI) questionnaire and were thus included in this cross-sectional analysis. Children's mean age was 47.2 months, and 49.8% were girls. Children were considered developmentally on track in a given domain if they failed no more than 1 ECDI item in that domain. In each country, we used conditional logistic regression models to estimate the association between improved housing (housing with finished building materials, improved drinking water, improved sanitation facilities, and sufficient living area) and children's cognitive and social-emotional development, accounting for contextual effects and socioeconomic factors. Estimates from each country were pooled using random-effects meta-analyses. Subgroup analyses were conducted by the child's gender, maternal education, and household wealth quintiles. On-track cognitive development was associated with improved housing (odds ratio [OR] = 1.15, 95% CI 1.06 to 1.24, p < 0.001), improved drinking water (OR = 1.07, 95% CI 1.00 to 1.14, p = 0.046), improved sanitation facilities (OR = 1.15, 95% CI 1.03 to 1.28, p = 0.014), and sufficient living area (OR = 1.06, 95% CI 1.01 to 1.10, p = 0.018). On-track social-emotional development was associated with improved housing only in girls (OR = 1.14, 95% CI 1.04 to 1.25, p = 0.006). The main limitations of this study included the cross-sectional nature of the datasets and the use of the ECDI, which lacks sensitivity to measure ECD outcomes.
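The cross-country pooling step can be sketched as inverse-variance weighting of log odds ratios; the per-country ORs below are hypothetical, and the random-effects version used in the study would add a between-country variance term to each weight (omitted here for brevity):

```python
import math

# Hypothetical per-country odds ratios with 95% CIs (illustrative only,
# not the MICS results).
country_or = [(1.20, 1.05, 1.37), (1.10, 0.95, 1.27), (1.18, 1.02, 1.36)]

# Recover log-OR and its standard error from each CI: SE = (ln hi - ln lo) / 3.92
logs = [math.log(o) for o, _, _ in country_or]
ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in country_or]

# Inverse-variance (fixed-effect) pooling on the log scale
w = [1 / s ** 2 for s in ses]
pooled = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
se = math.sqrt(1 / sum(w))
print(f"pooled OR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f}-{math.exp(pooled + 1.96 * se):.2f})")
```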
Conclusions
In this study, we observed that improved housing was associated with on-track cognitive development and with on-track social-emotional development in girls. These findings suggest that housing improvement in SSA may be associated not only with benefits for children's physical health but also with broader aspects of healthy child development.



PLoS Med: 18 Apr 2021; 18:e1003578 | PMID: 33872322
Abstract

Health information technology interventions and engagement in HIV care and achievement of viral suppression in publicly funded settings in the US: A cost-effectiveness analysis.

Shade SB, Marseille E, Kirby V, Chakravarty D, ... Cajina A, Myers JJ
Background
The US National HIV/AIDS Strategy (NHAS) emphasizes the use of technology to facilitate coordination of comprehensive care for people with HIV. We examined cost-effectiveness from the health system perspective of 6 health information technology (HIT) interventions implemented during 2008 to 2012 in a Ryan White HIV/AIDS Program (RWHAP) Special Projects of National Significance (SPNS) Program demonstration project.
Methods and findings
HIT interventions were implemented at 6 sites: Bronx, New York; Durham, North Carolina; Long Beach, California; New Orleans, Louisiana; New York, New York (2 sites); and Paterson, New Jersey. These interventions included: (1) use of HIV surveillance data to identify out-of-care individuals; (2) extension of access to electronic health records (EHRs) to support service providers; (3) use of electronic laboratory ordering and prescribing; and (4) development of a patient portal. We employed standard microcosting techniques to estimate costs (in 2018 US dollars) associated with intervention implementation. Data from a sample of electronic patient records from each demonstration site were analyzed to compare prescription of antiretroviral therapy (ART), CD4 cell counts, and suppression of viral load, before and after implementation of the interventions. Markov models were used to estimate additional healthcare costs and quality-adjusted life-years saved as a result of each intervention. Overall, the demonstration site interventions cost $3,913,313 (range = $287,682 to $998,201) among 3,110 individuals (range = 258 to 1,181) over 3 years. Changes in the proportion of patients prescribed ART ranged from a decrease from 87.0% to 72.7% at Site 4 to an increase from 74.6% to 94.2% at Site 6; changes in the proportion of patients with 0 to 200 CD4 cells/mm3 ranged from a decrease from 20.2% to 11.0% at Site 6 to an increase from 16.7% to 30.2% at Site 2; and changes in the proportion of patients with undetectable viral load ranged from a decrease from 84.6% to 46.0% at Site 1 to an increase from 67.0% to 69.9% at Site 5. Four of the 6 interventions (including use of HIV surveillance data to identify out-of-care individuals, use of electronic laboratory ordering and prescribing, and development of a patient portal) were not only cost-effective but also cost saving ($6.87 to $14.91 saved per dollar invested).
In contrast, the 2 interventions that extended access to EHRs to support service providers were not effective and, therefore, not cost-effective. Most interventions remained either cost saving or not cost-effective under all sensitivity analysis scenarios. The intervention that used HIV surveillance data to identify out-of-care individuals was no longer cost saving when the effect of HIV on an individual's health status was reduced and when the natural progression of HIV was increased. The results of this study are limited in that we did not have contemporaneous controls for each intervention; thus, we are only able to assess sites against themselves at baseline and not against standard of care during the same time period.
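A Markov cohort model of the kind used in such cost-effectiveness analyses can be sketched in a few lines; the states, transition probabilities, utilities, and time horizon below are illustrative assumptions, not the study's parameters:

```python
# Minimal 3-state Markov cohort sketch (states: suppressed, unsuppressed, dead).
# Transition probabilities and QALY weights are illustrative, not the study's.
P = [
    [0.90, 0.08, 0.02],   # from suppressed
    [0.30, 0.60, 0.10],   # from unsuppressed
    [0.00, 0.00, 1.00],   # dead is absorbing
]
utility = [0.90, 0.70, 0.0]   # QALY weight per state per 1-year cycle
discount = 0.03

cohort = [1.0, 0.0, 0.0]      # everyone starts virally suppressed
qalys = 0.0
for year in range(20):
    # accumulate discounted QALYs for the current state distribution
    qalys += sum(c * u for c, u in zip(cohort, utility)) / (1 + discount) ** year
    # advance the cohort one cycle through the transition matrix
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]
print(f"discounted QALYs per person over 20 years: {qalys:.2f}")
```

Running the model twice, with and without an intervention's effect on the transition probabilities, and dividing the cost difference by the QALY difference gives the cost-effectiveness ratio.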
Conclusions
These results provide additional support for the use of HIT as a tool to enhance rapid and effective treatment of HIV to achieve sustained viral suppression. HIT has the potential to increase utilization of services, improve health outcomes, and reduce subsequent transmission of HIV.



PLoS Med: 06 Apr 2021; 18:e1003389 | PMID: 33826617
Abstract

Epidemiological, clinical, and public health response characteristics of a large outbreak of diphtheria among the Rohingya population in Cox's Bazar, Bangladesh, 2017 to 2019: A retrospective study.

Polonsky JA, Ivey M, Mazhar KA, Rahman Z, ... Salam A, White K
Background
Unrest in Myanmar in August 2017 resulted in the movement of over 700,000 Rohingya refugees to overcrowded camps in Cox's Bazar, Bangladesh. A large outbreak of diphtheria subsequently began in this population.
Methods and findings
Data were collected during mass vaccination campaigns (MVCs) and contact tracing activities, and from 9 Diphtheria Treatment Centers (DTCs) operated by national and international organizations. These data were used to describe the epidemiological and clinical features of the outbreak and the control measures taken to prevent transmission during its first 2 years. Between November 10, 2017 and November 9, 2019, 7,064 cases were reported: 285 (4.0%) laboratory-confirmed, 3,610 (51.1%) probable, and 3,169 (44.9%) suspected cases. The crude attack rate was 51.5 cases per 10,000 person-years, and the epidemic doubling time was 4.4 days (95% confidence interval [CI] 4.2-4.7) during the exponential growth phase. The median age was 10 years (range 0-85), and 3,126 (44.3%) were male. The typical symptoms were sore throat (93.5%), fever (86.0%), pseudomembrane (34.7%), and gross cervical lymphadenopathy (GCL; 30.6%). Diphtheria antitoxin (DAT) was administered to 1,062 (89.0%) of 1,193 eligible patients, with adverse reactions in 229 (21.6%). There were 45 deaths (case fatality ratio [CFR] 0.6%). Household contacts for 5,702 (80.7%) of 7,064 cases were successfully traced. A total of 41,452 contacts were identified, of whom 40,364 (97.4%) consented to begin chemoprophylaxis; adherence was 55.0% (N = 22,218) at 3-day follow-up. Unvaccinated household contacts received 3 vaccine doses (at 4-week intervals), while a booster dose was administered if the primary vaccination schedule had already been completed. The proportion of contacts vaccinated was 64.7% overall. Three MVC rounds were conducted, with administrative coverage varying between 88.5% and 110.4%. Pentavalent vaccine was administered to those aged 6 weeks to 6 years, and tetanus and diphtheria (Td) vaccine to those aged 7 years and older.
Lack of adequate diagnostic capacity to confirm cases was the main limitation, with a majority of cases unconfirmed and the proportion of true diphtheria cases unknown.
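The doubling-time estimate quoted above follows from the exponential growth rate as Td = ln(2)/r; a minimal sketch fitting r by least squares on log counts, with invented early-phase case numbers:

```python
import math

# Hypothetical early-phase daily case counts (roughly 17% growth per day;
# illustrative only, not the outbreak data).
days = [0, 1, 2, 3, 4, 5, 6]
cases = [10, 12, 14, 17, 20, 24, 28]

# Least-squares slope of log(cases) on day gives the growth rate r per day
n = len(days)
xbar = sum(days) / n
ybar = sum(math.log(c) for c in cases) / n
slope = (sum((d - xbar) * (math.log(c) - ybar) for d, c in zip(days, cases))
         / sum((d - xbar) ** 2 for d in days))

# Doubling time: the time for cases to double under exponential growth
print(f"growth rate r = {slope:.3f}/day, doubling time = {math.log(2) / slope:.1f} days")
```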
Conclusions
To our knowledge, this is the largest reported diphtheria outbreak in refugee settings. We observed that high population density, poor living conditions, and a fast growth rate were associated with explosive expansion of the outbreak during the initial exponential growth phase. Three rounds of mass vaccination targeting those aged 6 weeks to 14 years were associated with only modestly reduced transmission, and additional public health measures were necessary to end the outbreak. The outbreak had a long-lasting tail, with Rt oscillating around 1 for an extended period. An adequate global DAT stockpile needs to be maintained. All populations must have access to health services and routine vaccination, and this access must be maintained during humanitarian crises.



PLoS Med: 31 Mar 2021; 18:e1003587 | PMID: 33793554
Abstract

Maternal weight change from prepregnancy to 18 months postpartum and subsequent risk of hypertension and cardiovascular disease in Danish women: A cohort study.

Kirkegaard H, Bliddal M, Støvring H, Rasmussen KM, ... Sørensen TIA, Nøhr EA
Background
One-fourth of women weigh substantially more in the years after childbirth than before pregnancy. We examined weight change from prepregnancy to 18 months postpartum in relation to subsequent maternal risk of hypertension and cardiovascular disease (CVD).
Methods and findings
We conducted a cohort study of 47,966 women with a live-born singleton within the Danish National Birth Cohort (DNBC; 1997-2002). Interviews during pregnancy and at 6 and 18 months postpartum provided information on height, gestational weight gain (GWG), postpartum weights, and maternal characteristics. Information on pregnancy complications, incident hypertension, and CVD was obtained from the National Patient Register. Using Cox regression, we estimated adjusted hazard ratios (HRs; 95% confidence intervals [CIs]) for hypertension and CVD through 16 years of follow-up. During this period, 2,011 women received a hospital diagnosis of hypertension and 1,321 of CVD. The women were on average 32.3 years old (range 18.0-49.2) at the start of follow-up, 73% had a prepregnancy BMI <25, and 27% a prepregnancy BMI ≥25. Compared with a stable weight (±1 BMI unit), weight gains from prepregnancy to 18 months postpartum of >1-2 and >2 BMI units were associated with 25% (10%-42%, P = 0.001) and 31% (14%-52%, P < 0.001) higher risks of hypertension, respectively. These risks were similar whether the weight gain represented postpartum weight retention or a new gain from 6 months to 18 months postpartum, and whether GWG was below, within, or above the recommendations. For CVD, findings differed according to prepregnancy BMI. In women with normal weight/underweight, weight gain >2 BMI units and weight loss >1 BMI unit were associated with 48% (17%-87%, P = 0.001) and 28% (6%-55%, P = 0.01) higher risks of CVD, respectively. Further, weight loss >1 BMI unit combined with a GWG below that recommended was associated with a 70% (24%-135%, P = 0.001) higher risk of CVD. No such increased risks were observed among women with overweight/obesity (interaction by prepregnancy BMI, P = 0.01, 0.03, and 0.03, respectively). The limitations of this observational study include potential confounding by prepregnancy metabolic health and self-reported maternal weights, which may lead to some misclassification.
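The BMI-unit weight-change categories that define the exposure above can be sketched as follows; the function names, thresholds, and example values are illustrative of the grouping described, not the study's code:

```python
# Categorising weight change in BMI units, as in the exposure definition above
# (helper names and example inputs are hypothetical).
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def weight_change_category(pre_kg, post18m_kg, height_m):
    """Classify prepregnancy-to-18-months-postpartum change in BMI units."""
    delta = bmi(post18m_kg, height_m) - bmi(pre_kg, height_m)
    if delta < -1:
        return "loss >1 BMI unit"
    if delta <= 1:
        return "stable (+/-1 BMI unit)"
    if delta <= 2:
        return "gain >1-2 BMI units"
    return "gain >2 BMI units"

# e.g. a 1.68 m woman going from 65 kg to 72 kg gains about 2.5 BMI units
print(weight_change_category(65, 72, 1.68))
```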
Conclusions
Postpartum weight retention/new gain in all mothers and postpartum weight loss in mothers with normal-/underweight may be associated with later adverse cardiovascular health.



PLoS Med: 30 Mar 2021; 18:e1003486 | PMID: 33798198
Abstract

Estimated impact of tafenoquine for Plasmodium vivax control and elimination in Brazil: A modelling study.

Nekkab N, Lana R, Lacerda M, Obadia T, ... Mueller I, White M
Background
Despite recent intensification of control measures, Plasmodium vivax poses a major challenge for malaria elimination efforts. Liver-stage hypnozoite parasites that cause relapsing infections can be cleared with primaquine; however, poor treatment adherence undermines drug effectiveness. Tafenoquine, a new single-dose treatment, offers an alternative option for preventing relapses and reducing transmission. In 2018, over 237,000 cases of malaria were reported to the Brazilian health system, of which 91.5% were due to P. vivax.
Methods and findings
We evaluated the impact of introducing tafenoquine into case management practices on population-level transmission dynamics using a mathematical model of P. vivax transmission. The model was calibrated to reflect the transmission dynamics of P. vivax endemic settings in Brazil in 2018, informed by nationwide malaria case reporting data. Parameters for treatment pathways with chloroquine, primaquine, and tafenoquine with glucose-6-phosphate dehydrogenase deficiency (G6PDd) testing were informed by clinical trial data and the literature. We assumed 71.3% efficacy for primaquine and tafenoquine, a 66.7% adherence rate to the 7-day primaquine regimen, a mean 5.5% G6PDd prevalence, and 8.1% low-metaboliser prevalence. The introduction of tafenoquine is predicted to improve effective hypnozoite clearance among P. vivax cases and reduce population-level transmission over time, with heterogeneous levels of impact across different transmission settings. According to the model, while achieving elimination in only a few settings in Brazil, tafenoquine rollout in 2021 is estimated to improve the mean effective radical cure rate from 42% (95% uncertainty interval [UI] 41%-44%) to 62% (95% UI 54%-68%) among clinical cases, leading to a predicted 38% (95% UI 7%-99%) reduction in transmission and over 214,000 cumulative averted cases between 2021 and 2025. Higher impact is predicted in settings with low transmission, low pre-existing primaquine adherence, and a high proportion of cases in working-aged males. High-transmission settings with a high proportion of cases in children would benefit from a safe, high-efficacy tafenoquine dose for children. Our methodological limitations include not accounting for the role of imported cases from outside the transmission setting, relying on reported clinical cases as a measure of community-level transmission, and implementing treatment efficacy as a binary condition.
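The gap between primaquine and tafenoquine effectiveness largely reflects adherence. A back-of-envelope sketch using the inputs quoted above illustrates this; the pathway structure is a deliberate simplification (it ignores low-metaboliser prevalence, G6PD testing accuracy, and care-seeking), so it only approximates the modelled 42% and 62%:

```python
# Back-of-envelope effective radical cure per treated case, using inputs from
# the abstract; the simple multiplicative pathway is an assumption, not the
# study's transmission model.
efficacy = 0.713          # assumed equal for primaquine and tafenoquine
pq_adherence = 0.667      # 7-day primaquine regimen
tq_adherence = 1.0        # single-dose tafenoquine (assumed fully observed)
g6pd_deficient = 0.055    # ineligible for 8-aminoquinoline radical cure

eligible = 1 - g6pd_deficient
pq_cure = eligible * pq_adherence * efficacy
tq_cure = eligible * tq_adherence * efficacy
print(f"primaquine:  {pq_cure:.1%} effectively cured")
print(f"tafenoquine: {tq_cure:.1%} effectively cured")
```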
Conclusions
In our modelling study, we predicted that, provided there is concurrent rollout of G6PDd diagnostics, tafenoquine has the potential to reduce P. vivax transmission by improving effective radical cure through increased adherence and increased protection from new infections. While tafenoquine alone may not be sufficient for P. vivax elimination, its introduction will improve case management, prevent a substantial number of cases, and bring countries closer to achieving malaria elimination goals.



PLoS Med: 30 Mar 2021; 18:e1003535 | PMID: 33891582
Abstract

Glucose-6-phosphate dehydrogenase activity in individuals with and without malaria: Analysis of clinical trial, cross-sectional and case-control data from Bangladesh.

Ley B, Alam MS, Kibria MG, Marfurt J, ... Khan WA, Price RN
Background
Glucose-6-phosphate dehydrogenase (G6PD) activity is dependent upon G6PD genotype and age of the red blood cell (RBC) population, with younger RBCs having higher activity. Peripheral parasitemia with Plasmodium spp. induces hemolysis, replacing older RBCs with younger cells with higher G6PD activity. This study aimed to assess whether G6PD activity varies between individuals with and without malaria or a history of malaria.
Methods and findings
Individuals living in the Chittagong Hill Tracts of Bangladesh were enrolled into 3 complementary studies: (i) a prospective, single-arm clinical efficacy trial of patients (n = 175) with uncomplicated malaria done between 2014 and 2015, (ii) a cross-sectional survey done between 2015 and 2016 (n = 999), and (iii) a matched case-control study of aparasitemic individuals with and without a history of malaria done in 2020 (n = 506). G6PD activity was compared between individuals with and without malaria diagnosed by microscopy, rapid diagnostic test (RDT), or polymerase chain reaction (PCR), and in aparasitemic participants with and without a history of malaria. In the cross-sectional survey and clinical trial, 15.5% (182/1,174) of participants had peripheral parasitemia detected by microscopy or RDT, 3.1% (36/1,174) were positive by PCR only, and 81.4% (956/1,174) were aparasitemic. Aparasitemic individuals had significantly lower G6PD activity (median 6.9 U/g Hb, IQR 5.2-8.6) than those with peripheral parasitemia detected by microscopy or RDT (7.9 U/g Hb, IQR 6.6-9.8, p < 0.001), but G6PD activity similar to those with parasitemia detected by PCR alone (submicroscopic parasitemia) (6.1 U/g Hb, IQR 4.8-8.6, p = 0.312). In total, 7.7% (14/182) of patients with malaria had G6PD activity < 70% compared to 25.0% (248/992) of participants with submicroscopic or no parasitemia (odds ratio [OR] 0.25, 95% CI 0.14-0.44, p < 0.001). In the case-control study, the median G6PD activity was 10.3 U/g Hb (IQR 8.8-12.2) in 253 patients with a history of malaria and 10.2 U/g Hb (IQR 8.7-11.8) in 253 individuals without a history of malaria (p = 0.323). The proportion of individuals with G6PD activity < 70% was 11.5% (29/253) in the cases and 15.4% (39/253) in the controls (OR 0.7, 95% CI 0.41-1.23, p = 0.192). Limitations of the study included the non-contemporaneous nature of the clinical trial and cross-sectional survey.
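The reported odds ratio for G6PD activity < 70% can be reproduced directly from the 2 × 2 counts given above, using the standard Woolf method for the confidence interval:

```python
import math

# 2x2 table from the abstract: G6PD activity < 70% in 14/182 malaria patients
# vs 248/992 participants with submicroscopic or no parasitemia.
a, b = 14, 182 - 14    # malaria: deficient / not deficient
c, d = 248, 992 - 248  # no malaria: deficient / not deficient

or_ = (a / b) / (c / d)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf SE of log(OR)
lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # matches the reported 0.25 (0.14-0.44)
```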
Conclusions
Patients with acute malaria had significantly higher G6PD activity than individuals without malaria, and this could not be accounted for by a protective effect of G6PD deficiency. G6PD-deficient patients with malaria may have higher than expected G6PD enzyme activity and an attenuated risk of primaquine-induced hemolysis compared to the risk when not infected.



PLoS Med: 30 Mar 2021; 18:e1003576 | PMID: 33891581
Abstract

Healthcare without borders: A cross-sectional study of immigrant and nonimmigrant children admitted to a large public sector hospital in the Gauteng Province of South Africa.

Janse van Rensburg GH, Feucht UD, Makin J, le Clus N, Avenant T
Background
Human migration is a worldwide phenomenon that receives considerable attention from the media and healthcare authorities alike. A significant proportion of children seen at public sector health facilities in South Africa (SA) are immigrants, and gaps have previously been noted in their healthcare provision. The objective of the study was to describe the characteristics and differences between the immigrant and SA children admitted to Kalafong Provincial Tertiary Hospital (KPTH), a large public sector hospital in the urban Gauteng Province of SA.
Methods and findings
A cross-sectional study was conducted over a 4-month period during 2016 to 2017. Information was obtained through a structured questionnaire and health record review. The enrolled study participants included 508 children divided into 2 groups, namely 271 general paediatric patients and 237 neonates. Twenty-five percent of children in the neonatal group and 22.5% in the general paediatric group were immigrants. The parents/caregivers of the immigrant group had a lower educational level (p < 0.0001 neonatal and paediatric), lower income (neonatal p < 0.001; paediatric p = 0.024), difficulty communicating in English (p < 0.001 neonatal and paediatric), and were more likely to reside in informal settlements (neonatal p = 0.001; paediatric p = 0.007) compared to the SA group. In the neonatal group, there was no difference in the number of antenatal care (ANC) visits, type of delivery, gestational age, or birth weight. In the general paediatric group, there was no difference in immunisation and vitamin A supplementation coverage, but the immigrant group had more malnutrition than the SA group (p = 0.029 for wasting). There was no difference in the prevalence of maternal human immunodeficiency virus (HIV) infection, with equally good prevention of mother-to-child transmission (PMTCT) coverage. There was also no difference in reported difficulties by immigrants in terms of access to healthcare (neonatal p = 0.379; paediatric p = 0.246), although a large proportion (10%) of the neonates of immigrant mothers were born outside a medical facility.
Conclusions
Although there were health-related differences between immigrant and SA children accessing in-hospital care, these were fewer than expected. Differences were found in parental educational level and socioeconomic factors, but these did not significantly affect ANC attendance, delivery outcomes, immunisation coverage, HIV prevalence, or PMTCT coverage. The immigrant population should be viewed as a high-risk group, with potential problems including suboptimal child growth. Health workers should advocate for all children in the community they are serving and promote tolerance, respect, and equal healthcare access.



PLoS Med: 27 Feb 2021; 18:e1003565 | PMID: 33755665
Abstract

Safety and tolerability of natural and synthetic cannabinoids in adults aged over 50 years: A systematic review and meta-analysis.

Velayudhan L, McGoohan K, Bhattacharyya S
Background
Cannabinoid-based medicines (CBMs) are being used widely in older adults. However, their safety and tolerability in this population remain unclear. We aimed to conduct a systematic review and meta-analysis of the safety and tolerability of CBMs in adults aged ≥50 years.
Methods and findings
A systematic search was performed using MEDLINE, PubMed, EMBASE, CINAHL, PsycInfo, the Cochrane Library, and ClinicalTrials.gov (1 January 1990 to 3 October 2020). Randomised clinical trials (RCTs) of CBMs for any indication in populations with a mean age of ≥50 years, evaluating the safety/tolerability of CBMs where adverse events were quantified, were included. Study quality was assessed using the GRADE (Grading of Recommendations Assessment, Development, and Evaluation) criteria, and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Two reviewers conducted all review stages independently. Where possible, data were pooled using random-effects meta-analysis. Effect sizes were calculated as incidence rate ratios (IRRs) for outcomes such as adverse events (AEs), serious AEs (SAEs), and death, and as risk ratios (RRs) for withdrawal from study, and were reported separately for studies using tetrahydrocannabinol (THC), the THC:cannabidiol (CBD) combination, and CBD. A total of 46 RCTs were identified as suitable for inclusion, of which 31 (67%) were conducted in the United Kingdom and Europe. There were 6,216 patients (mean age 58.6 ± 7.5 years; 51% male) included in the analysis, with 3,469 receiving CBMs. Compared with controls, THC-containing CBMs significantly increased the incidence of all-cause and treatment-related AEs: THC alone (IRR 1.42 [95% CI 1.12 to 1.78] and IRR 1.60 [95% CI 1.26 to 2.04], respectively) and the THC:CBD combination (IRR 1.58 [95% CI 1.26 to 1.98] and IRR 1.70 [95% CI 1.24 to 2.33], respectively). IRRs of SAEs and deaths were not significantly greater with CBMs containing THC, with or without CBD. The THC:CBD combination (RR 1.40 [95% CI 1.08 to 1.80]) but not THC alone (RR 1.18 [95% CI 0.89 to 1.57]) significantly increased the risk of AE-related withdrawals.
CBD alone did not increase the incidence of all-cause AEs (IRR 1.02 [95% CI 0.90 to 1.16]) or of other outcomes, as per the qualitative synthesis. AE-related withdrawals were significantly associated with THC dose in both THC-only (QM [df = 1] = 4.696, p = 0.03) and THC:CBD combination treatment (QM [df = 1] = 4.554, p = 0.033). THC-containing CBMs significantly increased the incidence of dry mouth, dizziness/light-headedness, and somnolence/drowsiness. Study limitations include the inability to fully exclude data from those <50 years of age in our primary analyses, as well as weaknesses in the included trials, particularly incomplete reporting of outcomes and heterogeneity across included studies.
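An incidence rate ratio of the kind reported above is computed from event counts and person-time at risk; the counts below are invented for illustration, not the trial data:

```python
import math

# Hypothetical adverse-event counts and person-time (illustrative only).
events_tx, time_tx = 142, 100.0   # AEs and person-years, THC-containing arm
events_ct, time_ct = 100, 100.0   # AEs and person-years, control arm

irr = (events_tx / time_tx) / (events_ct / time_ct)
se = math.sqrt(1 / events_tx + 1 / events_ct)   # SE of log(IRR), Poisson counts
lo = math.exp(math.log(irr) - 1.96 * se)
hi = math.exp(math.log(irr) + 1.96 * se)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A confidence interval that excludes 1 (as here, since the lower bound is above 1) corresponds to a statistically significant excess event rate in the treated arm.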
Conclusions
This pooled analysis, using data from RCTs with mean participant age ≥50 years, suggests that although THC-containing CBMs are associated with side effects, CBMs in general are safe and acceptable in older adults. However, THC:CBD combinations may be less acceptable in the dose ranges used and their tolerability may be different in adults over 65 or 75 years of age.



PLoS Med: 27 Feb 2021; 18:e1003524 | PMID: 33780450
Abstract

Evaluation of the Arabin cervical pessary for prevention of preterm birth in women with a twin pregnancy and short cervix (STOPPIT-2): An open-label randomised trial and updated meta-analysis.

Norman JE, Norrie J, MacLennan G, Cooper D, ... Denton J, STOPPIT-2 collaborative group
Background
Preterm-labour-associated preterm birth is a common cause of perinatal mortality and morbidity in twin pregnancy. We aimed to test the hypothesis that the Arabin pessary would reduce preterm-labour-associated preterm birth by 40% or greater in women with a twin pregnancy and a short cervix.
Methods and findings
We conducted an open-label randomised controlled trial in 57 hospital antenatal clinics in the UK and Europe. From 1 April 2015 to 14 February 2019, 2,228 women with a twin pregnancy underwent cervical length screening between 18 weeks 0 days and 20 weeks 6 days of gestation. In total, 503 women with cervical length ≤35 mm were randomly assigned to pessary in addition to standard care (n = 250, mean age 32.4 years, mean cervical length 29 mm, with pessary inserted in 230 women [92.0%]) or standard care alone (n = 253, mean age 32.7 years, mean cervical length 30 mm). The pessary was inserted before 21 completed weeks of gestation and removed between 35 and 36 weeks, or before birth if earlier. The primary obstetric outcome, spontaneous onset of labour and birth before 34 weeks 0 days of gestation, occurred in 46/250 (18.4%) in the pessary group compared to 52/253 (20.6%) with standard care alone (adjusted odds ratio [aOR] 0.87 [95% CI 0.55-1.38], p = 0.54). The primary neonatal outcome, a composite of stillbirth, neonatal death, periventricular leukomalacia, early respiratory morbidity, intraventricular haemorrhage, necrotising enterocolitis, or proven sepsis from birth to 28 days after the expected date of delivery, occurred in 67/500 infants (13.4%) in the pessary group compared to 76/506 (15.0%) with standard care alone (aOR 0.86 [95% CI 0.54-1.36], p = 0.50). The positive and negative likelihood ratios of a short cervix (≤35 mm) for predicting preterm birth before 34 weeks were 2.14 and 0.83, respectively. A meta-analysis of data from existing publications (4 studies, 313 women) and from STOPPIT-2 indicated that a cervical pessary does not reduce preterm birth before 34 weeks in women with a short cervix (risk ratio 0.74 [95% CI 0.50-1.11], p = 0.15). No women died in either arm of the study; 4.4% of babies in the Arabin pessary group and 5.5% of babies in the standard treatment group died in utero or in the neonatal period (p = 0.53).
Study limitations include a lack of power both to exclude a reduction smaller than 40% in preterm-labour-associated preterm birth and to be conclusive about subgroup analyses.
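The likelihood ratios reported above are simple functions of a screening test's sensitivity and specificity. A minimal sketch, using illustrative sensitivity and specificity values rather than the trial's underlying counts:

```python
def likelihood_ratios(sensitivity, specificity):
    """Return (LR+, LR-) for a binary screening test.

    LR+ = P(test+ | disease) / P(test+ | no disease)
    LR- = P(test- | disease) / P(test- | no disease)
    """
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Illustrative inputs only; the trial reports LR+ = 2.14 and LR- = 0.83
# for cervical length <= 35 mm predicting birth before 34 weeks.
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.45, specificity=0.79)
```

An LR+ near 2 and an LR− near 0.8, as reported here, mean a short cervix shifts the pre-test odds of preterm birth only modestly in either direction.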
Conclusions
These results led us to reject our hypothesis that the Arabin pessary would reduce the risk of the primary outcome by 40%. Smaller treatment effects cannot be ruled out.
Trial registration
ISRCTN Registry ISRCTN 02235181. ClinicalTrials.gov NCT02235181.



Norman JE, Norrie J, MacLennan G, Cooper D, ... Denton J, STOPPIT-2 collaborative group
PLoS Med: 27 Feb 2021; 18:e1003506 | PMID: 33780463
Abstract

Global burden of influenza-associated lower respiratory tract infections and hospitalizations among adults: A systematic review and meta-analysis.

Lafond KE, Porter RM, Whaley MJ, Suizan Z, ... Widdowson MA, Global Respiratory Hospitalizations–Influenza Proportion Positive (GRIPP) Working Group
Background
Influenza illness burden is substantial, particularly among young children, older adults, and those with underlying conditions. Initiatives are underway to develop better global estimates for influenza-associated hospitalizations and deaths. Knowledge gaps remain regarding the role of influenza viruses in severe respiratory disease and hospitalizations among adults, particularly in lower-income settings.
Methods and findings
We aggregated published data from a systematic review and unpublished data from surveillance platforms to generate global meta-analytic estimates for the proportion of acute respiratory hospitalizations associated with influenza viruses among adults. We searched 9 online databases (Medline, Embase, CINAHL, Cochrane Library, Scopus, Global Health, LILACS, WHOLIS, and CNKI; 1 January 1996-31 December 2016) to identify observational studies of influenza-associated hospitalizations in adults, and assessed eligible papers for bias using a simplified Newcastle-Ottawa scale for observational data. We applied meta-analytic proportions to global estimates of lower respiratory infections (LRIs) and hospitalizations from the Global Burden of Disease study in adults ≥20 years and by age groups (20-64 years and ≥65 years) to obtain the number of influenza-associated LRI episodes and hospitalizations for 2016. Data from 63 sources showed that influenza was associated with 14.1% (95% CI 12.1%-16.5%) of acute respiratory hospitalizations among all adults, with no significant differences by age group. The 63 data sources represent published observational studies (n = 28) and unpublished surveillance data (n = 35), from all World Health Organization regions (Africa, n = 8; Americas, n = 11; Eastern Mediterranean, n = 7; Europe, n = 8; Southeast Asia, n = 11; Western Pacific, n = 18). Data quality for published data sources was predominantly moderate or high (75%, n = 56/75). We estimate 32,126,000 (95% CI 20,484,000-46,129,000) influenza-associated LRI episodes and 5,678,000 (95% CI 3,205,000-9,432,000) LRI hospitalizations occur each year among adults. While adults <65 years contribute most influenza-associated LRI hospitalizations and episodes (3,464,000 [95% CI 1,885,000-5,978,000] LRI hospitalizations and 31,087,000 [95% CI 19,987,000-44,444,000] LRI episodes), hospitalization rates were highest in those ≥65 years (437/100,000 person-years [95% CI 265-612/100,000 person-years]). 
For this analysis, published articles were limited in their inclusion of stratified testing data by year and age group. Lack of information regarding influenza vaccination of the study population was also a limitation across both types of data sources.
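The burden estimates above come from multiplying the pooled influenza-positive proportion by external totals of adult LRI episodes and hospitalizations from the Global Burden of Disease study. A minimal sketch of that scaling step, using the abstract's pooled proportion with a hypothetical total (the actual GBD input is not given here):

```python
def attribute_burden(total_events, flu_proportion):
    """Estimate the influenza-associated share of an event total by
    scaling it by the meta-analytic influenza-positive proportion."""
    return total_events * flu_proportion

# Pooled proportion from the meta-analysis: 14.1% of acute
# respiratory hospitalizations among adults tested influenza-positive.
FLU_PROPORTION = 0.141

# Hypothetical total of adult LRI hospitalizations (illustration only).
total_hospitalizations = 40_000_000
flu_hospitalizations = attribute_burden(total_hospitalizations, FLU_PROPORTION)
```

The published confidence intervals then propagate uncertainty in both the pooled proportion and the GBD totals, which this sketch omits.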
Conclusions
In this meta-analysis, we estimated that influenza viruses are associated with over 5 million hospitalizations worldwide per year. Inclusion of both published and unpublished findings allowed for increased power to generate stratified estimates, and improved representation from lower-income countries. Together, the available data demonstrate the importance of influenza viruses as a cause of severe disease and hospitalizations in younger and older adults worldwide.



PLoS Med: 27 Feb 2021; 18:e1003550 | PMID: 33647033
Abstract

Cardiovascular disease risk profile and management practices in 45 low-income and middle-income countries: A cross-sectional study of nationally representative individual-level survey data.

Peiris D, Ghosh A, Manne-Goehler J, Jaacks LM, ... Davies JI, Geldsetzer P
Background
Global cardiovascular disease (CVD) burden is high and rising, especially in low-income and middle-income countries (LMICs). Focussing on 45 LMICs, we aimed to determine (1) the adult population's median 10-year predicted CVD risk, including its variation within countries by socio-demographic characteristics, and (2) the prevalence of self-reported blood pressure (BP) medication use among those with and without an indication for such medication as per World Health Organization (WHO) guidelines.
Methods and findings
We conducted a cross-sectional analysis of nationally representative household surveys from 45 LMICs carried out between 2005 and 2017, with 32 surveys being WHO Stepwise Approach to Surveillance (STEPS) surveys. Country-specific median 10-year CVD risk was calculated using the 2019 WHO CVD Risk Chart Working Group non-laboratory-based equations. BP medication indications were based on the WHO Package of Essential Noncommunicable Disease Interventions guidelines. Regression models examined associations between CVD risk, BP medication use, and socio-demographic characteristics. Our complete case analysis included 600,484 adults from 45 countries. Median 10-year CVD risk (interquartile range [IQR]) for males and females was 2.7% (2.3%-4.2%) and 1.6% (1.3%-2.1%), respectively, with estimates indicating the lowest risk in sub-Saharan Africa and the highest in Europe and the Eastern Mediterranean. Higher educational attainment and current employment were associated with lower CVD risk in most countries. Of those indicated for BP medication, the median (IQR) percentage taking medication was 24.2% (15.4%-37.2%) for males and 41.6% (23.9%-53.8%) for females. Conversely, a median (IQR) 47.1% (36.1%-58.6%) of all people taking a BP medication had no indication for it based on CVD risk status. There was no association between BP medication use and socio-demographic characteristics in most of the 45 study countries. Study limitations include variation in country survey methods, most notably the sample age range and year of data collection, insufficient data to use the laboratory-based CVD risk equations, and an inability to determine past history of a CVD diagnosis.
Conclusions
This study found underuse of guideline-indicated BP medication in people with elevated CVD risk and overuse by people with lower CVD risk. Country-specific targeted policies are needed to help improve the identification and management of those at highest CVD risk.



PLoS Med: 27 Feb 2021; 18:e1003485 | PMID: 33661979
Abstract

Diagnostic accuracy of cervical cancer screening and screening-triage strategies among women living with HIV-1 in Burkina Faso and South Africa: A cohort study.

Kelly HA, Chikandiwa A, Sawadogo B, Gilham C, ... Mayaud P, HARP Study Group
Background
Cervical cancer screening strategies using visual inspection or cytology may have suboptimal diagnostic accuracy for detection of precancer in women living with HIV (WLHIV). The optimal screen and screen-triage strategy, age to initiate, and frequency of screening for WLHIV remain unclear. This study evaluated the sensitivity, specificity, and positive predictive value of different cervical cancer screening strategies in WLHIV in Africa.
Methods and findings
WLHIV aged 25-50 years attending HIV treatment centres in Burkina Faso (BF) and South Africa (SA) from 5 December 2011 to 30 October 2012 were enrolled in a prospective evaluation study of visual inspection using acetic acid (VIA) or visual inspection using Lugol's iodine (VILI), high-risk human papillomavirus DNA testing (Hybrid Capture 2 [HC2] or careHPV), and cytology for histology-verified high-grade cervical intraepithelial neoplasia (CIN2+/CIN3+) at baseline and endline, a median 16 months later. Among 1,238 women (BF: 615; SA: 623), median age was 36 and 34 years (p < 0.001), 28.6% and 49.6% had ever had prior cervical cancer screening (p < 0.001), and 69.9% and 64.2% were taking ART at enrolment (p = 0.045) in BF and SA, respectively. CIN2+ prevalence was 5.8% and 22.4% in BF and SA (p < 0.001), respectively. VIA had low sensitivity for CIN2+ (44.7%, 95% confidence interval [CI] 36.9%-52.7%) and CIN3+ (56.1%, 95% CI 43.3%-68.3%) in both countries, with specificity for ≤CIN1 of 78.7% (95% CI 76.0%-81.3%). HC2 had sensitivity of 88.8% (95% CI 82.9%-93.2%) for CIN2+ and 86.4% (95% CI 75.7%-93.6%) for CIN3+. Specificity for ≤CIN1 was 55.4% (95% CI 52.2%-58.6%), and screen positivity was 51.3%. Specificity was higher with a restricted genotype (HPV16/18/31/33/35/45/52/58) approach (73.5%, 95% CI 70.6%-76.2%), with lower screen positivity (33.7%), although there was lower sensitivity for CIN3+ (77.3%, 95% CI 65.3%-86.7%). In BF, HC2 was more sensitive for CIN2+/CIN3+ than VIA/VILI (relative sensitivity for CIN2+ = 1.72, 95% CI 1.28-2.32; CIN3+: 1.18, 95% CI 0.94-1.49). Triage of HC2-positive women with VIA/VILI reduced the number of colposcopy referrals, with a loss in sensitivity for CIN2+ (58.1%) but not for CIN3+ (84.6%). In SA, cytology high-grade squamous intraepithelial lesion or greater (HSIL+) had the best combination of sensitivity (CIN2+: 70.1%, 95% CI 61.3%-77.9%; CIN3+: 80.8%, 95% CI 67.5%-90.4%) and specificity (81.6%, 95% CI 77.6%-85.1%).
HC2 had similar sensitivity for CIN3+ (83.0%, 95% CI 70.2%-91.9%) but lower specificity compared to HSIL+ (42.7%, 95% CI 38.4%-47.1%; relative specificity = 0.57, 95% CI 0.52-0.63), resulting in almost twice as many referrals. Compared to HC2, triage of HC2-positive women with HSIL+ resulted in a 40% reduction in colposcopy referrals but was associated with some loss in sensitivity. CIN2+ incidence over a median 16 months was highest among VIA baseline screen-negative women (2.2%, 95% CI 1.3%-3.7%) and women who were baseline double-negative with HC2 and VIA (2.1%, 95% CI 1.3%-3.5%) and lowest among HC2 baseline screen-negative women (0.5%, 95% CI 0.1%-1.8%). Limitations of our study are that WLHIV included in the study may not reflect a contemporary cohort of WLHIV initiating ART in the universal ART era and that we did not evaluate HPV tests available in study settings today.
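The accuracy measures used throughout this abstract (sensitivity, specificity, positive predictive value, and their relative versions) derive from standard 2×2 counts of screening result against histology-verified disease. A minimal sketch with made-up counts, not the HARP study data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, and PPV from 2x2 screening counts
    (true/false positives and negatives vs. verified disease)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Made-up counts for illustration (screen result vs. verified CIN2+).
sens_a, spec_a, ppv_a = diagnostic_accuracy(tp=89, fp=45, fn=11, tn=55)
sens_b, spec_b, ppv_b = diagnostic_accuracy(tp=45, fp=20, fn=55, tn=80)

# Relative sensitivity of test A vs. test B is the ratio of the two
# sensitivities estimated in the same participants.
relative_sensitivity = sens_a / sens_b
```

The trade-off the abstract describes (HC2's high sensitivity but low specificity versus visual inspection) is exactly this pair of quantities moving in opposite directions as the positivity threshold loosens.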
Conclusions
In this cohort study among WLHIV in Africa, a human papillomavirus (HPV) test targeting 14 high-risk (HR) types had higher sensitivity to detect CIN2+ compared to visual inspection but had low specificity, although a restricted genotype approach targeting 8 HR types decreased the number of unnecessary colposcopy referrals. Cytology HSIL+ had optimal performance for CIN2+/CIN3+ detection in SA. Triage of HPV-positive women with HSIL+ maintained high specificity but with some loss in sensitivity compared to HC2 alone.



PLoS Med: 27 Feb 2021; 18:e1003528 | PMID: 33661957
Abstract

Applicability and cost-effectiveness of the Systolic Blood Pressure Intervention Trial (SPRINT) in the Chinese population: A cost-effectiveness modeling study.

Li C, Chen K, Cornelius V, Tomeny E, ... Wang D, Chen T
Background
The Systolic Blood Pressure Intervention Trial (SPRINT) showed significant reductions in death and cardiovascular disease (CVD) risk with a systolic blood pressure (SBP) goal of <120 mm Hg compared with a SBP goal of <140 mm Hg. Our study aimed to assess the applicability of SPRINT to Chinese adults. Additionally, we sought to predict the medical and economic implications of this intensive SBP treatment among those meeting SPRINT eligibility.
Methods and findings
We used nationally representative baseline data from the China Health and Retirement Longitudinal Study (CHARLS) (2011-2012) to estimate the prevalence and number of Chinese adults aged 45 years and older who meet SPRINT criteria. A validated microsimulation model was employed to project costs, clinical outcomes, and quality-adjusted life-years (QALYs) among SPRINT-eligible adults under 2 alternative treatment strategies (SBP goal of <120 mm Hg [intensive treatment] and SBP goal of <140 mm Hg [standard treatment]). Overall, 22.2% met the SPRINT criteria, representing 116.2 (95% CI 107.5 to 124.8) million people in China. Of these, 66.4%, representing 77.2 (95% CI 69.3 to 85.0) million, were not being treated for hypertension, and 22.9%, representing 26.6 (95% CI 22.4 to 30.7) million, had a SBP between 130 and 139 mm Hg yet were not taking antihypertensive medication. We estimated that over 5 years, compared to standard treatment, intensive treatment would reduce heart failure incidence by 0.84 (95% CI 0.42 to 1.25) million cases, reduce CVD deaths by 2.03 (95% CI 1.44 to 2.63) million, and save 3.84 (95% CI 1.53 to 6.34) million life-years. Estimated reductions of 0.069 (95% CI -0.28 to 0.42) million myocardial infarction cases and 0.36 (95% CI -0.10 to 0.82) million stroke cases were not statistically significant. Furthermore, over a lifetime, moving from standard to intensive treatment increased the mean QALYs from 9.51 to 9.87 (an increment of 0.38 [95% CI 0.13 to 0.71]), at a cost of Int$10,997 per QALY gained. Of all 1-way sensitivity analyses, high antihypertensive drug cost and lower treatment efficacy for CVD death gave the 2 most unfavorable results (Int$25,291 and Int$18,995 per QALY gained, respectively).
Simulation results indicated that intensive treatment could be cost-effective (82.8% probability of being below the willingness-to-pay threshold of Int$16,782 [1× GDP per capita in China in 2017]), with a lower probability in people with SBP 130-139 mm Hg (72.9%) but a higher probability among females (91.2%). Main limitations include lack of specific SPRINT eligibility information in the CHARLS survey, uncertainty about the implications of different blood pressure measurement techniques, the use of several sources of data with heavy reliance on findings from SPRINT, limited information about the serious adverse event rate, and lack of information and evidence for medication effectiveness on renal disease.
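The cost-effectiveness comparisons above rest on the incremental cost-effectiveness ratio (ICER): the extra cost of the intensive strategy divided by the extra QALYs it yields, judged against a willingness-to-pay threshold (here 1× GDP per capita). A minimal sketch with illustrative per-person inputs, not the published model's values:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY
    of the new strategy relative to the comparator."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Willingness-to-pay threshold used in the study:
# ~1x GDP per capita in China in 2017, in international dollars.
WTP_THRESHOLD = 16_782

# Illustrative per-person lifetime costs and QALYs (assumed numbers).
ratio = icer(cost_new=14_000, cost_old=10_000, qaly_new=9.87, qaly_old=9.51)
cost_effective = ratio <= WTP_THRESHOLD
```

A strategy is conventionally deemed cost-effective when its ICER falls below the chosen threshold, which is why the abstract reports the probability of being below Int$16,782 per QALY.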
Conclusions
Although adoption of the SPRINT treatment strategy would increase the number of Chinese adults requiring SBP treatment intensification, this approach has the potential to prevent CVD events, to produce gains in life-years, and to be cost-effective under common thresholds.



PLoS Med: 27 Feb 2021; 18:e1003515 | PMID: 33661907
Abstract

Probabilistic seasonal dengue forecasting in Vietnam: A modelling study using superensembles.

Colón-González FJ, Soares Bastos L, Hofmann B, Hopkin A, ... Brady OJ, Lowe R
Background
With enough advance notice, dengue outbreaks can be mitigated. Because dengue is a climate-sensitive disease, environmental conditions and past patterns of dengue can be used to make predictions about future outbreak risk. These predictions improve public health planning and decision-making to ultimately reduce the burden of disease. Past approaches to dengue forecasting have used seasonal climate forecasts, but the predictive ability of a system using different lead times in a year-round prediction system has seldom been explored. Moreover, the transition from theoretical to operational systems integrated with disease control activities is rare.
Methods and findings
We introduce an operational seasonal dengue forecasting system for Vietnam in which Earth observations, seasonal climate forecasts, and lagged dengue cases drive a superensemble of probabilistic dengue models to predict dengue risk up to 6 months ahead. Bayesian spatiotemporal models were fit to 19 years (2002-2020) of dengue data at the province level across Vietnam. A superensemble of these models then makes probabilistic predictions of dengue incidence at various future time points aligned with key Vietnamese decision and planning deadlines. We demonstrate that the superensemble generates more accurate predictions of dengue incidence than the individual models it incorporates across a suite of time horizons and transmission settings. Using historical data, the superensemble made slightly more accurate predictions (continuous ranked probability score [CRPS] = 66.8, 95% CI 60.6-148.0) than a baseline model which forecasts the same incidence rate every month (CRPS = 79.4, 95% CI 78.5-80.5) at lead times of 1 to 3 months, albeit with larger uncertainty. The outbreak detection capability of the superensemble (69%) was considerably higher than that of the baseline model (54.5%). Predictions were most accurate in southern Vietnam, an area that experiences semi-regular seasonal dengue transmission. The system also demonstrated added value across multiple areas compared to the previous practice of not using a forecast. We use the system to make a prospective prediction for dengue incidence in Vietnam for the period May to October 2020. Prospective predictions made with the superensemble were slightly more accurate (CRPS = 110, 95% CI 102-575) than those made with the baseline model (CRPS = 125, 95% CI 120-168) but had larger uncertainty. Finally, we propose a framework for the evaluation of probabilistic predictions.
Despite the demonstrated value of our forecasting system, the approach is limited by the consistency of the dengue case data, as well as the lack of publicly available, continuous, and long-term data sets on mosquito control efforts and serotype-specific case data.
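The CRPS used above generalizes absolute error to probabilistic forecasts; for an ensemble it can be estimated as the mean absolute error of the members against the observation minus half the mean absolute pairwise difference between members. A minimal sketch with made-up numbers, not the Vietnam forecasts:

```python
def crps_ensemble(members, observed):
    """Sample-based CRPS estimate for an ensemble forecast:
    E|X - y| - 0.5 * E|X - X'|  (lower is better)."""
    n = len(members)
    term1 = sum(abs(x - observed) for x in members) / n
    term2 = sum(abs(x - y) for x in members for y in members) / (n * n)
    return term1 - 0.5 * term2

# Illustrative: a well-centred ensemble scores lower (better) than a
# flat baseline that always forecasts the same incidence.
ensemble = [90, 100, 110, 105]
baseline = [150, 150, 150, 150]
obs = 104

score_ens = crps_ensemble(ensemble, obs)
score_base = crps_ensemble(baseline, obs)
```

Because CRPS rewards both calibration and sharpness, a superensemble can beat its component models even when no single member is individually best.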
Conclusions
This study shows that by combining detailed Earth observation data, seasonal climate forecasts, and state-of-the-art models, dengue outbreaks can be predicted across a broad range of settings, with enough lead time to meaningfully inform dengue control. While our system omits some important variables not currently available at a subnational scale, the majority of past outbreaks could be predicted up to 3 months ahead. Over the next 2 years, the system will be prospectively evaluated and, if successful, potentially extended to other areas and other climate-sensitive disease systems.



PLoS Med: 27 Feb 2021; 18:e1003542 | PMID: 33661904
Abstract

Parental death in childhood and pathways to increased mortality across the life course in Stockholm, Sweden: A cohort study.

Hiyoshi A, Berg L, Grotta A, Almquist Y, Rostila M
Background
Previous studies have shown that the experience of parental death during childhood is associated with increased mortality risk. However, few studies have examined potential pathways that may explain these findings. The aim of this study is to examine whether familial and behavioural factors during adolescence and socioeconomic disadvantages in early adulthood mediate the association between loss of a parent at age 0 to 12 and all-cause mortality by the age of 63.
Methods and findings
A cohort study was conducted using data from the Stockholm Birth Cohort Multigenerational Study for 12,615 children born in 1953, with information covering 1953 to 2016. Familial and behavioural factors at age 13 to 19 included psychiatric and alcohol problems in the surviving parent, receipt of social assistance, and delinquent behaviour in the offspring. Socioeconomic disadvantage in early adulthood included educational attainment, occupational social class, and income at age 27 to 37. We used Cox proportional hazards regression models, combined with a multimediator analysis, to separate direct and indirect effects of parental death on all-cause mortality. Among the 12,582 offspring in the study (men 51%; women 49%), about 3% experienced the death of a parent in childhood. During follow-up from the age of 38 to 63, there were 935 deaths among offspring. Parental death was associated with an elevated risk of mortality after adjusting for demographic and household socioeconomic characteristics at birth (hazard ratio [HR]: 1.52 [95% confidence interval: 1.10 to 2.08, p = 0.010]). Delinquent behaviour in adolescence and income during early adulthood were the most influential mediators, and the indirect associations through these variables were HR 1.03 (95% CI 1.00 to 1.06, p = 0.029) and HR 1.04 (95% CI 1.01 to 1.07, p = 0.029), respectively. After accounting for these indirect paths, the direct path was attenuated to HR 1.35 (95% CI 0.98 to 1.85, p = 0.066). The limitations of the study include that the associations may be partly due to genetic, social, and behavioural residual confounding, that statistical power was low in some of the subgroup analyses, and that there might be other relevant paths that were not investigated in the present study.
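In a multimediator decomposition like the one above, the total hazard ratio factors multiplicatively (i.e., additively on the log-HR scale) into a direct path and mediator-specific indirect paths. A minimal sketch using the rounded HRs reported above; because of rounding and covariate adjustment, the product need not reproduce the adjusted total HR of 1.52 exactly:

```python
def decompose_total_hr(direct_hr, indirect_hrs):
    """Multiplicative decomposition of a total hazard ratio:
    total = direct * product(indirect), i.e. log-HRs add."""
    total = direct_hr
    for hr in indirect_hrs:
        total *= hr
    return total

# Direct path HR 1.35; indirect paths via delinquent behaviour (1.03)
# and early-adulthood income (1.04), as reported in the abstract.
total = decompose_total_hr(1.35, [1.03, 1.04])
```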
Conclusions
Our findings from this cohort study suggest that childhood parental death is associated with increased mortality and that the association was mediated through a chain of disadvantages over the life course including delinquency in adolescence and lower income during early adulthood. Professionals working with bereaved children should take the higher mortality risk in bereaved offspring into account and consider its lifelong consequences. When planning and providing support to bereaved children, it may be particularly important to be aware of their increased susceptibility to delinquency and socioeconomic vulnerability that eventually lead to higher mortality.



PLoS Med: 27 Feb 2021; 18:e1003549 | PMID: 33705393
Abstract

Human papillomavirus vaccination for adults aged 30 to 45 years in the United States: A cost-effectiveness analysis.

Kim JJ, Simms KT, Killen J, Smith MA, ... Regan C, Canfell K
Background
A nonavalent human papillomavirus (HPV) vaccine has been licensed for use in women and men up to age 45 years in the United States. The cost-effectiveness of HPV vaccination for women and men aged 30 to 45 years in the context of cervical cancer screening practice was evaluated to inform national guidelines.
Methods and findings
We utilized 2 independent HPV microsimulation models to evaluate the cost-effectiveness of extending the upper age limit of HPV vaccination in women (from age 26 years) and men (from age 21 years) up to age 30, 35, 40, or 45 years. The models were empirically calibrated to reflect the burden of HPV and related cancers in the US population and used standardized inputs regarding historical and future vaccination uptake, vaccine efficacy, cervical cancer screening, and costs. Disease outcomes included cervical, anal, oropharyngeal, vulvar, vaginal, and penile cancers, as well as genital warts. Both models projected higher costs and greater health benefits as the upper age limit of HPV vaccination increased. Strategies of vaccinating females and males up to ages 30, 35, and 40 years were found to be less cost-effective than vaccinating up to age 45 years, which had an incremental cost-effectiveness ratio (ICER) greater than a commonly accepted upper threshold of $200,000 per quality-adjusted life year (QALY) gained. When including all HPV-related outcomes, the ICER for vaccinating up to age 45 years ranged from $315,700 to $440,600 per QALY gained. Assumptions regarding cervical screening compliance, vaccine costs, and the natural history of noncervical HPV-related cancers had major impacts on the cost-effectiveness of the vaccination strategies. Key limitations of the study were related to uncertainties in the data used to inform the models, including the timing of vaccine impact on noncervical cancers and vaccine efficacy at older ages.
Conclusions
Our results from 2 independent models suggest that HPV vaccination for adult women and men aged 30 to 45 years is unlikely to represent good value for money in the US.



PLoS Med: 27 Feb 2021; 18:e1003534 | PMID: 33705382
Abstract

Factors associated with suicide risk among Chinese adults: A prospective cohort study of 0.5 million individuals.

Yu R, Chen Y, Li L, Chen J, ... Chen Z, Fazel S
Background
Suicide is a leading cause of death in China and accounts for about one-sixth of all suicides worldwide. The objective of this study was to examine the recent distribution of suicide and risk factors for death by suicide. Identifying underlying risk factors could benefit development of evidence-based prevention and intervention programs.
Methods and findings
We conducted a prospective study, the China Kadoorie Biobank, of 512,715 individuals (41% men, mean age 52 years) from 10 areas (5 urban, 5 rural) that are diverse in geographic location, socioeconomic development, and prevalence of disease patterns across China. After the baseline measurements of risk factors during 2004 to 2008, participants were followed up for suicide outcomes, including suicide and possible suicide deaths. Risk factors, such as sociodemographic factors and physical and mental health status, were assessed by semistructured interviews and self-report questionnaires. Suicide and possible suicide deaths were identified through linkage to the local death registries using ICD-10 codes. We conducted Cox regression to calculate hazard ratios (HRs) for suicide and, in sensitivity analyses, for possible suicide. During an average follow-up period of 9.9 years, 520 (101 per 100,000) people died from suicide (51.3% male), and 79.8% of them lived in rural areas. Sociodemographic factors associated with increased suicide risk were male gender (adjusted hazard ratio [aHR] = 1.6 [95% CI 1.4 to 2.0], p < 0.001), older age (1.3 [1.2 to 1.5] per 10-year increase, p < 0.001), rural residence (2.6 [2.1 to 3.3], p < 0.001), and single status (1.7 [1.4 to 2.2], p < 0.001). Increased hazards were found for family-related stressful life events (aHR = 1.8 [1.2 to 1.9], p < 0.001) and for major physical illnesses (1.5 [1.3 to 1.9], p < 0.001). There were strong associations of suicide with a history of lifetime mental disorders (aHR = 9.6 [5.9 to 15.6], p < 0.001) and lifetime schizophrenia-spectrum disorders (11.0 [7.1 to 17.0], p < 0.001). Links between suicide risk and depressive disorders (aHR = 2.6 [1.4 to 4.8], p = 0.002) and generalized anxiety disorders (2.6 [1.0 to 7.1], p = 0.056) in the last 12 months, and sleep disorders (1.4 [1.2 to 1.7], p < 0.001) in the past month, were also found.
All HRs were adjusted for sociodemographic factors including gender, age, residence, single status, education, and income. The associations with possible suicide deaths were mostly similar to those with suicide deaths, although there was no clear link between possible suicide deaths and psychiatric factors such as depression and generalized anxiety disorders. A limitation of the study is that there is likely underreporting of mental disorders due to the use of self-report information for some diagnostic categories.
Conclusions
In this study, we observed that a range of sociodemographic and lifestyle factors, stressful life events, and physical and mental health conditions were associated with suicide in China. High-risk groups identified were elderly men in rural settings and individuals with mental disorders. These findings could form the basis of targeted approaches to reduce suicide mortality in China.



PLoS Med: 27 Feb 2021; 18:e1003545 | PMID: 33705376
Abstract

The potential shared role of inflammation in insulin resistance and schizophrenia: A bidirectional two-sample mendelian randomization study.

Perry BI, Burgess S, Jones HJ, Zammit S, ... Jones PB, Khandaker GM
Background
Insulin resistance predisposes to cardiometabolic disorders, which are commonly comorbid with schizophrenia and are key contributors to the significant excess mortality in schizophrenia. Mechanisms for the comorbidity remain unclear, but observational studies have implicated inflammation in both schizophrenia and cardiometabolic disorders separately. We aimed to examine whether there is genetic evidence that insulin resistance and 7 related cardiometabolic traits may be causally associated with schizophrenia, and whether evidence supports inflammation as a common mechanism for cardiometabolic disorders and schizophrenia.
Methods and findings
We used summary data from genome-wide association studies of mostly European adults from large consortia (Meta-Analyses of Glucose and Insulin-related traits Consortium (MAGIC) featuring up to 108,557 participants; Diabetes Genetics Replication And Meta-analysis (DIAGRAM) featuring up to 435,387 participants; Global Lipids Genetics Consortium (GLGC) featuring up to 173,082 participants; Genetic Investigation of Anthropometric Traits (GIANT) featuring up to 339,224 participants; Psychiatric Genomics Consortium (PGC) featuring up to 105,318 participants; and Cohorts for Heart and Aging Research in Genomic Epidemiology (CHARGE) consortium featuring up to 204,402 participants). We conducted two-sample uni- and multivariable mendelian randomization (MR) analyses to test whether (i) 10 cardiometabolic traits (fasting insulin, high-density lipoprotein, and triglycerides representing an insulin resistance phenotype, and 7 related cardiometabolic traits: low-density lipoprotein, fasting plasma glucose, glycated haemoglobin, leptin, body mass index, glucose tolerance, and type 2 diabetes) could be causally associated with schizophrenia; and (ii) inflammation could be a shared mechanism for these phenotypes. We conducted a detailed set of sensitivity analyses to test the assumptions for a valid MR analysis. We did not find statistically significant evidence in support of a causal relationship between cardiometabolic traits and schizophrenia, or vice versa. However, we report that a genetically predicted inflammation-related insulin resistance phenotype (raised fasting insulin (Wald ratio OR = 2.95, 95% CI 1.38-6.34, Holm-Bonferroni corrected p = 0.035) and lower high-density lipoprotein (Wald ratio OR = 0.55, 95% CI 0.36-0.84; p = 0.035)) was associated with schizophrenia.
Evidence for these associations attenuated to the null in multivariable MR analyses after adjusting for C-reactive protein, an archetypal inflammatory marker (fasting insulin: Wald ratio OR = 1.02, 95% CI 0.37-2.78, p = 0.975; high-density lipoprotein: Wald ratio OR = 1.00, 95% CI 0.85-1.16, p = 0.849), suggesting that the associations could be fully explained by inflammation. One potential limitation of the study is that the full range of gene products from the genetic variants we used as proxies for the exposures is unknown, and so we are unable to comment on potential biological mechanisms of association other than inflammation, which may also be relevant.
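The Wald ratio estimates quoted above divide the variant–outcome association by the variant–exposure association; on the log-odds scale the implied causal odds ratio is exp(beta_outcome / beta_exposure). A minimal sketch with illustrative betas, not the published consortium summary statistics:

```python
import math

def wald_ratio_or(beta_outcome, beta_exposure):
    """Two-sample MR Wald ratio for a single genetic variant:
    causal log-odds per unit of exposure = beta_outcome / beta_exposure,
    reported here as an odds ratio."""
    return math.exp(beta_outcome / beta_exposure)

# Illustrative summary statistics (assumed values): per-allele log-odds
# of schizophrenia and per-allele effect on fasting insulin.
or_estimate = wald_ratio_or(beta_outcome=0.054, beta_exposure=0.05)
```

In the multivariable analysis the same ratio is re-estimated with the inflammatory marker's genetic effects included as an extra exposure, which is what attenuated the estimates here toward the null.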
Conclusions
Our findings support a role for inflammation as a common cause for insulin resistance and schizophrenia, which may at least partly explain why the traits commonly co-occur in clinical practice. Inflammation and immune pathways may represent novel therapeutic targets for the prevention or treatment of schizophrenia and comorbid insulin resistance. Future work is needed to understand how inflammation may contribute to the risk of schizophrenia and insulin resistance.



Perry BI, Burgess S, Jones HJ, Zammit S, ... Jones PB, Khandaker GM
PLoS Med: 27 Feb 2021; 18:e1003455 | PMID: 33711016
Abstract

Association of genetic liability to smoking initiation with e-cigarette use in young adults: A cohort study.

Khouja JN, Wootton RE, Taylor AE, Davey Smith G, Munafò MR
Background
Tobacco smoking and e-cigarette use are strongly associated, but it is currently unclear whether this association is causal, or due to shared factors that influence both behaviours such as a shared genetic liability. The aim of this study was to investigate whether polygenic risk scores (PRS) for smoking initiation are associated with ever use of e-cigarettes.
Methods and findings
Smoking initiation PRS were calculated for young adults (N = 7,859, mean age = 24 years, 51% male) of European ancestry in the Avon Longitudinal Study of Parents and Children, a prospective birth cohort study initiated in 1991. PRS were calculated using the GWAS & Sequencing Consortium of Alcohol and Nicotine use (GSCAN) summary statistics. Five thresholds ranging from 5 × 10⁻⁸ to 0.5 were used to calculate 5 PRS for each individual. Using logistic regression, we investigated the association between smoking initiation PRS and the main outcome, self-reported e-cigarette use (n = 2,894, measured between 2016 and 2017), as well as self-reported smoking initiation and 8 negative control outcomes (socioeconomic position at birth, externalising disorders in childhood, and risk-taking in young adulthood). A total of 878 young adults (30%) had ever used e-cigarettes at 24 years, and 150 (5%) were regular e-cigarette users at 24 years. We observed positive associations of similar magnitude between smoking initiation PRS (created using the p < 5 × 10⁻⁸ threshold) and both smoking initiation (odds ratio (OR) = 1.29, 95% CI 1.19 to 1.39, p < 0.001) and ever e-cigarette use (OR = 1.24, 95% CI 1.14 to 1.34, p < 0.001) by the age of 24 years, indicating that a genetic predisposition to smoking initiation is associated with an increased risk of using e-cigarettes. At lower p-value thresholds, we observed an association between smoking initiation PRS and ever e-cigarette use among never smokers. We also found evidence of associations between smoking initiation PRS and some negative control outcomes, particularly when less stringent p-value thresholds were used to create the PRS, but also at the strictest threshold (e.g., gambling, number of sexual partners, conduct disorder at 7 years, and parental socioeconomic position at birth). However, this study is limited by the relatively small sample size and potential for collider bias.
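A polygenic risk score at a given p-value threshold is essentially a weighted sum: each variant's effect-allele dosage multiplied by its GWAS effect size, summed over variants passing the threshold. A toy sketch of this idea (the data structures are illustrative, not the GSCAN/ALSPAC pipeline, which also involves clumping of correlated variants):

```python
def polygenic_score(dosages, weights, pvalues, threshold):
    """Weighted allele count over variants passing a GWAS p-value threshold.

    dosages:  {snp_id: effect-allele dosage (0-2)} for one individual
    weights:  {snp_id: GWAS effect size (beta)}
    pvalues:  {snp_id: GWAS p-value}
    """
    return sum(dosages[snp] * beta
               for snp, beta in weights.items()
               if snp in dosages and pvalues[snp] < threshold)
```

A strict threshold such as 5 × 10⁻⁸ keeps only genome-wide-significant hits; relaxing it toward 0.5 folds in many weakly associated variants, which is why the abstract reports results across five thresholds.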
Conclusions
Our results indicate that there may be a shared genetic aetiology between smoking and e-cigarette use, as well as with socioeconomic position, externalising disorders in childhood, and risky behaviour more generally, suggesting a common genetic vulnerability to both smoking and e-cigarette use that may reflect a broad risk-taking phenotype.



Khouja JN, Wootton RE, Taylor AE, Davey Smith G, Munafò MR
PLoS Med: 27 Feb 2021; 18:e1003555 | PMID: 33735204
Abstract

Variation in HIV care and treatment outcomes by facility in South Africa, 2011-2015: A cohort study.

Bor J, Gage A, Onoya D, Maskew M, ... Mlisana K, MacLeod W
Background
Despite widespread availability of HIV treatment, patient outcomes differ across facilities. We propose and evaluate an approach to measure quality of HIV care at health facilities in South Africa's national HIV program using routine laboratory data.
Methods and findings
Data were extracted from South Africa's National Health Laboratory Service (NHLS) Corporate Data Warehouse. All CD4 counts, viral loads (VLs), and other laboratory tests used in HIV monitoring were linked, creating a validated patient identifier. We constructed longitudinal HIV care cascades for all patients in the national HIV program, excluding data from the Western Cape and very small facilities. We then estimated for each facility in each year (2011 to 2015) the following cascade measures identified a priori as reflecting quality of HIV care: median CD4 count among new patients; retention 12 months after presentation; 12-month retention among patients established in care; viral suppression; CD4 recovery; and monitoring after an elevated VL. We used factor analysis to identify an underlying measure of quality of care, and we assessed the persistence of this quality measure over time. We then assessed spatiotemporal variation and facility and population predictors in a multivariable regression context. We analyzed data on 3,265 facilities with a median (IQR) annual size of 441 (189 to 988) lab-monitored HIV patients. Retention 12 months after presentation increased from 42% to 47% during the study period, and viral suppression increased from 66% to 79%, although there was substantial variability across facilities. We identified an underlying measure of quality of HIV care that correlated with all cascade measures except median CD4 count at presentation. Averaging across the 5 years of data, this quality score attained a reliability of 0.84. Quality was higher for clinics (versus hospitals), in rural (versus urban) areas, and for larger facilities. Quality was lower in high-poverty areas but was not independently associated with percent Black. Quality increased by 0.49 (95% CI 0.46 to 0.53) standard deviations from 2011 to 2015, and there was evidence of geospatial autocorrelation (p < 0.001).
The study's limitations include an inability to fully adjust for underlying patient risk, reliance on laboratory data, which do not capture all relevant domains of quality, the potential for errors in record linkage, and the omission of the Western Cape.
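The reliability of 0.84 for the 5-year average quality score illustrates a general point: averaging a noisy yearly measure over k years raises its reliability. One standard way to quantify this is the Spearman-Brown prophecy formula; we do not know the paper's exact estimator, so the sketch below is illustrative only:

```python
def spearman_brown(r_single, k):
    """Reliability of the mean of k parallel measurements
    whose single-measurement reliability is r_single."""
    return k * r_single / (1 + (k - 1) * r_single)
```

Under this formula, a single-year reliability of about 0.51 would yield `spearman_brown(0.51, 5)` of roughly 0.84, consistent with the 5-year figure reported above.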
Conclusions
We observed persistent differences in HIV care and treatment outcomes across South African facilities. Targeting low-performing facilities for additional support could reduce overall burden of disease.



Bor J, Gage A, Onoya D, Maskew M, ... Mlisana K, MacLeod W
PLoS Med: 27 Feb 2021; 18:e1003479 | PMID: 33789340
