Journal: PLoS Med

Abstract

Selective serotonin reuptake inhibitors, and serotonin and norepinephrine reuptake inhibitors for anxiety, obsessive-compulsive, and stress disorders: A 3-level network meta-analysis.

Gosmann NP, Costa MA, Jaeger MB, Motta LS, ... Pine DS, Salum GA
Background
Anxiety, obsessive-compulsive, and stress-related disorders frequently co-occur, and patients often present with symptoms in several domains. Treatment involves the use of selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs), but data on comparative efficacy and acceptability are lacking. We aimed to compare the efficacy of SSRIs, SNRIs, and placebo in multiple symptom domains in patients with these diagnoses across the lifespan, using a 3-level network meta-analysis.
Methods and findings
We searched MEDLINE, PsycINFO, Embase, and the Cochrane Library from inception to 23 April 2015, with an update on 11 November 2020, for published and unpublished randomized controlled trials that aimed to assess the efficacy of SSRIs or SNRIs in participants (adults and children) with a diagnosis of any anxiety, obsessive-compulsive, or stress-related disorder. We supplemented electronic database searches with manual searches for published and unpublished randomized controlled trials registered in publicly accessible clinical trial registries and pharmaceutical companies' databases. No restriction was made regarding comorbidity with any other mental disorder, participants' age and sex, blinding of participants and researchers, date of publication, or study language. The primary outcome was the aggregate measure of internalizing symptoms of these disorders. Secondary outcomes included specific symptom domains and treatment discontinuation rate. We estimated standardized mean differences (SMDs) with a 3-level network meta-analysis with random slopes by study for medication and assessment instrument. Risk of bias was appraised using the Cochrane Collaboration's risk of bias tool. This study was registered in PROSPERO (CRD42017069090). We analyzed 469 outcome measures from 135 studies (n = 30,245). Medication (SSRI or SNRI) was more effective than placebo for the aggregate measure of internalizing symptoms (SMD -0.56, 95% CI -0.62 to -0.51, p < 0.001), for all symptom domains, and in patients from all diagnostic categories. We also found significant results when restricting analyses to the most used assessment instrument for each diagnosis; nevertheless, this restriction excluded 72.71% of outcome measures. Pairwise comparisons revealed only small differences between medications in efficacy and acceptability. Limitations include the moderate heterogeneity found in most outcomes and the moderate risk of bias identified in most of the trials.
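The pooled effect above is reported as a standardized mean difference. As a minimal illustration of the building block behind such a pooled estimate (the full 3-level network model is far more involved), here is a sketch of a single trial's SMD computed from hypothetical arm summaries; none of these numbers come from the paper.

```python
import math

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference (Cohen's d) using the pooled SD.
    Negative values favor medication when lower scores mean fewer symptoms."""
    pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# Hypothetical arm summaries for a single trial (not taken from the paper):
d = smd(mean_t=14.0, sd_t=6.0, n_t=100, mean_c=17.5, sd_c=6.5, n_c=100)
print(round(d, 2))  # → -0.56 with these made-up numbers
```

In the paper, many such per-instrument effect sizes are then pooled in a 3-level model that nests assessment instruments within studies, which is what lets all 469 outcome measures contribute.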
Conclusions
In this study, we observed that SSRIs and SNRIs were effective for multiple symptom domains, and in patients from all included diagnostic categories. We found minimal differences between medications concerning efficacy and acceptability. This 3-level network meta-analysis contributes robust evidence to the ongoing discussion about the true benefit of antidepressants, with a significantly larger quantity of data and higher statistical power than previous studies. The 3-level approach allowed us to properly assess the efficacy of these medications on internalizing psychopathology, avoiding potential biases related to the exclusion of information due to distinct assessment instruments, and to explore the multilevel structure of transdiagnostic efficacy.



PLoS Med: 09 Jun 2021; 18:e1003664 | PMID: 34111122
Abstract

Association between exercise habits and stroke, heart failure, and mortality in Korean patients with incident atrial fibrillation: A nationwide population-based cohort study.

Ahn HJ, Lee SR, Choi EK, Han KD, ... Oh S, Lip GYH
Background
There is a paucity of information about cardiovascular outcomes related to exercise habit change after a new diagnosis of atrial fibrillation (AF). We investigated the association between exercise habits after a new AF diagnosis and ischemic stroke, heart failure (HF), and all-cause death.
Methods and findings
This is a nationwide population-based cohort study using data from the Korea National Health Insurance Service. A retrospective analysis was performed for 66,692 patients with newly diagnosed AF between 2010 and 2016 who underwent 2 serial health examinations within 2 years before and after their AF diagnosis. Individuals were divided into 4 categories according to performance of regular exercise, which was investigated by a self-reported questionnaire in each health examination, before and after their AF diagnosis: persistent non-exercisers (30.5%), new exercisers (17.8%), exercise dropouts (17.4%), and exercise maintainers (34.2%). The primary outcomes were incidence of ischemic stroke, HF, and all-cause death. Differences in baseline characteristics among groups were balanced considering demographics, comorbidities, medications, lifestyle behaviors, and income status. The risks of the outcomes were computed by weighted Cox proportional hazards models with inverse probability of treatment weighting (IPTW) during a mean follow-up of 3.4 ± 2.0 years. The new exerciser and exercise maintainer groups were associated with a lower risk of HF compared to the persistent non-exerciser group: the hazard ratios (HRs) (95% CIs) were 0.95 (0.90-0.99) and 0.92 (0.88-0.96), respectively (p < 0.001). Also, performing exercise any time before or after AF diagnosis was associated with a lower risk of mortality compared to persistent non-exercising: the HR (95% CI) was 0.82 (0.73-0.91) for new exercisers, 0.83 (0.74-0.93) for exercise dropouts, and 0.61 (0.55-0.67) for exercise maintainers (p < 0.001). For ischemic stroke, the estimates of HRs were 10%-14% lower in the exercise groups, but the differences were not statistically significant (p = 0.057). Energy expenditure of 1,000-1,499 MET-min/wk (regular moderate exercise 170-240 min/wk) was consistently associated with a lower risk of each outcome based on a subgroup analysis of the new exerciser group.
Study limitations include recall bias introduced due to the nature of the self-reported questionnaire and restricted external generalizability to other ethnic groups.
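The weighted Cox models above rely on inverse probability of treatment weighting to balance the arms. A toy sketch of how stabilized IPT weights remove confounding on a single covariate; the age/exercise relationship here is invented for illustration (not the Korean registry data), and the true propensity model is reused as the score, where a real analysis would estimate it.

```python
import math
import random

random.seed(0)
n = 20000
ages = [random.gauss(60, 10) for _ in range(n)]
# Assumed treatment model: exercise uptake declines with age (a confounder).
ps = [1 / (1 + math.exp(0.05 * (a - 60))) for a in ages]
exercised = [random.random() < p for p in ps]

p_marg = sum(exercised) / n
# Stabilized weights: P(T)/e(x) for the treated, (1-P(T))/(1-e(x)) otherwise.
w = [p_marg / e if t else (1 - p_marg) / (1 - e) for t, e in zip(exercised, ps)]

def mean(xs):
    return sum(xs) / len(xs)

def wmean(xs, ws):
    return sum(x * v for x, v in zip(xs, ws)) / sum(ws)

t_idx = [i for i, t in enumerate(exercised) if t]
c_idx = [i for i, t in enumerate(exercised) if not t]
raw_gap = mean([ages[i] for i in t_idx]) - mean([ages[i] for i in c_idx])
iptw_gap = wmean([ages[i] for i in t_idx], [w[i] for i in t_idx]) - \
           wmean([ages[i] for i in c_idx], [w[i] for i in c_idx])
print(round(raw_gap, 2), round(iptw_gap, 2))  # weighting shrinks the age gap
```

In the weighted pseudo-population the exercising and non-exercising arms have nearly identical age distributions, which is the property the study's weighted Cox models exploit across all measured confounders.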
Conclusions
Initiating or continuing regular exercise after AF diagnosis was associated with lower risks of HF and mortality. The promotion of exercise might reduce the future risk of adverse outcomes in patients with AF.



PLoS Med: 07 Jun 2021; 18:e1003659 | PMID: 34101730
Abstract

Optimal protamine dosing after cardiopulmonary bypass: The PRODOSE adaptive randomised controlled trial.

Miles LF, Burt C, Arrowsmith J, McKie MA, ... De Silva R, Falter F
Background
The dose of protamine required following cardiopulmonary bypass (CPB) is often determined by the dose of heparin required pre-CPB, expressed as a fixed ratio. Dosing based on mathematical models of heparin clearance is postulated to improve protamine dosing precision and coagulation. We hypothesised that protamine dosing based on a 2-compartment model would improve thromboelastography (TEG) parameters and reduce the dose of protamine administered, relative to a fixed ratio.
Methods and findings
We undertook a 2-stage, adaptive randomised controlled trial, allocating 228 participants to receive protamine dosed according to a mathematical model of heparin clearance or a fixed ratio of 1 mg of protamine for every 100 IU of heparin required to establish anticoagulation pre-CPB. A planned, blinded interim analysis was undertaken after the recruitment of 50% of the study cohort. Following this, the randomisation ratio was adapted from 1:1 to 1:1.33 to increase recruitment to the superior arm while maintaining study power. At the conclusion of trial recruitment, we had randomised 121 patients to the intervention arm and 107 patients to the control arm. The primary endpoint was kaolin TEG r-time measured 3 minutes after protamine administration at the end of CPB. Secondary endpoints included the ratio of kaolin TEG r-time pre-CPB to the same metric following protamine administration, requirement for allogeneic red cell transfusion, intercostal catheter drainage at 4 hours postoperatively, and the requirement for reoperation due to bleeding. The trial was listed on a clinical trial registry (ClinicalTrials.gov Identifier: NCT03532594). Participants were recruited between April 2018 and August 2019. Those in the intervention/model group had a shorter mean kaolin r-time (6.58 [SD 2.50] vs. 8.08 [SD 3.98] minutes; p = 0.0016) post-CPB. The post-protamine thromboelastogram of the model group was closer to pre-CPB parameters (median pre-CPB to post-protamine kaolin r-time ratio 0.96 [IQR 0.78-1.14] vs. 0.75 [IQR 0.57-0.99]; p < 0.001). We found no evidence of a difference in median mediastinal/pleural drainage at 4 hours postoperatively (140 [IQR 75-245] vs. 135 [IQR 94-222] mL; p = 0.85) or requirement (as a binary outcome) for packed red blood cell transfusion at 24 hours postoperatively (19 [15.8%] vs. 14 [13.1%]; p = 0.69). Those in the model group had a lower median protamine dose (180 [IQR 160-210] vs. 280 [IQR 250-300] mg; p < 0.001).
Important limitations of this study include an unblinded design and lack of generalisability to certain populations deliberately excluded from the study (specifically children, patients with a total body weight >120 kg, and patients requiring therapeutic hypothermia to <28°C).
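The abstract does not give the model's equations. Purely to illustrate the idea, a toy 2-compartment heparin model with invented rate constants (k10, k12, k21; not the trial's actual PRODOSE model or parameters) shows why model-based dosing targets only the heparin still circulating at the end of CPB, rather than the full pre-CPB dose that a fixed ratio neutralises.

```python
# Toy sketch only: made-up 2-compartment kinetics, not the trial's model.
def residual_heparin(dose_iu, minutes, k10=0.012, k12=0.03, k21=0.02, dt=0.1):
    """Euler-integrate a 2-compartment model; returns central-compartment IU.
    k10: elimination; k12/k21: transfer between compartments (per minute)."""
    a1, a2 = float(dose_iu), 0.0
    t = 0.0
    while t < minutes:
        da1 = (-(k10 + k12) * a1 + k21 * a2) * dt
        da2 = (k12 * a1 - k21 * a2) * dt
        a1, a2 = a1 + da1, a2 + da2
        t += dt
    return a1

heparin_dose = 30000  # IU given pre-CPB (hypothetical)
remaining = residual_heparin(heparin_dose, minutes=120)
model_protamine_mg = remaining / 100  # neutralize what is still circulating
fixed_ratio_mg = heparin_dose / 100   # 1 mg per 100 IU of the initial dose
print(round(model_protamine_mg), fixed_ratio_mg)  # model dose well below 300 mg
```

With any plausible clearance, the circulating heparin at the end of bypass is a fraction of the loading dose, which is consistent with the lower median protamine dose observed in the model arm.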
Conclusions
Using a mathematical model to guide protamine dosing in patients following CPB improved TEG r-time and reduced the dose administered relative to a fixed ratio. No differences were detected in postoperative mediastinal/pleural drainage or red blood cell transfusion requirement in our cohort of low-risk patients.
Trial registration
ClinicalTrials.gov Unique identifier NCT03532594.



PLoS Med: 06 Jun 2021; 18:e1003658 | PMID: 34097705
Abstract

Human papillomavirus seroprevalence in pregnant women following gender-neutral and girls-only vaccination programs in Finland: A cross-sectional cohort analysis following a cluster randomized trial.

Gray P, Kann H, Pimenoff VN, Eriksson T, ... Dillner J, Lehtinen M
Background
Cervical cancer elimination through human papillomavirus (HPV) vaccination programs requires the attainment of herd effect. Due to its uniquely high basic reproduction number, the vaccination coverage required to achieve herd effect against HPV type 16 exceeds what is attainable in most populations. We have compared how gender-neutral and girls-only vaccination strategies create herd effect against HPV16 under moderate vaccination coverage achieved in a population-based, community-randomized trial.
Methods and findings
In 2007-2010, the 1992-1995 birth cohorts of 33 Finnish communities were randomized to receive gender-neutral HPV vaccination (Arm A), girls-only HPV vaccination (Arm B), or no HPV vaccination (Arm C) (11 communities per trial arm). HPV16/18/31/33/35/45 seroprevalence differences between the pre-vaccination era (2005-2010) and post-vaccination era (2011-2016) were compared between all 8,022 unvaccinated women <23 years old and resident in the 33 communities during 2005-2016 (2,657, 2,691, and 2,674 in Arms A, B, and C, respectively). Post- versus pre-vaccination-era HPV seroprevalence ratios (PRs) were compared by arm. Possible outcome misclassification was quantified via probabilistic bias analysis. An HPV16 and HPV18 seroprevalence reduction was observed post-vaccination in the gender-neutral vaccination arm in the entire study population (PR16 = 0.64, 95% CI 0.10-0.85; PR18 = 0.72, 95% CI 0.22-0.96) and for HPV16 also in the herpes simplex virus type 2 seropositive core group (PR16 = 0.64, 95% CI 0.50-0.81). Observed reductions in HPV31/33/35/45 seroprevalence (PR31/33/35/45 = 0.88, 95% CI 0.81-0.97) were replicated in Arm C (PR31/33/35/45 = 0.79, 95% CI 0.69-0.90).
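A seroprevalence ratio of this kind is a simple ratio of two proportions. A sketch with hypothetical post- vs pre-vaccination-era counts (the abstract reports only the ratios, so these counts are invented to give PR = 0.64) and a standard log-normal confidence interval:

```python
import math

def prevalence_ratio(a, n1, b, n0):
    """Ratio of two prevalences a/n1 and b/n0, with a 95% log-normal CI."""
    pr = (a / n1) / (b / n0)
    se = math.sqrt((1 - a / n1) / a + (1 - b / n0) / b)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts chosen so the point estimate matches the reported PR16:
pr, lo, hi = prevalence_ratio(80, 1250, 125, 1250)
print(round(pr, 2), round(lo, 2), round(hi, 2))
```

The width of the interval depends on the seropositive counts, which is why the study additionally ran a probabilistic bias analysis for possible outcome misclassification.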
Conclusions
In this study, we observed herd effect against HPV16/18 only after gender-neutral vaccination at moderate vaccination coverage. With only moderate vaccination coverage, a gender-neutral vaccination strategy can therefore facilitate the control of even HPV16. Our findings may have limited transportability to other vaccination coverage levels.
Trial registration
ClinicalTrials.gov number NCT00534638, https://clinicaltrials.gov/ct2/show/NCT00534638.



PLoS Med: 06 Jun 2021; 18:e1003588 | PMID: 34097688
Abstract

Cervical intraepithelial neoplasia and the risk of spontaneous preterm birth: A Dutch population-based cohort study with 45,259 pregnancy outcomes.

Loopik DL, van Drongelen J, Bekkers RLM, Voorham QJM, ... van Kemenade FJ, Siebers AG
Background
Excisional procedures for cervical intraepithelial neoplasia (CIN) may increase the risk of preterm birth. It is unknown whether this increased risk is due to the excision procedure itself, to the underlying CIN, or to secondary risk factors associated with both preterm birth and CIN. The aim of this study was to assess the risk of spontaneous preterm birth in women with treated and untreated CIN and to examine possible associations by distinguishing between the excised volume of cervical tissue and the presence of cervical disease.
Methods and findings
This Dutch population-based observational cohort study identified women aged 29 to 41 years with CIN between 2005 and 2015 from the Dutch pathology registry (PALGA) and frequency matched them with a control group without any cervical abnormality based on age at and year of pathology outcome (i.e., CIN or normal cytology) and urbanization (<100,000 inhabitants or ≥100,000 inhabitants). All their 45,259 subsequent singleton pregnancies with a gestational age ≥16 weeks between 2010 and 2017 were identified from the Dutch perinatal database (Perined). Nineteen potential confounders for preterm birth were identified. Adjusted odds ratios (ORs) for preterm birth were calculated comparing 3 groups of women: (1) women without CIN diagnosis; (2) women with untreated CIN; and (3) women with treated CIN prior to each childbirth. In total, 29,907, 5,940, and 9,412 pregnancies were included in the control, untreated CIN, and treated CIN groups, respectively. The proportion of spontaneous preterm birth was 4.8% (1,002/20,969) in the control group, increasing to 6.9% (271/3,940) in the untreated CIN group, 9.5% (600/6,315) in the treated CIN group, and 15.6% (50/321) in the group with multiple treatments. Women with untreated CIN had 1.38 times greater odds of preterm birth compared to women without CIN (95% confidence interval (CI) 1.19 to 1.60; P < 0.001). For women with treated CIN, the odds were 2.07 times higher than in the control group (95% CI 1.85 to 2.33; P < 0.001). Treated women had 1.51 times greater odds of preterm birth compared to women with untreated CIN (95% CI 1.29 to 1.76; P < 0.001). Independent of cervical disease, a volume of 0.5 to 0.9 cc excised from the cervix increased the odds of preterm birth 2.20-fold (37/379 versus 1,002/20,969; 95% CI 1.52 to 3.20; P < 0.001).
The odds increased further, 3.13-fold and 5.93-fold, for women with an excised volume of 4 to 8.9 cc (90/724 versus 1,002/20,969; 95% CI 2.44 to 4.01; P < 0.001) and ≥9 cc (30/139 versus 1,002/20,969; 95% CI 3.86 to 9.13; P < 0.001), respectively. Limitations of the study include its retrospective nature, the lack of sufficient information to calculate the odds of preterm birth at <24 weeks, and the fact that the excised volume could only be calculated for a select group of women.
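The 2.20 odds ratio reported for the smallest excision volume is adjusted for confounders. As a sanity check, the crude odds ratio computed directly from the raw counts given in the abstract (37/379 versus 1,002/20,969) lands close to it:

```python
import math

def crude_or(a, n1, b, n0):
    """Crude odds ratio for events a/n1 vs b/n0, with a 95% Wald CI."""
    c, d = n1 - a, n0 - b
    est = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(est) - 1.96 * se)
    hi = math.exp(math.log(est) + 1.96 * se)
    return est, lo, hi

# Raw counts from the abstract: 0.5-0.9 cc excisions versus controls.
est, lo, hi = crude_or(37, 379, 1002, 20969)
print(round(est, 2), round(lo, 2), round(hi, 2))  # ≈ 2.16 1.53 3.04
```

The crude estimate of 2.16 sits close to the adjusted 2.20 (1.52 to 3.20), suggesting the 19 measured confounders shift this particular comparison only modestly.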
Conclusions
In this study, we observed a strong association between preterm birth and an excised cervical tissue volume of ≥0.5 cc, regardless of the severity of CIN. Caution should be exercised when performing excisional treatment in women of reproductive age, and prudence is warranted when taking multiple biopsies. Women of reproductive age with a history of multiple biopsies or excisional treatment for CIN may benefit from close surveillance during pregnancy.



PLoS Med: 03 Jun 2021; 18:e1003665 | PMID: 34086680
Abstract

Integrated treatment of hepatitis C virus infection among people who inject drugs: A multicenter randomized controlled trial (INTRO-HCV).

Fadnes LT, Aas CF, Vold JH, Leiva RA, ... Johansson KA, INTRO-HCV Study Group
Background
The standard pathways of testing and treatment for hepatitis C virus (HCV) infection in tertiary healthcare are not easily accessed by people who inject drugs (PWID). The aim of this study was to evaluate the efficacy of integrated treatment of chronic HCV infection among PWID.
Methods and findings
INTRO-HCV is a multicenter, randomized controlled clinical trial. Participants recruited from opioid agonist therapy (OAT) and community care clinics in Norway from 2017 to 2019 were randomly assigned 1:1 to the 2 treatment approaches. Integrated treatment was delivered by multidisciplinary teams at OAT clinics or community care centers (CCCs) for people with substance use disorders. This included on-site testing for HCV, liver fibrosis assessment, counseling, treatment, and posttreatment follow-up. Standard treatment was delivered in hospital outpatient clinics. Oral direct-acting antiviral (DAA) medications were administered in both arms. The study was not completely blinded. The primary outcomes were time-to-treatment initiation and sustained virologic response (SVR), defined as undetectable HCV RNA 12 weeks after treatment completion, analyzed by intention to treat and presented as hazard ratio (HR) and odds ratio (OR) with 95% confidence intervals. Among 298 included participants, 150 were randomized to standard treatment, of whom 116/150 (77%) initiated treatment, with 108/150 (72%) initiating within 1 year of referral. Among the 148 randomized to integrated care, 145/148 (98%) initiated treatment, with 141/148 (95%) initiating within 1 year of referral. The HR for time to initiating treatment in the integrated arm was 2.2 (1.7 to 2.9) compared to standard treatment. SVR was confirmed in 123 (85% of initiated/83% of all) in the integrated treatment arm compared to 96 (83% of initiated/64% of all) in the standard treatment arm (OR among treated: 1.5 [0.8 to 2.9]; among all: 2.8 [1.6 to 4.8]). No severe adverse events were linked to the treatment.
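The intention-to-treat odds ratio quoted here can be reproduced directly from the counts in the abstract (SVR in 123 of the 148 randomized to integrated care versus 96 of the 150 randomized to standard treatment):

```python
import math

# Counts from the abstract: SVR among all randomized participants.
svr_int, n_int = 123, 148  # integrated treatment arm
svr_std, n_std = 96, 150   # standard treatment arm

odds_int = svr_int / (n_int - svr_int)
odds_std = svr_std / (n_std - svr_std)
or_itt = odds_int / odds_std
# Wald CI on the log-odds-ratio scale:
se = math.sqrt(1 / svr_int + 1 / (n_int - svr_int)
               + 1 / svr_std + 1 / (n_std - svr_std))
lo = math.exp(math.log(or_itt) - 1.96 * se)
hi = math.exp(math.log(or_itt) + 1.96 * se)
print(round(or_itt, 1), round(lo, 1), round(hi, 1))  # → 2.8 1.6 4.8
```

This recovers the reported 2.8 (1.6 to 4.8), illustrating how the intention-to-treat contrast is driven mostly by the difference in treatment initiation rather than by SVR among those treated.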
Conclusions
Integrated treatment for HCV in PWID was superior to standard treatment in terms of time-to-treatment initiation, and subsequently, more people achieved SVR. Among those who initiated treatment, the SVR rates were comparable. Scaling up of integrated treatment models could be an important tool for elimination of HCV.
Trial registration
ClinicalTrials.gov.no NCT03155906.



PLoS Med: 31 May 2021; 18:e1003653 | PMID: 34061883
Abstract

Association between industry payments and prescriptions of long-acting insulin: An observational study with propensity score matching.

Inoue K, Tsugawa Y, Mangione CM, Duru OK
Background
Rapidly increasing spending on insulin is a major public health issue in the United States. Industry marketing might be one of the upstream determinants of physicians' prescriptions of long-acting insulin, the most commonly used and costly type of insulin, but evidence is lacking. We therefore aimed to investigate the association between industry payments to physicians and subsequent prescriptions of long-acting insulin.
Methods and findings
Using the databases of Open Payments and Medicare Part D, we examined the association between the receipt of industry payments for long-acting insulin in 2016 and (1) the number of claims; (2) the costs paid for all claims; and (3) the costs per claim of long-acting insulin in 2017. We also examined the association between the receipt of payments and the change in these outcomes from 2016 to 2017. We employed propensity score matching to adjust for the physician-level characteristics (sex, years in practice, specialty, and medical school attended). Among 145,587 eligible physicians treating Medicare beneficiaries, 51,851 physicians received industry payments for long-acting insulin worth $22.3 million. In the propensity score-matched analysis including 102,590 physicians, we found that physicians who received the payments prescribed a higher number of claims (adjusted difference, 57.8; 95% CI, 55.8 to 59.7), higher costs for total claims (adjusted difference, +$22,111; 95% CI, $21,387 to $22,836), and higher costs per claim (adjusted difference, +$71.1; 95% CI, $69.0 to $73.2) of long-acting insulin, compared with physicians who did not receive the payments. The association was also found for changes in these outcomes from 2016 to 2017. Limitations to our study include limited generalizability, confounding, and possible reverse causation.
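Propensity score matching of this kind pairs each payment-receiving physician with an otherwise similar non-recipient. A toy simulation sketch (invented physician data and effect size, not the Open Payments/Medicare records; the study matched on sex, years in practice, specialty, and medical school, while this sketch matches on a single confounder for clarity):

```python
import random

random.seed(1)
# Toy physician-level data (assumed model): payment receipt depends on years
# in practice, which also independently drives claim counts.
TRUE_EFFECT = 58  # extra claims attributable to receiving a payment (made up)
docs = []
for _ in range(2000):
    yrs = random.uniform(0, 40)
    paid = random.random() < 0.2 + 0.015 * yrs
    claims = 50 + 2 * yrs + (TRUE_EFFECT if paid else 0) + random.gauss(0, 10)
    docs.append((yrs, paid, claims))

treated = [d for d in docs if d[1]]
controls = [d for d in docs if not d[1]]

# Greedy 1:1 nearest-neighbor match on the confounder, with a 1-year caliper.
used, diffs = set(), []
for yrs, _, claims in treated:
    best_j, best_gap = None, 1.0
    for j, (cy, _, _) in enumerate(controls):  # linear scan, for clarity
        if j not in used and abs(cy - yrs) < best_gap:
            best_j, best_gap = j, abs(cy - yrs)
    if best_j is not None:
        used.add(best_j)
        diffs.append(claims - controls[best_j][2])

att = sum(diffs) / len(diffs)
naive = (sum(c for _, _, c in treated) / len(treated)
         - sum(c for _, _, c in controls) / len(controls))
print(round(att, 1), round(naive, 1))  # matching pulls the estimate toward 58
```

The naive difference in means overstates the payment effect because senior physicians both receive more payments and prescribe more; matching on the confounder recovers an estimate near the simulated effect.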
Conclusions
Industry marketing payments to physicians for long-acting insulin were associated with the physicians' prescriptions and costs of long-acting insulin in the subsequent year. Future research is needed to assess whether policy interventions on physician-industry financial relationships will help to ensure appropriate prescriptions and limit overall costs of this essential drug for diabetes care.



PLoS Med: 31 May 2021; 18:e1003645 | PMID: 34061852
Abstract

Associations of obesity and malnutrition with cardiac remodeling and cardiovascular outcomes in Asian adults: A cohort study.

Chien SC, Chandramouli C, Lo CI, Lin CF, ... Hung CL, Lam CSP
Background
Obesity, a known risk factor for cardiovascular disease and heart failure (HF), is associated with adverse cardiac remodeling in the general population. Little is known about how nutritional status modifies the relationship between obesity and outcomes. We aimed to investigate the association of obesity and nutritional status with clinical characteristics, echocardiographic changes, and clinical outcomes in the general community.
Methods and findings
We examined 5,300 consecutive asymptomatic Asian participants who were prospectively recruited in a cardiovascular health screening program (mean age 49.6 ± 11.4 years, 64.8% male) between June 2009 and December 2012. Clinical and echocardiographic characteristics were described in participants, stratified by combined subgroups of obesity and nutritional status. Obesity was indexed by body mass index (BMI) (low, ≤25 kg/m2 [lean]; high, >25 kg/m2 [obese]) (WHO-recommended Asian cutoffs). Nutritional status was defined primarily by serum albumin (SA) concentration (low, <45 g/L [malnourished]; high, ≥45 g/L [well-nourished]), and secondarily by the prognostic nutritional index (PNI) and Global Leadership Initiative on Malnutrition (GLIM) criteria. Cox proportional hazard models were used to examine a 1-year composite outcome of hospitalization for HF or all-cause mortality while adjusting for age, sex, and other clinical confounders. Our community-based cohort consisted of 2,096 (39.0%) lean-well-nourished (low BMI, high SA), 1,369 (25.8%) obese-well-nourished (high BMI, high SA), 1,154 (21.8%) lean-malnourished (low BMI, low SA), and 681 (12.8%) obese-malnourished (high BMI, low SA) individuals. Obese-malnourished participants were on average older (54.5 ± 11.4 years) and more often women (41%), with a higher mean waist circumference (91.7 ± 8.8 cm), the highest percentage of body fat (32%), and the highest prevalence of hypertension (32%), diabetes (12%), and history of cardiovascular disease (11%), compared to all other subgroups (all p < 0.001). N-terminal pro B-type natriuretic peptide (NT-proBNP) levels were substantially increased in the malnourished (versus well-nourished) groups, to a similar extent in lean (70.7 ± 177.3 versus 36.8 ± 40.4 pg/mL) and obese (73.1 ± 216.8 versus 33.2 ± 40.8 pg/mL) (p < 0.001 for both) participants.
The obese-malnourished (high BMI, low SA) group also had greater left ventricular remodeling (left ventricular mass index, 44.2 ± 1.52 versus 33.8 ± 8.28 gm/m2; relative wall thickness 0.39 ± 0.05 versus 0.38 ± 0.06) and worse diastolic function (TDI-e' 7.97 ± 2.16 versus 9.87 ± 2.47 cm/s; E/e' 9.19 ± 3.01 versus 7.36 ± 2.31; left atrial volume index 19.5 ± 7.66 versus 14.9 ± 5.49 mL/m2) compared to the lean-well-nourished (low BMI, high SA) group, as well as all other subgroups (p < 0.001 for all). Over a median 3.6 years (interquartile range 2.5 to 4.8 years) of follow-up, the obese-malnourished group had the highest multivariable-adjusted risk of the composite outcome (hazard ratio [HR] 2.49, 95% CI 1.43 to 4.34, p = 0.001), followed by the lean-malnourished (HR 1.78, 95% CI 1.04 to 3.04, p = 0.034) and obese-well-nourished (HR 1.41, 95% CI 0.77 to 2.58, p = 0.27) groups (with the lean-well-nourished group as reference). Results were similar when indexed by other anthropometric indices (waist circumference and body fat) and other measures of nutritional status (PNI and GLIM criteria). Potential selection bias and residual confounding were the main limitations of the study.
Conclusions
In our cohort study among asymptomatic community-based adults in Taiwan, we found that obese individuals with poor nutritional status have the highest comorbidity burden, the most adverse cardiac remodeling, and the least favorable composite outcome.



PLoS Med: 31 May 2021; 18:e1003661 | PMID: 34061848
Abstract

Risk of prostate cancer in relatives of prostate cancer patients in Sweden: A nationwide cohort study.

Xu X, Kharazmi E, Tian Y, Mukama T, ... Brenner H, Fallah M
Background
Evidence-based guidance for starting ages of screening for first-degree relatives (FDRs) of patients with prostate cancer (PCa) to prevent stage III/IV or fatal PCa is lacking in current PCa screening guidelines. We aimed to provide evidence for risk-adapted starting age of screening for relatives of patients with PCa.
Methods and findings
In this register-based nationwide cohort study, all men (aged 0 to 96 years at baseline) residing in Sweden who were born after 1931, along with their fathers, were included. During the follow-up (1958 to 2015) of 6,343,727 men, 88,999 were diagnosed with stage III/IV PCa or died of PCa. The outcomes were defined as the diagnosis of stage III/IV PCa or death due to PCa, stratified by age at diagnosis. Using 10-year cumulative risk curves, we calculated risk-adapted starting ages of screening for men with different constellations of family history of PCa. The 10-year cumulative risk of stage III/IV or fatal PCa in men at age 50 in the general population (a commonly recommended starting age of screening) was 0.2%. Men with ≥2 FDRs diagnosed with PCa reached this risk level at age 41 (95% confidence interval (CI): 39 to 44), i.e., 9 years earlier, when the youngest FDR was diagnosed before age 60; at age 43 (41 to 47), i.e., 7 years earlier, when ≥2 FDRs were diagnosed after age 59, similar to men with 1 FDR diagnosed before age 60 (41 to 45); at age 45 (44 to 46) when 1 FDR was diagnosed at age 60 to 69; and at age 47 (46 to 47) when 1 FDR was diagnosed after age 69. We also calculated risk-adapted starting ages for other benchmark screening ages, such as 45, 55, and 60 years, and compared our findings with those in the guidelines. Study limitations include the lack of genetic data, information on lifestyle, and external validation.
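The risk-adapted starting age is defined as the age at which a family-history group's 10-year cumulative risk first reaches the general population's risk at the benchmark age. A toy sketch of that search, using an invented exponential baseline hazard and an assumed familial relative risk of 3 (not the Swedish registry rates):

```python
import math

def ten_year_risk(start_age, rr=1.0):
    """10-year cumulative risk under a toy exponential baseline hazard.
    The hazard constants are made up for illustration only."""
    hazard_sum = sum(1e-6 * math.exp(0.12 * (a - 30)) * rr
                     for a in range(start_age, start_age + 10))
    return 1 - math.exp(-hazard_sum)

benchmark = ten_year_risk(50)  # general-population risk at the usual age 50
age = 30
while ten_year_risk(age, rr=3.0) < benchmark:
    age += 1
print(age)  # → 41 under these made-up rates: ~9 years earlier, as it happens
```

With an exponential hazard, the shift is roughly ln(RR) divided by the hazard's growth rate, which is why higher familial risk translates into a starting age several years earlier.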
Conclusions
Our study provides practical information for risk-tailored starting ages of PCa screening based on nationwide cancer data with valid genealogical information. Our clinically relevant findings could be used for evidence-based personalized PCa screening guidance and supplement current PCa screening guidelines for relatives of patients with PCa.



PLoS Med: 30 May 2021; 18:e1003616 | PMID: 34061847
Abstract

Inequities in access to primary care among opioid recipients in Ontario, Canada: A population-based cohort study.

Gomes T, Campbell TJ, Martins D, Paterson JM, ... Mamdani M, Glazier RH
Background
Stigma and high care needs can present barriers to the provision of high-quality primary care for people with opioid use disorder (OUD) and those prescribed opioids for chronic pain. We explored the likelihood of securing a new primary care provider (PCP) among people with varying histories of opioid use who had recently lost access to their PCP.
Methods and findings
We conducted a retrospective cohort study using linked administrative data among residents of Ontario, Canada whose enrolment with a physician practicing in a primary care enrolment model (PEM) was terminated between January 2016 and December 2017. We assigned individuals to 3 groups based upon their opioid use on the date enrolment ended: long-term opioid pain therapy (OPT), opioid agonist therapy (OAT), or no opioid. We fit multivariable models assessing the primary outcome of primary care reattachment within 1 year, adjusting for demographic characteristics, clinical comorbidities, and health services utilization. Secondary outcomes included rates of emergency department (ED) visits and opioid toxicity events. Among 154,970 Ontarians who lost their PCP, 1,727 (1.1%) were OAT recipients, 3,644 (2.4%) were receiving long-term OPT, and 149,599 (96.5%) had no recent prescription opioid exposure. In general, OAT recipients were younger (median age 36) than those receiving long-term OPT (59 years) and those with no recent prescription opioid exposure (44 years). In all exposure groups, the majority of individuals had their enrolment terminated by their physician (range 78.1% to 88.8%). In the primary analysis, as compared to those not receiving opioids, OAT recipients were significantly less likely to find a PCP within 1 year (adjusted hazard ratio [aHR] 0.55, 95% confidence interval [CI] 0.50 to 0.61, p < 0.0001). We observed no significant difference between long-term OPT and opioid unexposed individuals (aHR 0.96; 95% CI 0.92 to 1.01, p = 0.12). In our secondary analysis comparing the period of PCP loss to the year prior, we found that rates of ED visits were elevated among people not receiving opioids (adjusted rate ratio (aRR) 1.20, 95% CI 1.18 to 1.22, p < 0.0001) and people receiving long-term OPT (aRR 1.37, 95% CI 1.28 to 1.48, p < 0.0001). 
We found no such increase among OAT recipients, and no significant increase in opioid toxicity events in the period following provider loss for any exposure group. The main limitation of our findings relates to their generalizability outside of PEMs and in jurisdictions with different financial incentives incorporated into primary care provision.
Conclusions
In this study, we observed gaps in access to primary care among people who receive prescription opioids, particularly among OAT recipients. Ongoing efforts are needed to address the stigma, discrimination, and financial disincentives that may introduce barriers to the healthcare system, and to facilitate access to high-quality, consistent primary care services for chronic pain patients and those with OUD.



Gomes T, Campbell TJ, Martins D, Paterson JM, ... Mamdani M, Glazier RH
PLoS Med: 30 May 2021; 18:e1003631 | PMID: 34061846
Abstract

Vitamin D and COVID-19 susceptibility and severity in the COVID-19 Host Genetics Initiative: A Mendelian randomization study.

Butler-Laporte G, Nakanishi T, Mooser V, Morrison DR, ... Forgetta V, Richards JB
Background
Increased vitamin D levels, as reflected by 25-hydroxy vitamin D (25OHD) measurements, have been proposed to protect against COVID-19 based on in vitro, observational, and ecological studies. However, vitamin D levels are associated with many confounding variables, and thus associations described to date may not be causal. Vitamin D Mendelian randomization (MR) studies have provided results that are concordant with large-scale vitamin D randomized trials. Here, we used 2-sample MR to assess evidence supporting a causal effect of circulating 25OHD levels on COVID-19 susceptibility and severity.
Methods and findings
Genetic variants strongly associated with 25OHD levels in a genome-wide association study (GWAS) of 443,734 participants of European ancestry (including 401,460 from the UK Biobank) were used as instrumental variables. GWASs of COVID-19 susceptibility, hospitalization, and severe disease from the COVID-19 Host Genetics Initiative were used as outcome GWASs. These included up to 14,134 individuals with COVID-19, and up to 1,284,876 without COVID-19, from up to 11 countries. SARS-CoV-2 positivity was determined by laboratory testing or medical chart review. Population controls without COVID-19 were also included in the control groups for all outcomes, including hospitalization and severe disease. Analyses were restricted to individuals of European descent when possible. Using inverse-variance-weighted MR, genetically increased 25OHD levels (by 1 standard deviation on the logarithmic scale) had no significant association with COVID-19 susceptibility (odds ratio [OR] = 0.95; 95% CI 0.84, 1.08; p = 0.44), hospitalization (OR = 1.09; 95% CI: 0.89, 1.33; p = 0.41), or severe disease (OR = 0.97; 95% CI: 0.77, 1.22; p = 0.77). We applied 6 additional meta-analytic methods and conducted sensitivity analyses after removing variants at risk of horizontal pleiotropy, obtaining similar results. These results may be limited by weak instrument bias in some analyses. Further, our results do not apply to individuals with vitamin D deficiency.
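As context for the inverse-variance-weighted estimator named above, here is a minimal sketch of fixed-effect IVW combining per-variant Wald ratios. Variable names and inputs are illustrative assumptions, not the study's actual pipeline or data:

```python
from math import sqrt

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance-weighted MR estimate.

    Each genetic instrument contributes a Wald ratio
    (variant-outcome effect / variant-exposure effect), weighted by
    the inverse of its first-order variance (se_outcome / beta_exposure)^2.
    """
    ratios = [bo / be for be, bo in zip(beta_exposure, beta_outcome)]
    weights = [(be / se) ** 2 for be, se in zip(beta_exposure, se_outcome)]
    total_weight = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, ratios)) / total_weight
    standard_error = sqrt(1.0 / total_weight)
    return estimate, standard_error
```

In practice the per-variant summary statistics would come from the exposure and outcome GWASs; when all ratio estimates agree, the pooled estimate reproduces their common value.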
Conclusions
In this 2-sample MR study, we did not observe evidence to support an association between 25OHD levels and COVID-19 susceptibility, severity, or hospitalization. Hence, vitamin D supplementation as a means of protecting against worsened COVID-19 outcomes is not supported by genetic evidence. Other therapeutic or preventative avenues should be given higher priority for COVID-19 randomized controlled trials.



PLoS Med: 30 May 2021; 18:e1003605 | PMID: 34061844
Abstract

Global economic costs due to vivax malaria and the potential impact of its radical cure: A modelling study.

Devine A, Battle KE, Meagher N, Howes RE, ... Price RN, Lubell Y
Background
In 2017, an estimated 14 million cases of Plasmodium vivax malaria were reported from Asia, Central and South America, and the Horn of Africa. The clinical burden of vivax malaria is largely driven by its ability to form dormant liver stages (hypnozoites) that can reactivate to cause recurrent episodes of malaria. Elimination of both the blood and liver stages of the parasite ("radical cure") is required to achieve a sustained clinical response and prevent ongoing transmission of the parasite. Novel treatment options and point-of-care diagnostics are now available to ensure that radical cure can be administered safely and effectively. We quantified the global economic cost of vivax malaria and estimated the potential cost benefit of a policy of radical cure after testing patients for glucose-6-phosphate dehydrogenase (G6PD) deficiency.
Methods and findings
Estimates of the healthcare provider and household costs due to vivax malaria were collated and combined with national case estimates for 44 endemic countries in 2017. These provider and household costs were compared with those that would be incurred under 2 scenarios for radical cure following G6PD screening: (1) complete adherence following daily supervised primaquine therapy and (2) unsupervised treatment with an assumed 40% effectiveness. A probabilistic sensitivity analysis generated credible intervals (CrIs) for the estimates. Globally, the annual cost of vivax malaria was US$359 million (95% CrI: US$222 to 563 million), attributable to 14.2 million cases of vivax malaria in 2017. From a societal perspective, adopting a policy of G6PD deficiency screening and supervision of primaquine therapy for all eligible patients would prevent 6.1 million cases and reduce the global cost of vivax malaria to US$266 million (95% CrI: US$161 to 415 million), although healthcare provider costs would increase by US$39 million. If perfect adherence could be achieved with a single visit, then the global cost would fall further to US$225 million, equivalent to US$135 million in cost savings from the baseline global cost. A policy of unsupervised primaquine reduced the cost to US$342 million (95% CrI: US$209 to 532 million) while preventing 2.1 million cases. Limitations of the study include partial availability of country-level cost data and parameter uncertainty for the proportion of patients prescribed primaquine, patient adherence to a full course of primaquine, and effectiveness of primaquine when unsupervised.
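The probabilistic sensitivity analysis mentioned above derives credible intervals by repeatedly redrawing uncertain inputs. A minimal sketch of the idea follows, assuming a hypothetical uniform per-case cost range; the study's actual parameter distributions and cost structure are not reproduced here:

```python
import random

def psa_total_cost(n_cases, cost_low, cost_high, draws=10_000, seed=42):
    """Monte Carlo total-cost distribution: draw a per-case cost from a
    uniform range, scale by the case count, and summarize the draws with
    a mean and a 95% credible interval."""
    rng = random.Random(seed)
    totals = sorted(n_cases * rng.uniform(cost_low, cost_high)
                    for _ in range(draws))
    mean = sum(totals) / draws
    lower = totals[int(0.025 * draws)]   # 2.5th percentile of the draws
    upper = totals[int(0.975 * draws)]   # 97.5th percentile of the draws
    return mean, lower, upper
```

The same resampling loop extends naturally to several uncertain parameters at once (e.g., adherence and effectiveness), which is how a full analysis would generate the intervals reported above.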
Conclusions
Our modelling study highlights a substantial global economic burden of vivax malaria that could be reduced through investment in safe and effective radical cure achieved by routine screening for G6PD deficiency and supervision of treatment. Novel, low-cost interventions for improving adherence to primaquine to ensure effective radical cure and widespread access to screening for G6PD deficiency will be critical to achieving the timely global elimination of P. vivax.



PLoS Med: 30 May 2021; 18:e1003614 | PMID: 34061843
Abstract

Blood pressure-lowering treatment for the prevention of cardiovascular events in patients with atrial fibrillation: An individual participant data meta-analysis.

Pinho-Gomes AC, Azevedo L, Copland E, Canoy D, ... Rahimi K, Blood Pressure Lowering Treatment Trialists’ Collaboration
Background
Randomised evidence on the efficacy of blood pressure (BP)-lowering treatment to reduce cardiovascular risk in patients with atrial fibrillation (AF) is limited. Therefore, this study aimed to compare the effects of BP-lowering drugs in patients with and without AF at baseline.
Methods and findings
The study was based on the resource provided by the Blood Pressure Lowering Treatment Trialists' Collaboration (BPLTTC), in which individual participant data (IPD) were extracted from trials with over 1,000 patient-years of follow-up in each arm, and that had randomly assigned patients to different classes of BP-lowering drugs, BP-lowering drugs versus placebo, or more versus less intensive BP-lowering regimens. For this study, only trials that had collected information on AF status at baseline were included. The effects of BP-lowering treatment on a composite endpoint of major cardiovascular events (stroke, ischaemic heart disease, or heart failure) according to AF status at baseline were estimated using fixed-effect one-stage IPD meta-analyses based on Cox proportional hazards models stratified by trial. Furthermore, to assess whether the associations between the intensity of BP reduction and cardiovascular outcomes are similar in those with and without AF at baseline, we used meta-regression. From the full BPLTTC database, 28 trials (145,653 participants) were excluded because AF status at baseline was uncertain or unavailable. A total of 22 trials were included with 188,570 patients, of whom 13,266 (7%) had AF at baseline. Risk of bias assessment showed that 20 trials were at low risk of bias and 2 trials at moderate risk. Meta-regression showed that relative risk reductions were proportional to trial-level intensity of BP lowering in patients with and without AF at baseline. Over 4.5 years of median follow-up, a 5-mm Hg systolic BP (SBP) reduction lowered the risk of major cardiovascular events both in patients with AF (hazard ratio [HR] 0.91, 95% confidence interval [CI] 0.83 to 1.00) and in patients without AF at baseline (HR 0.91, 95% CI 0.88 to 0.93), with no difference between subgroups. There was no evidence for heterogeneity of treatment effects by baseline SBP or drug class in patients with AF at baseline.
The findings of this study need to be interpreted in light of its potential limitations, such as the small number of trials and the difficulty of ascertaining AF cases and measuring BP in patients with AF, given the nature of the arrhythmia.
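The trial-level meta-regression described above relates each trial's treatment effect to the intensity of BP lowering achieved. A minimal closed-form weighted-least-squares sketch of that step follows; the inputs are hypothetical per-trial summaries, not BPLTTC data:

```python
def weighted_metaregression(sbp_reduction, log_hr, se):
    """Regress per-trial log hazard ratios on trial-level achieved SBP
    reduction (mm Hg), weighting each trial by inverse variance."""
    weights = [1.0 / s ** 2 for s in se]
    total_w = sum(weights)
    x_bar = sum(w * x for w, x in zip(weights, sbp_reduction)) / total_w
    y_bar = sum(w * y for w, y in zip(weights, log_hr)) / total_w
    s_xy = sum(w * (x - x_bar) * (y - y_bar)
               for w, x, y in zip(weights, sbp_reduction, log_hr))
    s_xx = sum(w * (x - x_bar) ** 2 for w, x in zip(weights, sbp_reduction))
    slope = s_xy / s_xx            # change in log HR per mm Hg of SBP lowering
    intercept = y_bar - slope * x_bar
    return intercept, slope
```

A negative slope indicates larger relative risk reductions in trials that achieved greater BP lowering, which is the proportionality the abstract reports.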
Conclusions
In this meta-analysis, we found that BP-lowering treatment reduces the risk of major cardiovascular events similarly in individuals with and without AF. Pharmacological BP lowering for prevention of cardiovascular events should be recommended in patients with AF.



PLoS Med: 30 May 2021; 18:e1003599 | PMID: 34061831
Abstract

Association of APOE ε4 genotype and lifestyle with cognitive function among Chinese adults aged 80 years and older: A cross-sectional study.

Jin X, He W, Zhang Y, Gong E, ... Zeng Y, Yan LL
Background
Apolipoprotein E (APOE) ε4 is the single most important genetic risk factor for cognitive impairment and Alzheimer disease (AD), while lifestyle factors such as smoking, drinking, diet, and physical activity also influence cognition. The goal of this study was to investigate whether the association between lifestyle and cognition varies by APOE genotype among the oldest old.
Methods and findings
We used cross-sectional data on 6,160 of the oldest old (aged 80 years or older) from the genetic substudy of the Chinese Longitudinal Healthy Longevity Survey (CLHLS), a nationwide cohort study that began in 1998 with follow-up surveys every 2-3 years. Cognitive impairment was defined as a Mini-Mental State Examination (MMSE) score less than 18. Healthy lifestyle profile was classified into 3 groups by a composite measure including smoking, alcohol consumption, dietary pattern, physical activity, and body weight. APOE genotype was categorized as APOE ε4 carriers versus noncarriers. We examined the associations of cognitive impairment with lifestyle profile and APOE genotype using multivariable logistic regressions, controlling for age, sex, education, marital status, residence, disability, and number of chronic conditions. The mean age of our study sample was 90.1 (standard deviation [SD], 7.2) years (range 80-113); 57.6% were women, and 17.5% were APOE ε4 carriers. The mean MMSE score was 21.4 (SD: 9.2), and 25.0% had cognitive impairment. Compared with those with an unhealthy lifestyle, participants with intermediate and healthy lifestyle profiles had 28% (95% confidence interval [CI]: 16%-38%, P < 0.001) and 55% (95% CI: 44%-64%, P < 0.001) lower adjusted odds of cognitive impairment, respectively. Carrying the APOE ε4 allele was associated with 17% higher odds (95% CI: 1%-31%, P = 0.042) of being cognitively impaired in the adjusted model. The association between lifestyle profiles and cognitive function did not vary significantly by APOE ε4 genotype (noncarriers: 0.47 [0.37-0.60] healthy versus unhealthy; carriers: 0.33 [0.18-0.58]; P for interaction = 0.30). The main limitation was that the lifestyle measurements were self-reported and nonspecific.
Generalizability of the findings is another limitation because the study sample was from the oldest old in China, with unique characteristics such as low body weight compared to populations in high-income countries.
Conclusions
In this study, we observed that healthier lifestyle was associated with better cognitive function among the oldest old regardless of APOE genotype. Our findings may inform the cognitive outlook for those oldest old with high genetic risk of cognitive impairment.



PLoS Med: 30 May 2021; 18:e1003597 | PMID: 34061824
Abstract

Pyronaridine-artesunate real-world safety, tolerability, and effectiveness in malaria patients in 5 African countries: A single-arm, open-label, cohort event monitoring study.

Tona Lutete G, Mombo-Ngoma G, Assi SB, Bigoga JD, ... Ramharter M, CANTAM study group
Background
In Phase II/III randomized controlled clinical trials for the treatment of acute uncomplicated malaria, pyronaridine-artesunate demonstrated high efficacy and a safety profile consistent with that of comparators, except that asymptomatic, mainly mild-to-moderate transient increases in liver aminotransferases were reported for some patients. Hepatic safety, tolerability, and effectiveness have not been previously assessed under real-world conditions in Africa.
Methods and findings
This single-arm, open-label, cohort event monitoring study was conducted at 6 health centers in Cameroon, Democratic Republic of Congo, Gabon, Ivory Coast, and Republic of Congo between June 2017 and April 2019. The trial protocol resembled real-world clinical practice for the treatment of malaria at the centers as closely as possible. Eligible patients were adults or children of either sex, weighing at least 5 kg, with acute uncomplicated malaria who did not have contraindications to pyronaridine-artesunate treatment as per the summary of product characteristics. Patients received fixed-dose pyronaridine-artesunate once daily for 3 days, dosed by body weight, without regard to food intake. A tablet formulation was used in adults and adolescents and a pediatric granule formulation in children and infants under 20 kg body weight. The primary outcome was the hepatic event incidence, defined as the appearance of the clinical signs and symptoms of hepatotoxicity confirmed by a >2× rise in alanine aminotransferase/aspartate aminotransferase (ALT/AST) versus baseline in patients with baseline ALT/AST >2× the upper limit of normal (ULN). As a secondary outcome, this was assessed in patients with ALT/AST >2× ULN prior to treatment versus a matched cohort of patients with normal baseline ALT/AST. The safety population comprised 7,154 patients, of mean age 13.9 years (standard deviation [SD] 14.6), around half of whom were male (3,569 [49.9%]). Patients experienced 8,560 malaria episodes; 158 occurred in patients with baseline ALT/AST elevations >2× ULN. No protocol-defined hepatic events occurred following pyronaridine-artesunate treatment of malaria patients with or without baseline hepatic dysfunction; thus, no cohort comparison could be undertaken. Also, as postbaseline clinical chemistry was only performed where clinically indicated, postbaseline ALT/AST levels were not systematically assessed for all patients.
Adverse events of any cause occurred in 20.8% (1,490/7,154) of patients, most frequently pyrexia (5.1% [366/7,154]) and vomiting (4.2% [303/7,154]). Adjusting for Plasmodium falciparum reinfection, clinical effectiveness at day 28 was 98.6% (7,369/7,746; 95% confidence interval [CI] 98.3 to 98.9) in the per-protocol population. There was no indication that comorbidities or malnutrition adversely affected outcomes. The key study limitation was that postbaseline clinical biochemistry was only evaluated when clinically indicated.
Conclusions
Pyronaridine-artesunate had good tolerability and effectiveness in a representative African population under conditions similar to everyday clinical practice. These findings support pyronaridine-artesunate as an operationally useful addition to the management of acute uncomplicated malaria.
Trial registration
ClinicalTrials.gov NCT03201770.



PLoS Med: 30 May 2021; 18:e1003669 | PMID: 34129601
Abstract

Effects of community-based antiretroviral therapy initiation models on HIV treatment outcomes: A systematic review and meta-analysis.

Eshun-Wilson I, Awotiwon AA, Germann A, Amankwaa SA, ... Baral S, Geng EH
Background
Antiretroviral therapy (ART) initiation in the community and outside of a traditional health facility has the potential to improve linkage to ART, decongest health facilities, and minimize structural barriers to attending HIV services among people living with HIV (PLWH). We conducted a systematic review and meta-analysis to determine the effect of offering ART initiation in the community on HIV treatment outcomes.
Methods and findings
We searched databases between 1 January 2013 and 22 February 2021 to identify randomized controlled trials (RCTs) and observational studies that compared offering ART initiation in a community setting to offering it in a traditional health facility or alternative community setting. We assessed risk of bias, reporting of implementation outcomes, and real-world relevance and used Mantel-Haenszel methods to generate pooled risk ratios (RRs) and risk differences (RDs) with 95% confidence intervals. We evaluated heterogeneity qualitatively and quantitatively and used GRADE to evaluate overall evidence certainty. Searches yielded 4,035 records, resulting in 8 included studies (4 RCTs and 4 observational studies) conducted in Lesotho, South Africa, Nigeria, Uganda, Malawi, Tanzania, and Haiti, with a total of 11,196 PLWH. Five studies were conducted in general HIV populations, 2 in key populations, and 1 in adolescents. Community ART initiation strategies included community-based HIV testing coupled with ART initiation at home or at community venues; 5 studies maintained ART refills in the community, and 4 provided refills at the health facility. All studies were pragmatic, but in most cases provided additional resources. Few studies reported on implementation outcomes. All studies showed higher ART uptake in community initiation arms than in facility initiation and refill arms (standard of care) (RR 1.73, 95% CI 1.22 to 2.45; RD 30%, 95% CI 10% to 50%; 5 studies). Retention (RR 1.43, 95% CI 1.32 to 1.54; RD 19%, 95% CI 11% to 28%; 4 studies) and viral suppression (RR 1.31, 95% CI 1.15 to 1.49; RD 15%, 95% CI 10% to 21%; 3 studies) at 12 months were also higher in the community-based ART initiation arms. Improved uptake, retention, and viral suppression with community ART initiation were seen across population subgroups, including men, adolescents, and key populations. One study reported no difference in retention and viral suppression at 2 years.
There were limited data on adherence and mortality. Social harms and adverse events appeared to be minimal and similar between community care and standard of care. One study compared ART refill strategies following community ART initiation (community versus facility refills) and found no difference in viral suppression (RD -7%, 95% CI -19% to 6%) or retention at 12 months (RD -12%, 95% CI -23% to 0.3%). This systematic review was limited by the small number of included studies, the poor quality of the observational data, and the short-term outcomes.
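The pooled risk ratios above come from Mantel-Haenszel weighting of per-study 2×2 tables. A minimal sketch of the MH risk ratio follows; the counts in the test are made up for illustration, not taken from the review:

```python
def mh_risk_ratio(studies):
    """Mantel-Haenszel pooled risk ratio.

    Each study is a tuple (events_intervention, n_intervention,
    events_control, n_control); strata are combined using the standard
    MH weights a*n0/N in the numerator and c*n1/N in the denominator.
    """
    numerator = 0.0
    denominator = 0.0
    for a, n1, c, n0 in studies:
        total = n1 + n0
        numerator += a * n0 / total
        denominator += c * n1 / total
    return numerator / denominator
```

Because each stratum is weighted rather than averaged naively, large studies dominate the pooled estimate while small strata still contribute.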
Conclusions
Based on data from a limited set of studies, community ART initiation appears to result in higher ART uptake, retention, and viral suppression at 1 year compared to facility-based ART initiation. Implementation on a wider scale necessitates broader exploration of costs, logistics, and acceptability by providers and PLWH to ensure that these effects are reproducible when delivered at scale, in different contexts, and over time.



PLoS Med: 27 May 2021; 18:e1003646 | PMID: 34048443
Abstract

SMS messaging to improve retention and viral suppression in prevention of mother-to-child HIV transmission (PMTCT) programs in Kenya: A 3-arm randomized clinical trial.

Kinuthia J, Ronen K, Unger JA, Jiang W, ... Richardson BA, John-Stewart G
Background
Pregnant and postpartum women living with HIV (WLWH) need support for HIV and maternal child health (MCH) care, which could be provided using short message service (SMS).
Methods and findings
We compared 2-way (interactive) and 1-way SMS messaging to no SMS in a 3-arm randomized trial in 6 MCH clinics in Kenya. Messages were developed using the Health Belief Model and Social Cognitive Theory; HIV messages were integrated into an existing MCH SMS platform. Intervention participants received visit reminders and prespecified weekly SMS on antiretroviral therapy (ART) adherence and MCH, tailored to their characteristics and timing. Two-way participants could message nurses as needed. Clinic attendance, viral load (VL), and infant HIV results were abstracted from program records. Primary outcomes were viral nonsuppression (VL ≥1,000 c/ml), on-time clinic attendance, loss to follow-up from clinical care, and infant HIV-free survival. Among 824 pregnant women randomized between November 2015 and May 2017, median age was 27 years, median gestational age was 24.3 weeks, and median time since ART initiation was 1.0 year. During follow-up to 2 years postpartum, 9.8% of 3,150 VL assessments were nonsuppressed and 19.6% of women were ever nonsuppressed, with no significant difference in 1-way versus control (11.2% versus 9.6%, adjusted risk ratio [aRR] 1.02 [95% confidence interval (CI) 0.67 to 1.54], p = 0.94) or 2-way versus control (8.5% versus 9.6%, aRR 0.80 [95% CI 0.52 to 1.23], p = 0.31). Median ART adherence and incident ART resistance did not significantly differ by arm. Overall, 88.9% (95% CI 76.5 to 95.7) of visits were on time, with no significant differences between arms (88.2% in control versus 88.6% in 1-way and 88.8% in 2-way). Incidence of infant HIV infection or death was 3.01/100 person-years, with no significant difference between arms; risk of infant HIV infection was 0.94%. Time to postpartum contraception was significantly shorter in the 2-way arm than in control. Study limitations include limited ability to detect improvement due to high viral suppression and visit attendance, and imperfect synchronization of SMS reminders to clinic visits.
Conclusions
Integrated HIV/MCH messaging did not improve HIV outcomes but was associated with improved initiation of postpartum contraception. In programs where most women are virally suppressed, targeted SMS informed by VL data may improve effectiveness. Rigorous evaluation remains important to optimize mobile health (mHealth) interventions.
Trial registration
ClinicalTrials.gov number NCT02400671.



PLoS Med: 23 May 2021; 18:e1003650 | PMID: 34029338
Abstract

The sugar content of foods in the UK by category and company: A repeated cross-sectional study, 2015-2018.

Bandy LK, Scarborough P, Harrington RA, Rayner M, Jebb SA
Background
Consumption of free sugars in the UK greatly exceeds dietary recommendations. Public Health England (PHE) has set voluntary targets for industry to reduce the sales-weighted mean sugar content of key food categories contributing to sugar intake by 5% by 2018 and 20% by 2020. The aim of this study was to assess changes in the sales-weighted mean sugar content and total volume sales of sugar in selected food categories among UK companies between 2015 and 2018.
Methods and findings
We used sales data from Euromonitor, which estimates total annual retail sales of packaged foods, for 5 categories (biscuits and cereal bars, breakfast cereals, chocolate confectionery, sugar confectionery, and yoghurts) over 4 consecutive years (2015-2018). This analysis includes 353 brands (groups of products with the same name) sold by 99 different companies. These data were linked with nutrient composition data collected online from supermarket websites over 2015-2018 by Edge by Ascential. The main outcome measures were sales volume, sales-weighted mean sugar content, and total volume of sugar sold by category and company. Between 2015 and 2018, the sales-weighted mean sugar content of all included foods fell by 5.2% (95% CI -9.4%, -1.4%), from 28.7 g/100 g (95% CI 27.2, 30.4) to 27.2 g/100 g (95% CI 25.8, 28.4). The greatest changes were seen in yoghurts (-17.0% [95% CI -26.8%, -7.1%]) and breakfast cereals (-13.3% [95% CI -19.2%, -7.4%]), with only small reductions in sugar confectionery (-2.4% [95% CI -4.2%, -0.6%]) and chocolate confectionery (-1.0% [95% CI -3.1%, 1.2%]). Total volume of sugars sold per capita fell from 21.4 g/d (95% CI 20.3, 22.7) to 19.7 g/d (95% CI 18.8, 20.7), a reduction of 7.5% (95% CI -13.1%, -2.8%). Of the 50 companies analyzed (the top 10 in each category), 24 met the 5% reduction target set by PHE for 2018. The key limitations of this study are that it does not encompass the whole food market and that it uses brand-level rather than individual product sales data.
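The sales-weighted mean used throughout this study weights each product's sugar content by its volume sales; a minimal sketch (function name and inputs are illustrative, not the study's dataset):

```python
def sales_weighted_mean_sugar(sugar_g_per_100g, volume_sales):
    """Sales-weighted mean sugar content (g/100 g): each product's sugar
    content contributes in proportion to its share of total volume sales."""
    total_sales = sum(volume_sales)
    weighted = sum(s * v for s, v in zip(sugar_g_per_100g, volume_sales))
    return weighted / total_sales
```

Because the weights are sales volumes, reformulating a high-selling brand moves the category mean far more than reformulating a niche product, which is why the metric is paired with total volume of sugar sold.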
Conclusions
Our findings show there has been a small reduction in total volume sales of sugar in the included categories, primarily due to reductions in the sugar content of yoghurts and breakfast cereals. Additional policy measures may be needed to accelerate progress in categories such as sugar confectionery and chocolate confectionery if the 2020 PHE voluntary sugar reduction targets are to be met.



PLoS Med: 17 May 2021; 18:e1003647 | PMID: 34003863
Abstract

Taxed and untaxed beverage intake by South African young adults after a national sugar-sweetened beverage tax: A before-and-after study.

Essman M, Taillie LS, Frank T, Ng SW, Popkin BM, Swart EC
Background
In an effort to reduce the prevalence of obesity and diabetes, South Africa implemented a sugar-content-based tax called the Health Promotion Levy in April 2018, one of the first sugar-sweetened beverage (SSB) taxes to be based on each gram of sugar (beyond 4 g/100 ml). This before-and-after study estimated changes in taxed and untaxed beverage intake 1 year after the tax, examining separately, to our knowledge for the first time, the role of reformulation distinct from behavioral changes in SSB intake.
Methods and findings
We collected single-day 24-hour dietary recalls from repeat cross-sectional surveys of adults aged 18-39 years in Langa, South Africa. Participants were recruited in February-March 2018 (pre-tax, n = 2,459) and February-March 2019 (post-tax, n = 2,489) using door-to-door sampling. We developed time-specific food composition tables (FCTs) for South African beverages before and after the tax, linked with the diet recalls. By linking pre-tax FCTs only to dietary intake data collected in the pre-tax and post-tax periods, we calculated changes in beverage intake due to behavioral change, assuming no reformulation. Next, we repeated the analysis using an updated FCT in the post-tax period to capture the marginal effect of reformulation. We estimated beverage intake using a 2-part model that accounts for the biases that arise when ordinary least squares or other continuous-variable approaches are applied to data in which many individuals have zero intake. First, a probit model was used to estimate the probability of consuming the specific beverage category. Then, conditional on a positive outcome, a generalized linear model with a log link was used to estimate the continuous amount of beverage consumed. Among taxed beverages, sugar intake decreased significantly (p < 0.0001) from 28.8 g/capita/day (95% CI 27.3-30.4) pre-tax to 19.8 (95% CI 18.5-21.1) post-tax. Energy intake decreased (p < 0.0001) from 121 kcal/capita/day (95% CI 114-127) pre-tax to 82 (95% CI 76-87) post-tax. Volume intake decreased (p < 0.0001) from 315 ml/capita/day (95% CI 297-332) pre-tax to 198 (95% CI 185-211) post-tax. Among untaxed beverages, sugar intake increased (p < 0.0001) by 5.3 g/capita/day (95% CI 3.7 to 6.9), and energy intake increased (p < 0.0001) by 29 kcal/capita/day (95% CI 19 to 39). Among total beverages, sugar intake decreased significantly (p = 0.004) by 3.7 g/capita/day (95% CI -6.2 to -1.2).
Behavioral change accounted for reductions of 24% in energy, 22% in sugar, and 23% in volume, while reformulation accounted for additional reductions of 8% in energy, 9% in sugar, and 14% in volume from taxed beverages. The key limitations of this study are an inability to make causal claims due to repeat cross-sectional data collection, and that the magnitude of reduction in taxed beverage intake may not be generalizable to higher income populations.
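The 2-part model described above can be summarized as the product of a participation probability and a conditional mean. In generic notation (Φ is the standard normal CDF; γ and β are the probit and log-link GLM coefficient vectors; the symbols are illustrative, not taken from the paper):

```latex
E[Y_i \mid X_i] \;=\; \Pr(Y_i > 0 \mid X_i) \cdot E[Y_i \mid Y_i > 0, X_i]
\;=\; \Phi(X_i'\gamma)\,\exp(X_i'\beta)
```

Fitting the two factors separately avoids the bias that a single linear regression would incur when a large share of observations are exactly zero.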
Conclusions
Using a large sample of a high-consuming, low-income population, we found large reductions in taxed beverage intake, separating the components of behavioral change from reformulation. This reduction was partially compensated by an increase in sugar and energy from untaxed beverages. Because policies such as taxes can incentivize reformulation, our use of an up-to-date FCT that reflects a rapidly changing food supply is novel and important for evaluating policy effects on intake.



PLoS Med: 29 Apr 2021; 18:e1003574 | PMID: 34032809
Abstract

Sexual and reproductive health information and referrals for resettled refugee women: A survey of resettlement agencies in the United States.

Katcher T, Thimmesch R, Spitz A, Kulkarni L, ... Weiner A, Woodford Martin M
Background
Refugee resettlement offices are the first point of contact for newly arrived refugees and play a significant role in helping refugees acclimate and settle into life in the United States. Available literature suggests that refugee women are vulnerable to poor sexual and reproductive health (SRH) outcomes, including sexually transmitted infections and HIV infection as well as adverse pregnancy outcomes, but little is known about the role that refugee resettlement offices play in supporting refugee women's SRH. This study examines the capacity and interest of resettlement offices in providing SRH information and referrals to newly arrived refugees.
Methods and findings
The research team conducted an online survey of staff members at refugee resettlement offices throughout the US in 2018 to determine (1) available SRH resources and workshops; (2) referrals to and assistance with making SRH and primary care appointments; (3) barriers to addressing SRH needs of clients; and (4) interest in building the capacity of office staff to address SRH issues. The survey was created for this study and had not been previously used or validated. Survey data underwent descriptive analysis. A total of 236 resettlement offices were contacted, with responses from 100 offices, for a response rate of 42%. Fifteen percent (N = 15) of the refugee resettlement agencies (RRAs) that responded to the survey provide materials about SRH to clients, and 49% (N = 49) incorporate sexual health into the classes they provide to newly arrived refugee clients. Moreover, 12% (N = 12) of responding RRAs screen clients for pregnancy intention, and 20% (N = 20) directly refer to contraceptive care and services. This study is limited by the response rate of the survey; no conclusions can be drawn about those offices that did not respond. In addition, the survey instrument was not validated against any other sources of information about the practices of refugee resettlement offices.
Conclusions
In this study, we observed that many resettlement offices do not routinely provide information or referrals for SRH needs. Responding offices cite lack of time and competing priorities as major barriers to providing SRH education and referrals to clients.



Katcher T, Thimmesch R, Spitz A, Kulkarni L, ... Weiner A, Woodford Martin M
PLoS Med: 29 Apr 2021; 18:e1003579 | PMID: 33939705
Abstract

Tranexamic acid and bleeding in patients treated with non-vitamin K oral anticoagulants undergoing dental extraction: The EXTRACT-NOAC randomized clinical trial.

Ockerman A, Miclotte I, Vanhaverbeke M, Vanassche T, ... Politis C, Verhamme P
Background
Oral bleeding after dental extraction in patients on non-vitamin K oral anticoagulants (NOACs) is a frequent problem. We investigated whether 10% tranexamic acid (TXA) mouthwash decreases post-extraction bleeding in patients treated with NOACs.
Methods and findings
The EXTRACT-NOAC study is a randomized, double-blind, placebo-controlled, multicenter, clinical trial. Patients were randomly assigned to 10% TXA or placebo mouthwash and were instructed to use the mouthwash once prior to dental extraction, and thereafter 3 times a day for 3 days. The primary outcome was the number of patients with any post-extraction oral bleeding up to day 7. Secondary outcomes included periprocedural, early, and delayed bleeding, and the safety outcomes included all thrombotic events. The first patient was randomized on February 9, 2018, and the last patient on March 12, 2020. Of 222 randomized patients, 218 patients were included in the full analysis set, of which 106 patients were assigned to TXA (74.8 (±8.8) years; 81 men) and 112 to placebo (72.7 (±10.7) years; 64 men). Post-extraction bleeding occurred in 28 (26.4%) patients in the TXA group and in 32 (28.6%) patients in the placebo group (relative risk, 0.92; 95% confidence interval [CI], 0.60 to 1.42; P = 0.72). There were 46 bleeds in the TXA group and 85 bleeds in the placebo group (rate ratio, 0.57; 95% CI, 0.31 to 1.05; P = 0.07). TXA did not reduce the rate of periprocedural bleeding (bleeding score 4 ± 1.78 versus 4 ± 1.82, P = 0.80) or early bleeding (rate ratio, 0.76; 95% CI, 0.42 to 1.37). Delayed bleeding (rate ratio, 0.32; 95% CI, 0.12 to 0.89) and bleeding after multiple extractions (rate ratio, 0.40; 95% CI, 0.20 to 0.78) were lower in the TXA group. One patient in the placebo group had a transient ischemic attack while interrupting the NOAC therapy in preparation for the dental extraction. Study limitations included the premature interruption of the trial following a futility analysis and the assessment of patients' compliance, which was based on self-reported information during follow-up.
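As an aside on the primary-outcome statistic, the unadjusted relative risk reported above (28/106 on TXA versus 32/112 on placebo) can be reproduced from the raw counts with a standard log-scale Wald interval. This is an illustrative sketch of the textbook formula, not the trial's actual analysis code:

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of two proportions with a log-scale Wald confidence interval."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent binomial samples
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Post-extraction bleeding: 28/106 (TXA) versus 32/112 (placebo)
rr, lo, hi = relative_risk(28, 106, 32, 112)  # ~0.92 (0.60 to 1.42)
```

This matches the reported RR 0.92 (95% CI 0.60 to 1.42) to two decimal places.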
Conclusions
In patients on NOACs undergoing dental extraction, TXA does not seem to reduce the rate of periprocedural or early postoperative oral bleeding compared to placebo. TXA appears to reduce delayed bleeds and postoperative oral bleeding if multiple teeth are extracted.
Trial registration
ClinicalTrials.gov NCT03413891; EudraCT number: 2017-001426-17 (eudract.ema.europa.eu).



PLoS Med: 29 Apr 2021; 18:e1003601 | PMID: 33939696
Abstract

Circulating tumor DNA dynamics and recurrence risk in patients undergoing curative intent resection of colorectal cancer liver metastases: A prospective cohort study.

Tie J, Wang Y, Cohen J, Li L, ... Vogelstein B, Gibbs P
Background
In patients with resectable colorectal liver metastases (CRLM), the role of pre- and postoperative systemic therapy continues to be debated. Previous studies have shown that circulating tumor DNA (ctDNA) analysis, as a marker of minimal residual disease, is a powerful prognostic factor in patients with nonmetastatic colorectal cancer (CRC). Serial analysis of ctDNA in patients with resectable CRLM could inform the optimal use of perioperative chemotherapy. Here, we performed a validation study to confirm the prognostic impact of postoperative ctDNA in resectable CRLM observed in a previous discovery study.
Methods and findings
We prospectively collected plasma samples from patients with resectable CRLM, including presurgical and postsurgical samples, serial samples during any pre- or postoperative chemotherapy, and serial samples in follow-up. Via targeted sequencing of 15 genes commonly mutated in CRC, we identified at least 1 somatic mutation in each patient's tumor. We then designed a personalized assay to assess 1 mutation in plasma samples using the Safe-SeqS assay. A total of 380 plasma samples from 54 patients recruited from July 2011 to December 2014 were included in our analysis. Twenty-three (43%) patients received neoadjuvant chemotherapy, and 42 patients (78%) received adjuvant chemotherapy after surgery. Median follow-up was 51 months (interquartile range, 31 to 60 months). At least 1 somatic mutation was identified in all patients' tumor tissue. ctDNA was detectable in 46/54 (85%) patients prior to any treatment and 12/49 (24%) patients after surgery. There was a median 40.93-fold (19.10 to 87.73, P < 0.001) decrease in ctDNA mutant allele fraction with neoadjuvant chemotherapy, but ctDNA clearance during neoadjuvant chemotherapy was not associated with a better recurrence-free survival (RFS). Patients with detectable postoperative ctDNA experienced a significantly lower RFS (HR 6.3; 95% CI 2.58 to 15.2; P < 0.001) and overall survival (HR 4.2; 95% CI 1.5 to 11.8; P < 0.001) compared to patients with undetectable ctDNA. For the 11 patients with detectable postoperative ctDNA who had serial ctDNA sampling during adjuvant chemotherapy, ctDNA clearance was observed in 3 patients, 2 of whom remained disease-free. All 8 patients with persistently detectable ctDNA after adjuvant chemotherapy have recurred. End-of-treatment (surgery +/- adjuvant chemotherapy) ctDNA detection was associated with a 5-year RFS of 0% compared to 75.6% for patients with an undetectable end-of-treatment ctDNA (HR 14.9; 95% CI 4.94 to 44.7; P < 0.001).
Key limitations of the study include the small sample size and the potential for false-positive findings with multiple hypothesis testing.
Conclusions
We confirmed the prognostic impact of postsurgery and posttreatment ctDNA in patients with resected CRLM. The potential utility of serial ctDNA analysis during adjuvant chemotherapy as an early marker of treatment efficacy was also demonstrated. Further studies are required to define how to optimally integrate ctDNA analyses into decision-making regarding the use and timing of adjuvant therapy for resectable CRLM.
Trial registration
ACTRN12612000345886.



PLoS Med: 29 Apr 2021; 18:e1003620 | PMID: 33939694
Abstract

Financial incentives and deposit contracts to promote HIV retesting in Uganda: A randomized trial.

Chamie G, Kwarisiima D, Ndyabakira A, Marson K, ... Kamya MR, Thirumurthy H
Background
Frequent retesting for HIV among persons at increased risk of HIV infection is critical to early HIV diagnosis and to the delivery of combination HIV prevention services. There are few evidence-based interventions for promoting frequent retesting for HIV. We sought to determine the effectiveness of financial incentives and deposit contracts in promoting quarterly HIV retesting among adults at increased risk of HIV.
Methods and findings
In peri-urban Ugandan communities from October to December 2018, we randomized HIV-negative adults with self-reported risk to 1 of 3 strategies to promote HIV retesting: (1) no incentive; (2) cash incentives (US$7) for retesting at 3 and 6 months (total US$14); or (3) deposit contracts: participants could voluntarily deposit US$6 at baseline and at 3 months that would be returned with interest (total US$7) upon retesting at 3 and 6 months (total US$14) or lost if participants failed to retest. The primary outcome was retesting for HIV at both 3 and 6 months. Of 1,482 persons screened for study eligibility following community-based recruitment, 524 participants were randomized to either no incentive (N = 180), incentives (N = 172), or deposit contracts (N = 172): median age was 25 years (IQR: 22 to 30), 44% were women, and median weekly income was US$13.60 (IQR: US$8.16 to US$21.76). Among participants randomized to deposit contracts, 24/172 (14%) made a baseline deposit, and 2/172 (1%) made a 3-month deposit. In intent-to-treat analyses, HIV retesting at both 3 and 6 months was significantly higher in the incentive arm (89/172 [52%]) than either the control arm (33/180 [18%], odds ratio (OR) 4.8, 95% CI: 3.0 to 7.7, p < 0.001) or the deposit contract arm (28/172 [16%], OR 5.5, 95% CI: 3.3 to 9.1, p < 0.001). Among those in the deposit contract arm who made a baseline deposit, 20/24 (83%) retested at 3 months; 11/24 (46%) retested at both 3 and 6 months. Among 282 participants who retested for HIV during the trial, three (1%; 95% CI: 0.2 to 3%) seroconverted: one in the incentive group and two in the control group. Study limitations include measurement of retesting at the clinic where baseline enrollment occurred, offering only clinic-based (rather than community-based) HIV retesting, and lack of measurement of retesting after completion of the trial to evaluate sustained retesting behavior.
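The unadjusted odds ratio for the incentive versus control comparison can likewise be reproduced from the 2x2 counts (89/172 retested versus 33/180). A minimal sketch of the standard Wald calculation, for illustration only:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald CI.

    a, b = events and non-events in arm 1; c, d = events and non-events in arm 2.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Retested at both 3 and 6 months: incentive arm 89/172, control arm 33/180
or_, lo, hi = odds_ratio(89, 172 - 89, 33, 180 - 33)  # ~4.8 (3.0 to 7.7)
```

The result agrees with the reported OR 4.8 (95% CI 3.0 to 7.7) after rounding.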
Conclusions
Offering financial incentives to high-risk adults in Uganda resulted in significantly higher HIV retesting. Deposit contracts had low uptake and overall did not increase retesting. As part of efforts to increase early diagnosis of HIV among high-risk populations, strategic use of incentives to promote retesting should receive greater consideration by HIV programs.
Trial registration
clinicaltrials.gov: NCT02890459.



PLoS Med: 29 Apr 2021; 18:e1003630 | PMID: 33945526
Abstract

Components of clean delivery kits and newborn mortality in the Zambia Chlorhexidine Application Trial (ZamCAT): An observational study.

Park JH, Hamer DH, Mbewe R, Scott NA, ... Yeboah-Antwi K, Semrau KEA
Background
Neonatal infection, a leading cause of neonatal death in low- and middle-income countries, is often caused by pathogens acquired during childbirth. Clean delivery kits (CDKs) have shown efficacy in reducing infection-related perinatal and neonatal mortality. However, there remain gaps in our current knowledge, including the effect of individual components, the timeline of protection, and the benefit of CDKs in home and facility deliveries.
Methods and findings
A post hoc secondary analysis was performed using nonrandomized data from the Zambia Chlorhexidine Application Trial (ZamCAT), a community-based, cluster-randomized controlled trial of chlorhexidine umbilical cord care in Southern Province of Zambia from February 2011 to January 2013. CDKs, containing soap, gloves, cord clamps, a plastic sheet, a razor blade, matches, and a candle, were provided to all pregnant women. Field monitors made a home-based visit to each participant 4 days postpartum, during which CDK use and newborn outcomes were ascertained. Logistic regression was used to study the association between different CDK components and neonatal mortality rate (NMR). Of 38,579 deliveries recorded during the study, 36,996 newborns were analyzed after excluding stillbirths and those with missing information. Gloves, cord clamps, and plastic sheets were the most frequently used CDK item combination in both home and facility deliveries. Each of the 7 CDK components was associated with lower NMR in users versus nonusers. Adjusted logistic regression showed that use of gloves (odds ratio [OR] 0.33, 95% CI 0.24-0.46), a cord clamp (OR 0.51, 95% CI 0.38-0.68), a plastic sheet (OR 0.46, 95% CI 0.34-0.63), and a razor blade (OR 0.69, 95% CI 0.53-0.89) was associated with lower risk of newborn mortality. Use of gloves and a cord clamp was associated with reduced risk of immediate newborn death (<24 hours). Reduction in risk of early newborn death (1-6 days) was associated with use of gloves, cord clamps, plastic sheets, and razor blades. In examining perinatal mortality (stillbirth plus neonatal death in the first 7 days of life), similar patterns were observed. There was no significant reduction in risk of late newborn mortality (7-28 days) with CDK use. Study limitations included potential recall bias of CDK use and inability to establish causality, as this was a secondary observational study.
Conclusions
CDK use was associated with reductions in early newborn mortality at both home and facility deliveries, especially when certain kit components were used. While causality could not be established in this nonrandomized secondary analysis, given these beneficial associations, scaling up the use of CDKs in rural areas of sub-Saharan Africa may improve neonatal outcomes.
Trial registration
Zambia Chlorhexidine Application Trial (ZamCAT); registry: ClinicalTrials.gov; trial number: NCT01241318.



PLoS Med: 29 Apr 2021; 18:e1003610 | PMID: 33951036
Abstract

Digital adherence technology for tuberculosis treatment supervision: A stepped-wedge cluster-randomized trial in Uganda.

Cattamanchi A, Crowder R, Kityamuwesi A, Kiwanuka N, ... Dowdy D, Katamba A
Background
Adherence to and completion of tuberculosis (TB) treatment remain problematic in many high-burden countries. 99DOTS is a low-cost digital adherence technology that could increase TB treatment completion.
Methods and findings
We conducted a pragmatic stepped-wedge cluster-randomized trial including all adults treated for drug-susceptible pulmonary TB at 18 health facilities across Uganda over 8 months (1 December 2018-31 July 2019). Facilities were randomized to switch from routine (control period) to 99DOTS-based (intervention period) TB treatment supervision in consecutive months. Patients were allocated to the control or intervention period based on which facility they attended and their treatment start date. Health facility staff and patients were not blinded to the intervention. The primary outcome was TB treatment completion. Due to the pragmatic nature of the trial, the primary analysis was done according to intention-to-treat (ITT) and per protocol (PP) principles. This trial is registered with the Pan African Clinical Trials Registry (PACTR201808609844917). Of 1,913 eligible patients at the 18 health facilities (1,022 and 891 during the control and intervention periods, respectively), 38.0% were women, mean (SD) age was 39.4 (14.4) years, 46.8% were HIV-infected, and most (91.4%) had newly diagnosed TB. In total, 463 (52.0%) patients were enrolled on 99DOTS during the intervention period. In the ITT analysis, the odds of treatment success were similar in the intervention and control periods (adjusted odds ratio [aOR] 1.04, 95% CI 0.68-1.58, p = 0.87). The odds of treatment success did not increase in the intervention period for either men (aOR 1.24, 95% CI 0.73-2.10) or women (aOR 0.67, 95% CI 0.35-1.29), or for either patients with HIV infection (aOR 1.51, 95% CI 0.81-2.85) or without HIV infection (aOR 0.78, 95% CI 0.46-1.32). In the PP analysis, the 99DOTS-based intervention increased the odds of treatment success (aOR 2.89, 95% CI 1.57-5.33, p = 0.001). The odds of completing the intensive phase of treatment and the odds of not being lost to follow-up were similarly improved in PP but not ITT analyses. 
Study limitations include the likelihood of selection bias in the PP analysis, inability to verify medication dosing in either arm, and incomplete implementation of some components of the intervention.
Conclusions
99DOTS-based treatment supervision did not improve treatment outcomes in the overall study population. However, similar treatment outcomes were achieved during the control and intervention periods, and those patients enrolled on 99DOTS achieved high treatment completion. 99DOTS-based treatment supervision could be a viable alternative to directly observed therapy for a substantial proportion of patients with TB.
Trial registration
Pan-African Clinical Trials Registry (PACTR201808609844917).



PLoS Med: 29 Apr 2021; 18:e1003628 | PMID: 33956802
Abstract

A framework for prospective, adaptive meta-analysis (FAME) of aggregate data from randomised trials.

Tierney JF, Fisher DJ, Vale CL, Burdett S, ... White IR, Parmar MKB
Background
The vast majority of systematic reviews are planned retrospectively, once most eligible trials have completed and reported, and are based on aggregate data that can be extracted from publications. Prior knowledge of trial results can introduce bias into both review and meta-analysis methods, and the omission of unpublished data can lead to reporting biases. We present a collaborative framework for prospective, adaptive meta-analysis (FAME) of aggregate data to provide results that are less prone to bias. Also, with FAME, we monitor how evidence from trials is accumulating, to anticipate the earliest opportunity for a potentially definitive meta-analysis.
Methodology
We developed and piloted FAME alongside 4 systematic reviews in prostate cancer, which allowed us to refine the key principles. These are to: (1) start the systematic review process early, while trials are ongoing or yet to report; (2) liaise with trial investigators to develop a detailed picture of all eligible trials; (3) prospectively assess the earliest possible timing for reliable meta-analysis based on the accumulating aggregate data; (4) develop and register (or publish) the systematic review protocol before trials produce results and seek appropriate aggregate data; (5) interpret meta-analysis results taking account of both available and unavailable data; and (6) assess the value of updating the systematic review and meta-analysis. These principles are illustrated via a hypothetical review and their application to 3 published systematic reviews.
Conclusions
FAME can reduce the potential for bias, and produce more timely, thorough and reliable systematic reviews of aggregate data.



PLoS Med: 29 Apr 2021; 18:e1003629 | PMID: 33956789
Abstract

Parenting interventions to promote early child development in the first three years of life: A global systematic review and meta-analysis.

Jeong J, Franchett EE, Ramos de Oliveira CV, Rehmani K, Yousafzai AK
Background
Parents are the primary caregivers of young children. Responsive parent-child relationships and parental support for learning during the earliest years of life are crucial for promoting early child development (ECD). We conducted a global systematic review and meta-analysis to evaluate the effectiveness of parenting interventions on ECD and parenting outcomes.
Methods and findings
We searched MEDLINE, Embase, PsycINFO, CINAHL, Web of Science, and Global Health Library for peer-reviewed, published articles from database inception until November 15, 2020. We included randomized controlled trials (RCTs) of parenting interventions delivered during the first 3 years of life that evaluated at least 1 ECD outcome. At least 2 reviewers independently screened eligible studies, extracted data, and assessed study quality. ECD outcomes included cognitive, language, motor, and socioemotional development, behavior problems, and attachment. Parenting outcomes included parenting knowledge, parenting practices, parent-child interactions, and parental depressive symptoms. We calculated intervention effect sizes as the standardized mean difference (SMD) and estimated pooled effect sizes for each outcome separately using robust variance estimation meta-analytic approaches. We used random-effects meta-regression models to assess potential effect modification by country-income level, child age, intervention content, duration, delivery, setting, and study quality. This review was registered with PROSPERO (CRD42018092458 and CRD42018092461). Of the 11,920 articles identified, we included 111 articles representing 102 unique RCTs. Pooled effect sizes indicated positive benefits of parenting interventions on child cognitive development (SMD = 0.32, 95% CI [confidence interval]: 0.23 to 0.40, P < 0.001), language development (SMD = 0.28, 95% CI: 0.18 to 0.37, P < 0.001), motor development (SMD = 0.24, 95% CI: 0.15 to 0.32, P < 0.001), socioemotional development (SMD = 0.19, 95% CI: 0.10 to 0.28, P < 0.001), and attachment (SMD = 0.29, 95% CI: 0.18 to 0.40, P < 0.001) and reductions in behavior problems (SMD = -0.13, 95% CI: -0.18 to -0.08, P < 0.001).
Positive benefits were also found on parenting knowledge (SMD = 0.56, 95% CI: 0.33 to 0.79, P < 0.001), parenting practices (SMD = 0.33, 95% CI: 0.22 to 0.44, P < 0.001), and parent-child interactions (SMD = 0.39, 95% CI: 0.24 to 0.53, P < 0.001). However, there was no significant reduction in parental depressive symptoms (SMD = -0.07, 95% CI: -0.16 to 0.02, P = 0.08). Subgroup analyses revealed significantly greater effects on child cognitive, language, and motor development, and parenting practices in low- and middle-income countries compared to high-income countries; and significantly greater effects on child cognitive development, parenting knowledge, parenting practices, and parent-child interactions for programs that focused on responsive caregiving compared to those that did not. On the other hand, there was no clear evidence of effect modification by child age, intervention duration, delivery, setting, or study risk of bias. Study limitations include considerable unexplained heterogeneity, inadequate reporting of intervention content and implementation, and varying quality of evidence in terms of the conduct of trials and robustness of outcome measures used across studies.
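The review pooled SMDs using robust variance estimation, which handles dependent effect sizes within studies. As a simplified illustration of random-effects pooling in general, here is a DerSimonian-Laird sketch; the effect sizes and variances are hypothetical, not taken from the review:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes with DerSimonian-Laird random-effects weights."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight each study by 1 / (within-study variance + tau^2)
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# Hypothetical SMDs and sampling variances from 3 studies
pooled, se, tau2 = dersimonian_laird([0.30, 0.20, 0.50], [0.010, 0.020, 0.015])
```

When between-study heterogeneity is present, tau^2 is positive and the random-effects weights are flatter than the fixed-effect weights, giving smaller studies relatively more influence.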
Conclusions
Parenting interventions for children during the first 3 years of life are effective for improving ECD outcomes and enhancing parenting outcomes across low-, middle-, and high-income countries. Increasing implementation of effective and high-quality parenting interventions is needed globally and at scale in order to support parents and enable young children to achieve their full developmental potential.



PLoS Med: 29 Apr 2021; 18:e1003602 | PMID: 33970913
Abstract

Risk of miscarriage in women with chronic diseases in Norway: A registry linkage study.

Magnus MC, Morken NH, Wensaas KA, Wilcox AJ, Håberg SE
Background
Increased risk of miscarriage has been reported for women with specific chronic health conditions. A broader investigation of chronic diseases and miscarriage risk may uncover patterns across categories of illness. The objective of this study was to examine the risk of miscarriage according to various preexisting chronic diseases.
Methods and findings
We conducted a registry-based study. Registered pregnancies (n = 593,009) in Norway between 2010 and 2016 were identified through 3 national health registries (birth register, general practitioner data, and patient registries). Six broad categories of illness were identified, comprising 25 chronic diseases defined by diagnostic codes used in general practitioner and patient registries. We required that the diseases were diagnosed before the pregnancy of interest. Miscarriage risk according to underlying chronic diseases was estimated as odds ratios (ORs) using generalized estimating equations adjusting for the woman's age. The mean age of women at the start of pregnancy was 29.7 years (SD 5.6 years). We observed an increased risk of miscarriage among women with cardiometabolic diseases (OR 1.25, 95% CI 1.20 to 1.31; p-value <0.001). Within this category, risks were elevated for all conditions: atherosclerosis (2.22; 1.42 to 3.49; p-value <0.001), hypertensive disorders (1.19; 1.13 to 1.26; p-value <0.001), and type 2 diabetes (1.38; 1.26 to 1.51; p-value <0.001). Among other categories of disease, risks were elevated for hypoparathyroidism (2.58; 1.35 to 4.92; p-value 0.004), Cushing syndrome (1.97; 1.06 to 3.65; p-value 0.03), Crohn's disease (OR 1.31; 95% CI: 1.18 to 1.45; p-value 0.001), and endometriosis (1.22; 1.15 to 1.29; p-value <0.001). Findings were largely unchanged after mutual adjustment. Limitations of this study include our inability to adjust for measures of socioeconomic position or lifestyle characteristics, as well as the rarity of some of the conditions, which provided limited statistical power.
Conclusions
In this registry study, we found that, although risk of miscarriage was largely unaffected by maternal chronic diseases, risk of miscarriage was associated with conditions related to cardiometabolic health. This finding is consistent with emerging evidence linking cardiovascular risk factors to pregnancy complications.



PLoS Med: 29 Apr 2021; 18:e1003603 | PMID: 33970911
Abstract

Associations of treated and untreated human papillomavirus infection with preterm delivery and neonatal mortality: A Swedish population-based study.

Wiik J, Nilsson S, Kärrberg C, Strander B, Jacobsson B, Sengpiel V
Background
Treatment of cervical intraepithelial neoplasia (CIN) is associated with an increased risk of preterm delivery (PTD) although the exact pathomechanism is not yet understood. Women with untreated CIN also seem to have an increased risk of PTD. It is unclear whether this is attributable to human papillomavirus (HPV) infection or other factors. We aimed to investigate whether HPV infection shortly before or during pregnancy, as well as previous treatment for CIN, is associated with an increased risk of PTD and other adverse obstetric and neonatal outcomes.
Methods and findings
This was a retrospective population-based register study of women with singleton deliveries registered in the Swedish Medical Birth Register 1999-2016 (n = 1,044,023). The study population had a mean age of 30.2 years (SD 5.2) and a mean body mass index of 25.4 kg/m2 (SD 3.0), and 44% of the women were nulliparous before delivery. Study groups were defined based on cervical HPV tests, cytology, and histology, as registered in the Swedish National Cervical Screening Registry. Women with a history of exclusively normal cytology (n = 338,109) were compared to women with positive HPV tests (n = 2,550) or abnormal cytology (n = 11,727) within 6 months prior to conception or during the pregnancy, women treated for CIN3 before delivery (n = 23,185), and women with CIN2+ diagnosed after delivery (n = 33,760). Study groups were compared concerning obstetric and neonatal outcomes by logistic regression, and comparisons were adjusted for socioeconomic and health-related confounders. HPV infection was associated with PTD (adjusted odds ratio [aOR] 1.19, 95% CI 1.01-1.42, p = 0.042), preterm prelabor rupture of membranes (pPROM) (aOR 1.52, 95% CI 1.18-1.96, p < 0.001), prelabor rupture of membranes (PROM) (aOR 1.24, 95% CI 1.08-1.42, p = 0.002), and neonatal mortality (aOR 2.69, 95% CI 1.25-5.78, p = 0.011). Treatment for CIN was associated with PTD (aOR 1.85, 95% CI 1.76-1.95, p < 0.001), spontaneous PTD (aOR 2.06, 95% CI 1.95-2.17, p < 0.001), pPROM (aOR 2.36, 95% CI 2.19-2.54, p < 0.001), PROM (aOR 1.11, 95% CI 1.05-1.17, p < 0.001), intrauterine fetal death (aOR 1.35, 95% CI 1.05-1.72, p = 0.019), chorioamnionitis (aOR 2.75, 95% CI 2.33-3.23, p < 0.001), intrapartum fever (aOR 1.24, 95% CI 1.07-1.44, p = 0.003), neonatal sepsis (aOR 1.55, 95% CI 1.37-1.75, p < 0.001), and neonatal mortality (aOR 1.79, 95% CI 1.30-2.45, p < 0.001). Women with CIN2+ diagnosed within 3 years after delivery had increased PTD risk (aOR 1.18, 95% CI 1.10-1.27, p < 0.001). 
Limitations of the study include the retrospective design and the fact that because HPV test results only became available in 2007, abnormal cytology was used as a proxy for HPV infection.
Conclusions
In this study, we found that HPV infection shortly before or during pregnancy was associated with PTD, pPROM, PROM, and neonatal mortality. Previous treatment for CIN was associated with even greater risks for PTD and pPROM and was also associated with PROM, neonatal mortality, and maternal and neonatal infectious complications.



PLoS Med: 29 Apr 2021; 18:e1003641 | PMID: 33970907
Abstract

Cost-effectiveness evidence of mental health prevention and promotion interventions: A systematic review of economic evaluations.

Le LK, Esturas AC, Mihalopoulos C, Chiotelis O, ... Chatterton ML, Engel L
Background
The prevention of mental disorders and promotion of mental health and well-being are growing fields. Whether mental health promotion and prevention interventions provide value for money in children, adolescents, adults, and older adults is unclear. The aim of the current study is to update 2 existing reviews of cost-effectiveness studies in this field in order to determine whether such interventions are cost-effective.
Methods and findings
Electronic databases (including MEDLINE, PsycINFO, CINAHL, and EconLit through EBSCO and Embase) were searched for published cost-effectiveness studies of prevention of mental disorders and promotion of mental health and well-being from 2008 to 2020. The quality of studies was assessed using the Quality of Health Economic Studies Instrument (QHES). The protocol was registered with PROSPERO (# CRD42019127778). The primary outcomes were incremental cost-effectiveness ratio (ICER) or return on investment (ROI) ratio across all studies. A total of 65 studies met the inclusion criteria of a full economic evaluation, of which 23 targeted children and adolescents, 35 targeted adults, and the remaining 7 targeted older adults. A large number of studies focused on prevention of depression and/or anxiety disorders, followed by promotion of mental health and well-being and prevention of other mental disorders. Although there was high heterogeneity in terms of the design among included economic evaluations, most studies consistently found that interventions for mental health prevention and promotion were cost-effective or cost saving. The review found that targeted prevention was likely to be cost-effective compared to universal prevention. Screening plus psychological interventions (e.g., cognitive behavioural therapy [CBT]) at school were the most cost-effective interventions for prevention of mental disorders in children and adolescents, while parenting interventions and workplace interventions had good evidence in mental health promotion. There is inconclusive evidence for preventive interventions for mental disorders or mental health promotion in older adults. While studies were of generally high quality, there was limited evidence available from low- and middle-income countries. The review was limited to studies where mental health was the primary outcome and may have missed general health promoting strategies that could also prevent mental disorder or promote mental health.
Some ROI studies might not be included given that these studies are commonly published in grey literature rather than in the academic literature.
Conclusions
Our review found a significant growth of economic evaluations in prevention of mental disorders or promotion of mental health and well-being over the last 10 years. Although several interventions for mental health prevention and promotion provide good value for money, the varied quality and methodologies of the economic evaluations limit the generalisability of conclusions about cost-effectiveness. However, the finding that the majority of studies, especially in children, adolescents, and adults, demonstrated good value for money is promising. Research on cost-effectiveness in low- and middle-income settings is required.
Trial registration
PROSPERO registration number: CRD42019127778.



PLoS Med: 29 Apr 2021; 18:e1003606 | PMID: 33974641
Abstract

Effect of community-led delivery of HIV self-testing on HIV testing and antiretroviral therapy initiation in Malawi: A cluster-randomised trial.

Indravudh PP, Fielding K, Kumwenda MK, Nzawa R, ... Terris-Prestholt F, Corbett EL
Background
Undiagnosed HIV infection remains substantial in key population subgroups including adolescents, older adults, and men, driving ongoing transmission in sub-Saharan Africa. We evaluated the impact, safety, and costs of community-led delivery of HIV self-testing (HIVST), aiming to increase HIV testing in underserved subgroups and stimulate demand for antiretroviral therapy (ART).
Methods and findings
This cluster-randomised trial, conducted between October 2018 and July 2019, used restricted randomisation (1:1) to allocate 30 group village head clusters in Mangochi district, Malawi, to the community-led HIVST intervention in addition to the standard of care (SOC) or the SOC alone. The intervention involved mobilising community health groups to lead the design and implementation of 7-day HIVST campaigns, with cluster residents (≥15 years) eligible for HIVST. The primary outcome compared lifetime HIV testing among adolescents (15 to 19 years) between arms. Secondary outcomes compared: recent HIV testing (in the last 3 months) among older adults (≥40 years) and men; cumulative 6-month incidence of ART initiation per 100,000 population; knowledge of the preventive benefits of HIV treatment; and HIV testing stigma. Outcomes were measured through a post-intervention survey and at neighboring health facilities. Analysis used intention-to-treat for cluster-level outcomes. Community health groups delivered 24,316 oral fluid-based HIVST kits. The survey included 90.2% (3,960/4,388) of listed participants in the 15 community-led HIVST clusters and 89.2% (3,920/4,394) of listed participants in the 15 SOC clusters. Overall, the proportion of men was 39.0% (3,072/7,880). Most participants had primary-level education or below, were married, and reported a sexual partner. Lifetime HIV testing among adolescents was higher in the community-led HIVST arm (84.6%, 770/910) than the SOC arm (67.1%, 582/867; adjusted risk difference [RD] 15.2%, 95% CI 7.5% to 22.9%; p < 0.001), especially among 15- to 17-year-olds and boys. Recent testing among older adults was also higher in the community-led HIVST arm (74.5%, 869/1,166) than the SOC arm (31.5%, 350/1,111; adjusted RD 42.1%, 95% CI 34.9% to 49.4%; p < 0.001). 
Similarly, the proportions of recently tested men were 74.6% (1,177/1,577) and 33.9% (507/1,495) in the community-led HIVST and SOC arms, respectively (adjusted RD 40.2%, 95% CI 32.9% to 47.4%; p < 0.001). Knowledge of HIV treatment benefits and HIV testing stigma showed no differences between arms. Cumulative incidence of ART initiation was 305.3 and 226.1 per 100,000 population in the community-led HIVST and SOC arms, respectively (RD 72.3, 95% CI -36.2 to 180.8; p = 0.18). In post hoc analysis, ART initiations in the 3-month post-intervention period were higher in the community-led HIVST arm than in the SOC arm (RD 97.7, 95% CI 33.4 to 162.1; p = 0.004). HIVST uptake was 74.7% (2,956/3,960), with few adverse events (0.6%, 18/2,955), at a cost of US$5.70 per HIVST kit distributed. The main limitations include the use of self-reported HIV testing outcomes and the lack of a baseline measurement for the primary outcome.
Conclusions
In this study, we found that community-led HIVST was effective, safe, and affordable, with population impact and coverage rapidly realised at low cost. This approach could enable community HIV testing in high HIV prevalence settings and demonstrates potential for economies of scale and scope.
Trial registration
Clinicaltrials.gov NCT03541382.



PLoS Med: 29 Apr 2021; 18:e1003608 | PMID: 33974621
Abstract

Adverse childhood experiences, adult depression, and suicidal ideation in rural Uganda: A cross-sectional, population-based study.

Satinsky EN, Kakuhikire B, Baguma C, Rasmussen JD, ... Bangsberg DR, Tsai AC
Background
Depression is recognized globally as a leading cause of disability. Early-life adverse childhood experiences (ACEs) have been shown to have robust associations with poor mental health during adulthood. These effects may be cumulative, whereby a greater number of ACEs are progressively associated with worse outcomes. This study aimed to estimate the associations between ACEs and adult depression and suicidal ideation in a cross-sectional, population-based study of adults in Uganda.
Methods and findings
Between 2016 and 2018, research assistants visited the homes of 1,626 adult residents of Nyakabare Parish, a rural area in southwestern Uganda. ACEs were assessed using a modified version of the Adverse Childhood Experiences-International Questionnaire, and depression symptom severity and suicidal ideation were assessed using the Hopkins Symptom Checklist for Depression (HSCL-D). We applied a validated algorithm to determine major depressive disorder diagnoses. Overall, 1,458 participants (90%) had experienced at least one ACE, 159 participants (10%) met criteria for major depressive disorder, and 28 participants (1.7%) reported suicidal ideation. We fitted regression models to estimate the associations between cumulative number of ACEs and depression symptom severity (linear regression model) and major depressive disorder and suicidal ideation (Poisson regression models). In multivariable regression models adjusted for age, sex, primary school completion, marital status, self-reported HIV status, and household asset wealth, the cumulative number of ACEs was associated with greater depression symptom severity (b = 0.050; 95% confidence interval [CI], 0.039-0.061, p < 0.001) and increased risk for major depressive disorder (adjusted relative risk [ARR] = 1.190; 95% CI, 1.109-1.276; p < 0.001) and suicidal ideation (ARR = 1.146; 95% CI, 1.001-1.311; p = 0.048). We assessed the robustness of our findings by probing for nonlinearities and conducting analyses stratified by age. The limitations of the study include the reliance on retrospective self-report as well as the focus on ACEs that occurred within the household.
Conclusions
In this whole-population, cross-sectional study of adults in rural Uganda, the cumulative number of ACEs had statistically significant associations with depression symptom severity, major depressive disorder, and suicidal ideation. These findings highlight the importance of developing and implementing policies and programs that safeguard children, promote mental health, and prevent trajectories toward psychosocial disability.



PLoS Med: 29 Apr 2021; 18:e1003642 | PMID: 33979329
Abstract

Outcomes and costs of publicly funded patient navigation interventions to enhance HIV care continuum outcomes in the United States: A before-and-after study.

Shade SB, Kirby VB, Stephens S, Moran L, ... Steward WT, Myers JJ
Background
In the United States, patients with HIV face significant barriers to linkage to and retention in care, which impede progress toward the desired clinical outcome of viral suppression. Individual-level interventions, such as patient navigation, are evidence-based, effective strategies for improving care engagement. In addition, use of surveillance and clinical data to identify patients who are not fully engaged in care may improve the effectiveness and cost-effectiveness of these programs.
Methods and findings
We employed a pre-post design to estimate the outcomes and costs, from the program perspective, of 5 state-level demonstration programs funded under the Health Resources and Services Administration's Special Projects of National Significance Program (HRSA/SPNS) Systems Linkages Initiative. These programs used existing surveillance and/or clinical data to identify individuals who had never entered HIV care, had fallen out of care, or were at risk of falling out of care, and used navigation strategies to engage patients in HIV care. Outcomes and costs were measured relative to standard of care during the first year of implementation of the interventions (2013 to 2014). We followed patients to estimate the number and proportion of additional patients linked, reengaged, retained, and virally suppressed by 12 months after enrollment in the interventions. We employed inverse probability weighting to adjust for differences in patient characteristics across programs, missing data, and loss to follow-up. We estimated the additional costs expended during the first year of each intervention and the cost per outcome of each intervention, defined as the additional cost per additional HIV care continuum target achieved (cost per patient linked, reengaged, retained, and virally suppressed) 12 months after enrollment in each intervention. In this study, 3,443 patients were enrolled in Louisiana (LA), Massachusetts (MA), North Carolina (NC), Virginia (VA), and Wisconsin (WI) (147, 151, 2,491, 321, and 333, respectively). Patients were a mean of 40 years old, 75% male, and predominantly African American (69%) or Caucasian (22%). At baseline, 24% were newly diagnosed, 2% had never been in HIV care, 45% had fallen out of care, and 29% were at risk of falling out of care. 
All 5 interventions were associated with increases in the number and proportion of patients with viral suppression [percent increase: LA = 90.9%, 95% confidence interval (CI) = 88.4 to 93.4; MA = 78.1%, 95% CI = 72.4 to 83.8; NC = 47.5%, 95% CI = 45.2 to 49.8; VA = 54.6%, 95% CI = 49.4 to 59.9; WI = 58.4%, 95% CI = 53.4 to 63.4]. Overall, the interventions cost an additional $4,415 (range = $3,746 to $5,619), $2,009 (range = $1,516 to $2,274), $920 (range = $627 to $941), $2,212 (range = $1,789 to $2,683), and $3,700 (range = $2,734 to $4,101), respectively, per additional patient virally suppressed. The results of this study are limited in that we did not have contemporaneous controls for each intervention; thus, we were only able to assess patients against themselves at baseline and not against standard of care during the same time period.
Conclusions
Patient navigation programs were associated with improvements in engagement of patients in HIV care and viral suppression. Cost per outcome was minimized in states that utilized surveillance data to identify individuals who were out of care and/or those that were able to identify a larger number of patients in need of improvement at baseline. These results have the potential to inform the targeting and design of future navigation-type interventions.



PLoS Med: 29 Apr 2021; 18:e1003418 | PMID: 33983925
Abstract

Assessment of the causal relevance of ECG parameters for risk of atrial fibrillation: A mendelian randomisation study.

Gajendragadkar PR, Von Ende A, Ibrahim M, Valdes-Marquez E, ... Casadei B, Hopewell JC
Background
Atrial electrical and structural remodelling in older individuals with cardiovascular risk factors has been associated with changes in surface electrocardiographic (ECG) parameters (e.g., prolongation of the PR interval) and higher risks of atrial fibrillation (AF). However, it has been difficult to establish whether altered ECG parameters are the cause or a consequence of the myocardial substrate leading to AF. This study aimed to examine the potential causal relevance of ECG parameters on risk of AF using mendelian randomisation (MR).
Methods and findings
Weighted genetic scores explaining lifelong differences in P-wave duration, PR interval, and QT interval were constructed, and associations between these ECG scores and risk of AF were estimated among 278,792 UK Biobank participants (mean age: 57 years at recruitment; 19,132 AF cases). The independent genetic variants contributing to each of the separate ECG scores, and their corresponding weights, were based on published genome-wide association studies. In UK Biobank, genetic scores representing a 5 ms longer P-wave duration or PR interval were significantly associated with lower risks of AF (odds ratio [OR] 0.91; 95% confidence interval [CI]: 0.87-0.96, P = 2 × 10⁻⁴ and OR 0.94; 95% CI: 0.93-0.96, P = 2 × 10⁻¹⁹, respectively), while longer QT interval was not significantly associated with AF. These effects were independently replicated among a further 17,931 AF cases from the AFGen Consortium. Investigation of potential mechanistic pathways showed that differences in ECG parameters associated with specific ion channel genes had effects on risk of AF consistent with the overall scores, while the overall scores were not associated with changes in left atrial size. Limitations of the study included the inherent assumptions of MR, restriction to individuals of European ancestry, and possible restriction of results to the normal ECG ranges represented in UK Biobank.
Conclusions
In UK Biobank, we observed evidence suggesting a causal relationship between lifelong differences in ECG parameters (particularly PR interval) that reflect longer atrial conduction times and a lower risk of AF. These findings, which appear to be independent of atrial size and concomitant cardiovascular comorbidity, support the relevance of varying mechanisms underpinning AF and indicate that more individualised treatment strategies warrant consideration.



PLoS Med: 29 Apr 2021; 18:e1003572 | PMID: 33983917
Abstract

Bile acid synthesis, modulation, and dementia: A metabolomic, transcriptomic, and pharmacoepidemiologic study.

Varma VR, Wang Y, An Y, Varma S, ... Gadalla SM, Thambisetty M
Background
While Alzheimer disease (AD) and vascular dementia (VaD) may be accelerated by hypercholesterolemia, the mechanisms underlying this association are unclear. We tested whether dysregulation of cholesterol catabolism, through its conversion to primary bile acids (BAs), was associated with dementia pathogenesis.
Methods and findings
We used a 3-step study design to examine the role of the primary BAs cholic acid (CA) and chenodeoxycholic acid (CDCA), as well as their principal biosynthetic precursor, 7α-hydroxycholesterol (7α-OHC), in dementia. In Step 1, we tested whether serum markers of cholesterol catabolism were associated with brain amyloid accumulation, white matter lesions (WMLs), and brain atrophy. In Step 2, we tested whether exposure to bile acid sequestrants (BAS) was associated with risk of dementia. In Step 3, we examined plausible mechanisms underlying these findings by testing whether brain levels of primary BAs and gene expression of their principal receptors are altered in AD. Step 1: We assayed serum concentrations of CA, CDCA, and 7α-OHC and used linear regression and mixed effects models to test their associations with brain amyloid accumulation (N = 141), WMLs, and brain atrophy (N = 134) in the Baltimore Longitudinal Study of Aging (BLSA). The BLSA is an ongoing, community-based cohort study that began in 1958. Participants in the BLSA neuroimaging sample were approximately 46% male with a mean age of 76 years; longitudinal analyses included an average of 2.5 follow-up magnetic resonance imaging (MRI) visits. We used the Alzheimer's Disease Neuroimaging Initiative (ADNI) (N = 1,666) to validate longitudinal neuroimaging results in BLSA. ADNI is an ongoing, community-based cohort study that began in 2003. Participants were approximately 55% male with a mean age of 74 years; longitudinal analyses included an average of 5.2 follow-up MRI visits. Lower serum concentrations of 7α-OHC, CA, and CDCA were associated with higher brain amyloid deposition (p = 0.041), faster WML accumulation (p = 0.050), and faster brain atrophy (false discovery rate [FDR] p < 0.001 to 0.013), mainly in males, in BLSA. 
In ADNI, we found a modest sex-specific effect indicating that lower serum concentrations of CA and CDCA were associated with faster brain atrophy (FDR p = 0.049) in males. Step 2: In the Clinical Practice Research Datalink (CPRD) dataset, covering >4 million registrants from general practice clinics in the United Kingdom, we tested whether patients using BAS (BAS users; 3,208 with ≥2 prescriptions), which reduce circulating BAs and increase cholesterol catabolism, had altered dementia risk compared to those on non-statin lipid-modifying therapies (LMT users; 23,483 with ≥2 prescriptions). Patients in the study (BAS/LMT) were approximately 34%/38% male, with a mean age of 65/68 years; follow-up time was 4.7/5.7 years. We found that BAS use was not significantly associated with risk of all-cause dementia (hazard ratio (HR) = 1.03, 95% confidence interval (CI) = 0.72-1.46, p = 0.88) or its subtypes. We found a significant difference between the risk of VaD in males compared to females (p = 0.040) and a significant dose-response relationship between BAS use and risk of VaD (p-trend = 0.045) in males. Step 3: We assayed brain tissue concentrations of CA and CDCA comparing AD and control (CON) samples in the BLSA autopsy cohort (N = 29). Participants in the BLSA autopsy cohort (AD/CON) were approximately 50%/77% male with a mean age of 87/82 years. We analyzed single-cell RNA sequencing (scRNA-Seq) data to compare brain BA receptor gene expression between AD and CON samples from the Religious Orders Study and Memory and Aging Project (ROSMAP) cohort (N = 46). ROSMAP is an ongoing, community-based cohort study that began in 1994. Participants (AD/CON) were approximately 56%/36% male with a mean age of 85/85 years. In BLSA, we found that CA and CDCA were detectable in postmortem brain tissue samples and were marginally higher in AD samples compared to CON. In ROSMAP, we found sex-specific differences in altered neuronal gene expression of BA receptors in AD. 
Study limitations include the small sample sizes in the BLSA cohort and likely inaccuracies in the clinical diagnosis of dementia subtypes in primary care settings.
Conclusions
We combined targeted metabolomics in serum and amyloid positron emission tomography (PET) and MRI of the brain with pharmacoepidemiologic analysis to implicate dysregulation of cholesterol catabolism in dementia pathogenesis. We observed that lower serum BA concentration mainly in males is associated with neuroimaging markers of dementia, and pharmacological lowering of BA levels may be associated with higher risk of VaD in males. We hypothesize that dysregulation of BA signaling pathways in the brain may represent a plausible biologic mechanism underlying these results. Together, our observations suggest a novel mechanism relating abnormalities in cholesterol catabolism to risk of dementia.



PLoS Med: 29 Apr 2021; 18:e1003615 | PMID: 34043628
Abstract

Effects of psychosocial support interventions on survival in inpatient and outpatient healthcare settings: A meta-analysis of 106 randomized controlled trials.

Smith TB, Workman C, Andrews C, Barton B, ... Petersen D, Holt-Lunstad J
Background
Hospitals, clinics, and health organizations have provided psychosocial support interventions for medical patients to supplement curative care. Prior reviews of interventions augmenting psychosocial support in medical settings have reported mixed outcomes. This meta-analysis addresses the questions of how effective psychosocial support interventions are in improving patient survival and which potential moderating features are associated with greater effectiveness.
Methods and findings
We evaluated randomized controlled trials (RCTs) of psychosocial support interventions in inpatient and outpatient healthcare settings reporting survival data, including studies reporting disease-related or all-cause mortality. Literature searches included studies reported from January 1980 through October 2020, accessed from Embase, Medline, Cochrane Library, CINAHL, Alt HealthWatch, PsycINFO, Social Work Abstracts, and Google Scholar databases. At least 2 independent reviewers screened studies, extracted data, and assessed study quality. Odds ratio (OR) and hazard ratio (HR) data were analyzed separately using random effects weighted models. Of 42,054 studies searched, 106 RCTs including 40,280 patients met inclusion criteria. The average patient age was 57.2 years, with 52% females and 48% males; 42% had cardiovascular disease (CVD), 36% had cancer, and 22% had other conditions. Across 87 RCTs reporting data for discrete time periods, the average effect was OR = 1.20 (95% CI = 1.09 to 1.31, p < 0.001), indicating a 20% increased likelihood of survival among patients receiving psychosocial support compared to control groups receiving standard medical care. Among those studies, psychosocial interventions explicitly promoting health behaviors yielded improved likelihood of survival, whereas interventions without that primary focus did not. Across 22 RCTs reporting survival time, the average effect was HR = 1.29 (95% CI = 1.12 to 1.49, p < 0.001), indicating a 29% increased probability of survival over time among intervention recipients compared to controls. Among those studies, meta-regressions identified 3 moderating variables: control group type, patient disease severity, and risk of research bias. Studies in which control groups received health information/classes in addition to treatment as usual (TAU) averaged weaker effects than those in which control groups received only TAU. 
Studies with patients having relatively greater disease severity tended to yield smaller gains in survival time relative to control groups. In one of 3 analyses, studies with higher risk of research bias tended to report better outcomes. The main limitation of the data is that interventions very rarely blinded personnel and participants to study arm, such that expectations for improvement were not controlled.
Conclusions
In this meta-analysis, OR data indicated that psychosocial behavioral support interventions promoting patient motivation/coping to engage in health behaviors improved patient survival, but interventions focusing primarily on patients' social or emotional outcomes did not prolong life. HR data indicated that psychosocial interventions, predominantly focused on social or emotional outcomes, improved survival but yielded similar effects to health information/classes and were less effective among patients with apparently greater disease severity. Risk of research bias remains a plausible threat to data interpretation.



PLoS Med: 29 Apr 2021; 18:e1003595 | PMID: 34003832
Abstract

Adherence at 2 years with distribution of essential medicines at no charge: The CLEAN Meds randomized clinical trial.

Persaud N, Bedard M, Boozary A, Glazier RH, ... Laupacis A, Carefully seLected and Easily Accessible at No Charge Medications (CLEAN Meds) study team
Background
Adherence to medicines is low for a variety of reasons, including the cost borne by patients. Some jurisdictions publicly fund medicines for the general population, but many jurisdictions do not, and such policies are contentious. To our knowledge, no trials studying free access to a wide range of medicines have been conducted.
Methods and findings
Between June 1, 2016 and April 28, 2017, we randomly assigned 786 primary care patients who reported not taking medicines due to cost to either free distribution of essential medicines (n = 395) or usual medicine access (n = 391). The trial was conducted in Ontario, Canada, where hospital care and physician services are publicly funded for the general population but medicines are not. The trial population was mostly female (56%), younger than 65 years (83%), white (66%), and had a low income from wages as the primary source (56%). The primary outcome was medicine adherence after 2 years. Secondary outcomes included control of diabetes, blood pressure, and low-density lipoprotein (LDL) cholesterol in patients taking relevant treatments, as well as healthcare costs over 2 years. Adherence to all appropriate prescribed medicines was 38.7% in the free distribution group and 28.6% in the usual access group after 2 years (absolute difference 10.1%; 95% confidence interval (CI) 3.3 to 16.9, p = 0.004). There were no statistically significant differences in control of diabetes (hemoglobin A1c 0.27; 95% CI -0.25 to 0.79, p = 0.302), systolic blood pressure (-3.9; 95% CI -9.9 to 2.2, p = 0.210), or LDL cholesterol (0.26; 95% CI -0.08 to 0.60, p = 0.130) based on available data. Total healthcare costs over 2 years were lower with free distribution (difference in median CAN$1,117; 95% CI CAN$445 to CAN$1,778, p = 0.006). In the free distribution group, 51 participants experienced a serious adverse event, while 68 participants in the usual access group experienced a serious adverse event (p = 0.091). Participants were not blinded, and some outcomes depended on participant reports.
Conclusions
In this study, we observed that free distribution of essential medicines to patients with cost-related nonadherence substantially increased adherence, did not affect surrogate health outcomes, and reduced total healthcare costs over 2 years.
Trial registration
ClinicalTrials.gov NCT02744963.



PLoS Med: 29 Apr 2021; 18:e1003590 | PMID: 34019540
Abstract

California and federal school nutrition policies and obesity among children of Pacific Islander, American Indian/Alaska Native, and Filipino origins: Interrupted time series analysis.

Matsuzaki M, Sánchez BN, Rebanal RD, Gittelsohn J, Sanchez-Vaznaugh EV
Background
Obesity prevalence remains high among children of Pacific Islander (PI), Filipino (FI), and American Indian/Alaska Native (AIAN) origins in the United States. While school nutrition policies may help prevent and reduce childhood obesity, their influences specifically among PI, FI, and AIAN children remain understudied. We evaluated the associations of the California (CA) state school nutrition policies for competitive foods and beverages and the federal policy for school meals (the Healthy, Hunger-Free Kids Act of 2010 [HHFKA 2010]) with overweight/obesity among PI, FI, and AIAN students.
Methods and findings
We used an interrupted time series (ITS) design with FitnessGram data from 2002 to 2016 for PI (78,841), FI (328,667), AIAN (97,129), and White (3,309,982) students in fifth and seventh grades who attended CA public schools. Multilevel logistic regression models estimated the associations of the CA school nutrition policies (in effect beginning in academic year 2004 to 2005) and HHFKA 2010 (from academic year 2012 to 2013) with overweight/obesity prevalence (above the 85th percentile of the age- and sex-specific body mass index (BMI) distribution). The models were constructed separately for each grade and sex combination and adjusted for school district-, school-, and student-level characteristics such as percentage of students eligible for free and reduced-price meals, neighborhood income and education levels, and age. Across the study period, the crude prevalence of overweight/obesity was higher among PI (39.5% to 52.5%), FI (32.9% to 36.7%), and AIAN (37.7% to 45.6%) children, compared to White (26.8% to 30.2%) students. The results generally showed favorable associations of the CA nutrition policies with overweight/obesity prevalence trends, although the magnitudes of associations and strengths of evidence varied among racial/ethnic subgroups. Before the CA policies went into effect (2002 to 2004), overweight/obesity prevalence increased for White, PI, and AIAN students in both grades and sex groups as well as FI girls in seventh grade. After the CA policies took effect (2005 to 2012), the overweight/obesity rates decreased for almost all subgroups who experienced increasing trends before the policies, with the largest decrease seen among PI girls in fifth grade (before: log odds ratio = 0.149 (95% CI 0.108 to 0.189; p < 0.001); after: 0.010 (95% CI -0.005 to 0.025; p = 0.178)). When both the CA nutrition policies and HHFKA 2010 were in effect (2013 to 2016), declines in the overweight/obesity prevalence were seen among White girls and FI boys in fifth grade. 
Despite the evidence of the favorable associations of the school nutrition policies with overweight/obesity prevalence trends, disparities between PI and AIAN students and their White peers remained large after the policies took effect. Because these policies went into effect for all public schools in CA, leaving no clear comparison group, we cannot conclude that the changes in prevalence trends were solely attributable to these policies.
Conclusions
The current study found evidence of favorable associations of the state and federal school nutrition policies with overweight/obesity prevalence trends. However, the prevalence of overweight/obesity continued to be high among PI and AIAN students and FI boys. There remain wide racial/ethnic disparities between these racial/ethnic minority subgroups and their White peers. Additional strategies are needed to reduce childhood obesity and related disparities among these understudied racial/ethnic populations.



PLoS Med: 29 Apr 2021; 18:e1003596 | PMID: 34029318
Abstract

Positron emission tomography and magnetic resonance imaging in experimental human malaria to identify organ-specific changes in morphology and glucose metabolism: A prospective cohort study.

Woodford J, Gillman A, Jenvey P, Roberts J, ... Anstey NM, McCarthy JS
Background
Plasmodium vivax has been proposed to infect and replicate in the human spleen and bone marrow. Compared to Plasmodium falciparum, which is known to undergo microvascular tissue sequestration, little is known about the behavior of P. vivax outside of the circulating compartment. This may be due in part to difficulties in studying parasite location and activity in life.
Methods and findings
To identify organ-specific changes during the early stages of P. vivax infection, we performed 18F-fluorodeoxyglucose (FDG) positron emission tomography/magnetic resonance imaging (PET/MRI) at baseline and just prior to onset of clinical illness in P. vivax experimentally induced blood-stage malaria (IBSM) and compared findings to P. falciparum IBSM. Seven healthy, malaria-naive participants were enrolled from 3 IBSM trials: NCT02867059, ACTRN12616000174482, and ACTRN12619001085167. Imaging took place between 2016 and 2019 at the Herston Imaging Research Facility, Australia. Postinoculation imaging was performed after a median of 9 days in both species (n = 3 P. vivax; n = 4 P. falciparum). All participants were aged between 19 and 23 years, and 6/7 were male. Splenic volume (P. vivax: +28.8% [confidence interval (CI) +10.3% to +57.3%], P. falciparum: +22.9% [CI -15.3% to +61.1%]) and radiotracer uptake (P. vivax: +15.5% [CI -0.7% to +31.7%], P. falciparum: +5.5% [CI +1.4% to +9.6%]) increased following infection with each species, but more so in P. vivax infection (volume: p = 0.72, radiotracer uptake: p = 0.036). There was no change in FDG uptake in the bone marrow (P. vivax: +4.6% [CI -15.9% to +25.0%], P. falciparum: +3.2% [CI -3.2% to +9.6%]) or liver (P. vivax: +6.2% [CI -8.7% to +21.1%], P. falciparum: -1.4% [CI -4.6% to +1.8%]) following infection with either species. In participants with P. vivax, hemoglobin, hematocrit, and platelet count decreased from baseline at the time of postinoculation imaging. Decrements in hemoglobin and hematocrit were significantly greater in participants with P. vivax infection compared to P. falciparum. The main limitations of this study are the small sample size and the inability of this tracer to differentiate between host and parasite metabolic activity.
Conclusions
PET/MRI indicated greater splenic tropism and metabolic activity in early P. vivax infection compared to P. falciparum, supporting the hypothesis of splenic accumulation of P. vivax very early in infection. The absence of uptake in the bone marrow and liver suggests that, at least in early infection, these tissues do not harbor a large parasite biomass or do not provoke a prominent metabolic response. PET/MRI is a safe and noninvasive method to evaluate infection-associated organ changes in morphology and glucose metabolism.



PLoS Med: 29 Apr 2021; 18:e1003567 | PMID: 34038421
Abstract

Evaluation of splenic accumulation and colocalization of immature reticulocytes and Plasmodium vivax in asymptomatic malaria: A prospective human splenectomy study.

Kho S, Qotrunnada L, Leonardo L, Andries B, ... Buffet PA, Anstey NM
Background
A very large biomass of intact asexual-stage malaria parasites accumulates in the spleen of asymptomatic human individuals infected with Plasmodium vivax. The mechanisms underlying this intense tropism are not clear. We hypothesised that immature reticulocytes, in which P. vivax develops, may display high densities in the spleen, thereby providing a niche for parasite survival.
Methods and findings
We examined spleen tissue in 22 mostly untreated individuals naturally exposed to P. vivax and Plasmodium falciparum undergoing splenectomy for any clinical indication in malaria-endemic Papua, Indonesia (2015 to 2017). Infection, parasite and immature reticulocyte density, and splenic distribution were analysed by optical microscopy, flow cytometry, and molecular assays. Nine non-endemic control spleens from individuals undergoing spleno-pancreatectomy in France (2017 to 2020) were also examined for reticulocyte densities. There were no exclusion criteria or sample size considerations in either patient cohort for this demanding approach. In Indonesia, 95.5% (21/22) of splenectomy patients had asymptomatic splenic Plasmodium infection (7 P. vivax, 13 P. falciparum, and 1 mixed infection). Significant splenic accumulation of immature CD71 intermediate- and high-expressing reticulocytes was seen, with concentrations 11 times greater than in peripheral blood. Accordingly, in France, reticulocyte concentrations in the splenic effluent were higher than in peripheral blood. The greater rigidity of reticulocytes in splenic blood than in peripheral blood, and their higher densities in splenic cords, both suggest a mechanical retention process. Asexual-stage P. vivax-infected erythrocytes of all developmental stages accumulated in the spleen, with non-phagocytosed parasite densities 3,590 times (IQR: 2,600 to 4,130) higher than in circulating blood, and median total splenic parasite loads 81 (IQR: 14 to 205) times greater, accounting for 98.7% (IQR: 95.1% to 98.9%) of the estimated total-body P. vivax biomass. More reticulocytes were in contact with sinus lumen endothelial cells in P. vivax- than in P. falciparum-infected spleens. Histological analyses revealed 96% of P. vivax rings/trophozoites and 46% of schizonts colocalised with 92% of immature reticulocytes in the cords and sinus lumens of the red pulp.
Larger splenic cohort studies and similar investigations in untreated symptomatic malaria are warranted.
Conclusions
Immature CD71+ reticulocytes and splenic P. vivax-infected erythrocytes of all asexual stages accumulate in the same splenic compartments, suggesting the existence of a cryptic endosplenic lifecycle in chronic P. vivax infection. Findings provide insight into P. vivax-specific adaptions that have evolved to maximise survival and replication in the spleen.



PLoS Med: 29 Apr 2021; 18:e1003632 | PMID: 34038413
Abstract

Efficacy of live attenuated and inactivated influenza vaccines among children in rural India: A 2-year, randomized, triple-blind, placebo-controlled trial.

Krishnan A, Dar L, Saha S, Narayan VV, ... Widdowson MA, Jain S
Background
Influenza is a cause of febrile acute respiratory infection (FARI) in India; however, few influenza vaccine trials have been conducted in India. We assessed absolute and relative efficacy of live attenuated influenza vaccine (LAIV) and inactivated influenza vaccine (IIV) among children aged 2 to 10 years in rural India through a randomized, triple-blind, placebo-controlled trial conducted over 2 years.
Methods and findings
In June 2015, children were randomly allocated to LAIV, IIV, intranasal placebo, or inactivated polio vaccine (IPV) in a 2:2:1:1 ratio. In June 2016, vaccination was repeated per original allocation. Overall, 3,041 children received LAIV (n = 1,015), IIV (n = 1,010), nasal placebo (n = 507), or IPV (n = 509). The mean age of children was 6.5 years, with 20% aged 9 to 10 years. Through weekly home visits, nasal and throat swabs were collected from children with FARI and tested for influenza virus by polymerase chain reaction. The primary outcome was laboratory-confirmed influenza-associated FARI; vaccine efficacy (VE) was calculated for each year using a Cox proportional hazards (PH) model in a modified intention-to-treat (mITT) analysis. In Year 1, VE was 40.0% (95% confidence interval (CI) 25.2 to 51.9) for LAIV and 59.0% (95% CI 47.8 to 67.9) for IIV compared with controls; relative efficacy of LAIV compared with IIV was -46.2% (95% CI -88.9 to -13.1). In Year 2, VE was 51.9% (95% CI 42.0 to 60.1) for LAIV and 49.9% (95% CI 39.2 to 58.7) for IIV; relative efficacy of LAIV compared with IIV was 4.2% (95% CI -19.9 to 23.5). No serious adverse vaccine-attributable events were reported. Study limitations include differing dosage requirements for children between nasal and injectable vaccines (single dose of LAIV versus 2 doses of IIV) in Year 1 and the fact that immunogenicity studies were not conducted.
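Vaccine efficacy from a Cox model follows the standard identity VE = (1 − HR) × 100, where HR is the hazard ratio for vaccinated versus control children. A minimal sketch (the 0.60 input is a back-calculated illustration consistent with the Year 1 LAIV estimate, not a figure reported by the trial):

```python
def vaccine_efficacy(hazard_ratio: float) -> float:
    """Vaccine efficacy (%) from a Cox PH hazard ratio: VE = (1 - HR) x 100."""
    return (1.0 - hazard_ratio) * 100.0

# A hazard ratio of 0.60 (vaccinated vs. control) corresponds to VE = 40%,
# matching the Year 1 LAIV point estimate above.
print(round(vaccine_efficacy(0.60), 1))
```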
Conclusions
In this study, we found that LAIV and IIV vaccines were safe and moderately efficacious against influenza virus infection among Indian children.
Trial registration
Clinical Trials Registry of India CTRI/2015/06/005902.



PLoS Med: 28 Apr 2021; 18:e1003609 | PMID: 33914729
Abstract

Effectiveness of a primary care-based integrated mobile health intervention for stroke management in rural China (SINEMA): A cluster-randomized controlled trial.

Yan LL, Gong E, Gu W, Turner EL, ... Wang Y, Oldenburg B
Background
Managing noncommunicable diseases through primary healthcare has been identified as the key strategy to achieve universal health coverage but is challenging in most low- and middle-income countries. Stroke is the leading cause of death and disability in rural China. This study aims to determine whether a primary care-based integrated mobile health intervention (SINEMA intervention) could improve stroke management in rural China.
Methods and findings
Based on extensive barrier analyses, contextual research, and feasibility studies, we conducted a community-based, two-arm cluster-randomized controlled trial with blinded outcome assessment in Hebei Province, rural northern China, including 1,299 stroke patients (mean age: 65.7 [SD: 8.2], 42.6% female, 71.2% received education below primary school) recruited from 50 villages between June 23 and July 21, 2017. Villages were randomly assigned (1:1) to either the intervention or control arm (usual care). In the intervention arm, village doctors who were government-sponsored primary healthcare providers received training, conducted monthly follow-up visits supported by an Android-based mobile application, and received performance-based payments. Participants received monthly doctor visits and automatically dispatched daily voice messages. The primary outcome was the 12-month change in systolic blood pressure (BP). Secondary outcomes were predefined, including diastolic BP, health-related quality of life, physical activity level, self-reported medication adherence (antiplatelet, statin, and antihypertensive), and performance in the "timed up and go" test. Analyses were conducted in the intention-to-treat framework at the individual level, with clusters and the stratified design accounted for, following the prepublished statistical analysis plan. All villages completed the 12-month follow-up, and 611 (intervention) and 615 (control) patients were successfully followed (3.4% lost to follow-up among survivors). The program was implemented with high fidelity, and the annual program delivery cost per capita was US$24.3. There was a significant reduction in systolic BP in the intervention as compared with the control group, with an adjusted mean difference of -2.8 mm Hg (95% CI -4.8, -0.9; p = 0.005).
The intervention was significantly associated with improvements in 6 of the 7 secondary outcomes: diastolic BP reduction (p < 0.001), health-related quality of life (p = 0.008), physical activity level (p < 0.001), adherence to statins (p = 0.003) and antihypertensive medicines (p = 0.039), and performance in the "timed up and go" test (p = 0.022). We observed reductions in all exploratory outcomes, including stroke recurrence (4.4% versus 9.3%; risk ratio [RR] = 0.46, 95% CI 0.32, 0.66; risk difference [RD] = 4.9 percentage points [pp]), hospitalization (4.4% versus 9.3%; RR = 0.45, 95% CI 0.32, 0.62; RD = 4.9 pp), disability (20.9% versus 30.2%; RR = 0.65, 95% CI 0.53, 0.79; RD = 9.3 pp), and death (1.8% versus 3.1%; RR = 0.52, 95% CI 0.28, 0.96; RD = 1.3 pp). Limitations include the relatively short study duration of only 1 year and uncertain generalizability of our findings beyond the study setting.
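The exploratory outcomes above pair risk ratios with risk differences; the crude versions of both quantities are direct arithmetic on the two group proportions. A sketch using the stroke-recurrence percentages from the abstract (the published RRs are model-adjusted, so the crude ratio differs slightly from the reported 0.46):

```python
def risk_ratio(p_intervention: float, p_control: float) -> float:
    """Crude risk ratio: intervention risk divided by control risk."""
    return p_intervention / p_control

def risk_difference_pp(p_intervention: float, p_control: float) -> float:
    """Crude risk difference in percentage points (control minus intervention)."""
    return (p_control - p_intervention) * 100.0

# Stroke recurrence: 4.4% (intervention) versus 9.3% (control).
print(round(risk_ratio(0.044, 0.093), 2))         # crude RR
print(round(risk_difference_pp(0.044, 0.093), 1)) # 4.9 pp, as reported
```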
Conclusions
In this study, a primary care-based mobile health intervention integrating provider-centered and patient-facing technology was effective in reducing BP and improving stroke secondary prevention in a resource-limited rural setting in China.
Trial registration
The SINEMA trial is registered at ClinicalTrials.gov NCT03185858.



PLoS Med: 27 Apr 2021; 18:e1003582 | PMID: 33909607
Abstract

International gestational age-specific centiles for blood pressure in pregnancy from the INTERGROWTH-21st Project in 8 countries: A longitudinal cohort study.

Green LJ, Kennedy SH, Mackillop L, Gerry S, ... Watkinson P, International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st)
Background
Gestational hypertensive and acute hypotensive disorders are associated with maternal morbidity and mortality worldwide. However, physiological blood pressure changes in pregnancy are insufficiently defined. We describe blood pressure changes across healthy pregnancies from the International Fetal and Newborn Growth Consortium for the 21st Century (INTERGROWTH-21st) Fetal Growth Longitudinal Study (FGLS) to produce international, gestational age-specific, smoothed centiles (third, 10th, 50th, 90th, and 97th) for blood pressure.
Methods and findings
Secondary analysis of a prospective, longitudinal, observational cohort study (2009 to 2016) was conducted across 8 diverse urban areas in Brazil, China, India, Italy, Kenya, Oman, the United Kingdom, and the United States of America. We enrolled healthy women at low risk of pregnancy complications. We measured blood pressure using standardised methodology and validated equipment at enrolment at <14 weeks, then every 5 ± 1 weeks until delivery. We enrolled 4,607 (35%) of 13,108 women screened. The mean maternal age was 28.4 (standard deviation [SD] 3.9) years; 97% (4,204/4,321) of women were married or living with a partner, and 68% (2,955/4,321) were nulliparous. Their mean body mass index (BMI) was 23.3 (SD 3.0) kg/m2. Systolic blood pressure was lowest at 12 weeks: the median was 111.5 (95% CI 111.3 to 111.8) mmHg, rising to a median maximum of 119.6 (95% CI 118.9 to 120.3) mmHg at 40 weeks' gestation, a difference of 8.1 (95% CI 7.4 to 8.8) mmHg. Median diastolic blood pressure decreased from 69.1 (95% CI 68.9 to 69.3) mmHg at 12 weeks to a minimum of 68.5 (95% CI 68.3 to 68.7) mmHg at 19+5 weeks' gestation, a change of -0.6 (95% CI -0.8 to -0.4) mmHg. Diastolic blood pressure subsequently increased to a maximum of 76.3 (95% CI 75.9 to 76.8) mmHg at 40 weeks' gestation. Systolic blood pressure fell by >14 mmHg, or diastolic blood pressure by >11 mmHg, in fewer than 10% of women at any gestational age. Fewer than 10% of women increased their systolic blood pressure by >24 mmHg or diastolic blood pressure by >18 mmHg at any gestational age. The study's main limitations were the unavailability of prepregnancy blood pressure values and the inability to explore circadian effects, because the time of day was not recorded for the blood pressure measurements.
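INTERGROWTH-21st fitted smoothed, regression-based centile curves across gestational age; the underlying idea of a gestational-age-specific centile can be illustrated with raw empirical percentiles at a single week. A sketch with hypothetical systolic readings (not study data), using linear interpolation between order statistics:

```python
def percentile(sorted_vals, p):
    """Percentile of an ascending list via linear interpolation
    (the same convention as NumPy's default 'linear' method)."""
    k = (len(sorted_vals) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    frac = k - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Hypothetical systolic readings (mmHg) from women at one gestational week:
readings = sorted([104, 108, 110, 111, 112, 113, 115, 118, 121, 126])
centiles = {p: percentile(readings, p) for p in (3, 10, 50, 90, 97)}
print(centiles[50])  # empirical median at this week
```

Repeating this at each gestational week and smoothing across weeks yields centile curves of the kind the study reports.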
Conclusions
Our findings provide international, gestational age-specific centiles and limits of acceptable change to facilitate earlier recognition of deteriorating health in pregnant women. These centiles challenge the idea of a clinically significant midpregnancy drop in blood pressure.



PLoS Med: 26 Apr 2021; 18:e1003611 | PMID: 33905424
Abstract

Estimating and characterizing the burden of multimorbidity in the community: A comprehensive multistep analysis of two large nationwide representative surveys in France.

Coste J, Valderas JM, Carcaillon-Bentata L
Background
Given the increasing burden of chronic conditions, multimorbidity is now a priority for healthcare and public health systems worldwide. Appropriate methodological approaches for assessing the phenomenon have not yet been established, resulting in inconsistent and incomplete descriptions. We aimed to estimate and characterize the burden of multimorbidity in the adult population in France in terms of number and type of conditions, type of underlying mechanisms, and analysis of the joint effects for identifying combinations with the most deleterious interaction effects on health status.
Methods and findings
We used a multistep approach to analyze cross-sectional and longitudinal data from 2 large nationwide representative surveys: the 2010/2014 waves of the Health, Health Care, and Insurance Survey (ESPS 2010-2014) and the Disability Healthcare Household Survey 2008 (HSM 2008), which collected similar data on 61 chronic or recurrent conditions. Adults aged ≥25 years in either ESPS 2010 (14,875) or HSM 2008 (23,348) were considered (participation rates were 65% and 62%, respectively). Longitudinal analyses included 7,438 participants of ESPS 2010 with follow-up for mortality (97%), of whom 3,798 were reinterviewed in 2014 (52%). Mortality, activity limitation, self-reported health, difficulties in activities/instrumental activities of daily living, and the Medical Outcomes Study Short-Form 12-Item Health Survey were the health status measures. Multiple regression models were used to estimate the impact of chronic or recurrent conditions and multimorbid associations (dyads, triads, and tetrads) on health status. Etiological pathways explaining associations were investigated, and joint effects and interactions between conditions on health status measures were evaluated using both additive and multiplicative scales. Forty-eight chronic or recurrent conditions had an independent impact on mortality, activity limitations, or perceived health. Multimorbidity prevalence varied between 30% (1-year time frame) and 39% (lifetime frame), and more markedly according to sex (higher in women), age (with the greatest increases in middle age), and socioeconomic status (higher in less educated and low-income individuals and manual workers). We identified various multimorbid combinations, mostly involving vasculometabolic and musculoskeletal conditions and mental disorders, which could be explained by direct causation, shared or associated risk factors, or, less frequently, confounding or chance.
Combinations with the highest health impacts included diseases with complications but also associations of conditions affecting systems involved in locomotion and sensorial functions (impact on activity limitations), and associations including mental disorders (impact on perceived health). The interaction effects of the associated conditions varied on a continuum from subadditive and additive (associations involving cardiometabolic conditions, low back pain, osteoporosis, injury sequelae, depression, and anxiety) to multiplicative and supermultiplicative (associations involving obesity, chronic obstructive pulmonary disease, migraine, and certain osteoarticular pathologies). Study limitations included self-reported information on chronic conditions and the insufficient power of some analyses.
Conclusions
Multimorbidity assessments should move beyond simply counting conditions and take into account the variable impacts on health status, etiological pathways, and joint effects of associated conditions. In particular, the multimorbid combinations with substantial health impacts or shared risk factors deserve closer attention. Our findings also suggest that multimorbidity assessment and management may be beneficial already in midlife and probably earlier in disadvantaged groups.



PLoS Med: 25 Apr 2021; 18:e1003584 | PMID: 33901171
Abstract

Maternal weight change from prepregnancy to 18 months postpartum and subsequent risk of hypertension and cardiovascular disease in Danish women: A cohort study.

Kirkegaard H, Bliddal M, Støvring H, Rasmussen KM, ... Sørensen TIA, Nøhr EA
Background
One-fourth of women remain at a substantially higher weight years after childbirth. We examined weight change from prepregnancy to 18 months postpartum in relation to subsequent maternal risk of hypertension and cardiovascular disease (CVD).
Methods and findings
We conducted a cohort study of 47,966 women with a live-born singleton within the Danish National Birth Cohort (DNBC; 1997-2002). Interviews during pregnancy and at 6 and 18 months postpartum provided information on height, gestational weight gain (GWG), postpartum weights, and maternal characteristics. Information on pregnancy complications, incident hypertension, and CVD was obtained from the National Patient Register. Using Cox regression, we estimated adjusted hazard ratios (HRs; 95% confidence interval [CI]) for hypertension and CVD through 16 years of follow-up. During this period, 2,011 women were diagnosed at the hospital with hypertension and 1,321 with CVD. The women were on average 32.3 years old (range 18.0-49.2) at the start of follow-up; 73% had a prepregnancy BMI <25, and 27% a prepregnancy BMI ≥25. Compared with a stable weight (±1 BMI unit), weight gains from prepregnancy to 18 months postpartum of >1-2 and >2 BMI units were associated with 25% (10%-42%, P = 0.001) and 31% (14%-52%, P < 0.001) higher risks of hypertension, respectively. These risks were similar whether the weight gain represented postpartum weight retention or a new gain from 6 to 18 months postpartum, and whether GWG was below, within, or above the recommendations. For CVD, findings differed according to prepregnancy BMI. In women with normal-/underweight, weight gain >2 BMI units and weight loss >1 BMI unit were associated with 48% (17%-87%, P = 0.001) and 28% (6%-55%, P = 0.01) higher risks of CVD, respectively. Further, weight loss >1 BMI unit combined with a GWG below the recommended range was associated with a 70% (24%-135%, P = 0.001) higher risk of CVD. No such increased risks were observed among women with overweight/obesity (interaction by prepregnancy BMI, P = 0.01, 0.03, and 0.03, respectively). The limitations of this observational study include potential confounding by prepregnancy metabolic health and self-reported maternal weights, which may lead to some misclassification.
Conclusions
Postpartum weight retention/new gain in all mothers and postpartum weight loss in mothers with normal-/underweight may be associated with later adverse cardiovascular health.



PLoS Med: 30 Mar 2021; 18:e1003486 | PMID: 33798198
Abstract

Estimated impact of tafenoquine for Plasmodium vivax control and elimination in Brazil: A modelling study.

Nekkab N, Lana R, Lacerda M, Obadia T, ... Mueller I, White M
Background
Despite recent intensification of control measures, Plasmodium vivax poses a major challenge for malaria elimination efforts. Liver-stage hypnozoite parasites that cause relapsing infections can be cleared with primaquine; however, poor treatment adherence undermines drug effectiveness. Tafenoquine, a new single-dose treatment, offers an alternative option for preventing relapses and reducing transmission. In 2018, over 237,000 cases of malaria were reported to the Brazilian health system, of which 91.5% were due to P. vivax.
Methods and findings
We evaluated the impact of introducing tafenoquine into case management practices on population-level transmission dynamics using a mathematical model of P. vivax transmission. The model was calibrated to reflect the transmission dynamics of P. vivax endemic settings in Brazil in 2018, informed by nationwide malaria case reporting data. Parameters for treatment pathways with chloroquine, primaquine, and tafenoquine with glucose-6-phosphate dehydrogenase deficiency (G6PDd) testing were informed by clinical trial data and the literature. We assumed 71.3% efficacy for primaquine and tafenoquine, a 66.7% adherence rate to the 7-day primaquine regimen, a mean 5.5% G6PDd prevalence, and 8.1% low metaboliser prevalence. The introduction of tafenoquine is predicted to improve effective hypnozoite clearance among P. vivax cases and reduce population-level transmission over time, with heterogeneous levels of impact across different transmission settings. According to the model, tafenoquine rollout in 2021 would achieve elimination in only a few settings in Brazil but is estimated to improve the mean effective radical cure rate from 42% (95% uncertainty interval [UI] 41%-44%) to 62% (95% UI 54%-68%) among clinical cases, leading to a predicted 38% (95% UI 7%-99%) reduction in transmission and over 214,000 cumulative averted cases between 2021 and 2025. Higher impact is predicted in settings with low transmission, low pre-existing primaquine adherence, and a high proportion of cases in working-aged males. High-transmission settings with a high proportion of cases in children would benefit from a safe high-efficacy tafenoquine dose for children. Our methodological limitations include not accounting for the role of imported cases from outside the transmission setting, relying on reported clinical cases as a measurement of community-level transmission, and implementing treatment efficacy as a binary condition.
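The gap between the 42% and 62% effective radical cure rates is driven largely by adherence: a single supervised tafenoquine dose removes the adherence penalty of the 7-day primaquine regimen. A back-of-envelope sketch using the parameters stated above (this crude efficacy-times-adherence product ignores the G6PDd testing pathway, low metabolisers, and care-seeking, all of which the full model accounts for, so it only approximates the reported rates):

```python
# Parameters as assumed in the abstract.
primaquine_efficacy = 0.713    # shared efficacy assumption for both drugs
primaquine_adherence = 0.667   # adherence to the 7-day primaquine regimen
tafenoquine_adherence = 1.0    # single-dose regimen, assumed fully adhered to

# Crude effective-cure products (efficacy x adherence).
print(round(primaquine_efficacy * primaquine_adherence, 3))
print(round(primaquine_efficacy * tafenoquine_adherence, 3))
```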
Conclusions
In our modelling study, we predicted that, provided there is concurrent rollout of G6PDd diagnostics, tafenoquine has the potential to reduce P. vivax transmission by improving effective radical cure through increased adherence and increased protection from new infections. While tafenoquine alone may not be sufficient for P. vivax elimination, its introduction will improve case management, prevent a substantial number of cases, and bring countries closer to achieving malaria elimination goals.



PLoS Med: 30 Mar 2021; 18:e1003535 | PMID: 33891582
Abstract

Epidemiological, clinical, and public health response characteristics of a large outbreak of diphtheria among the Rohingya population in Cox's Bazar, Bangladesh, 2017 to 2019: A retrospective study.

Polonsky JA, Ivey M, Mazhar MKA, Rahman Z, ... Salam MA, White K
Background
Unrest in Myanmar in August 2017 resulted in the movement of over 700,000 Rohingya refugees to overcrowded camps in Cox's Bazar, Bangladesh. A large outbreak of diphtheria subsequently began in this population.
Methods and findings
Data were collected during mass vaccination campaigns (MVCs), contact tracing activities, and from 9 Diphtheria Treatment Centers (DTCs) operated by national and international organizations. These data were used to describe the epidemiological and clinical features of the outbreak and the control measures taken to prevent transmission during its first 2 years. Between November 10, 2017, and November 9, 2019, 7,064 cases were reported: 285 (4.0%) laboratory-confirmed, 3,610 (51.1%) probable, and 3,169 (44.9%) suspected cases. The crude attack rate was 51.5 cases per 10,000 person-years, and the epidemic doubling time was 4.4 days (95% confidence interval [CI] 4.2-4.7) during the exponential growth phase. The median age was 10 years (range 0-85), and 3,126 (44.3%) were male. The typical symptoms were sore throat (93.5%), fever (86.0%), pseudomembrane (34.7%), and gross cervical lymphadenopathy (GCL; 30.6%). Diphtheria antitoxin (DAT) was administered to 1,062 (89.0%) of 1,193 eligible patients, with adverse reactions in 229 (21.6%). There were 45 deaths (case fatality ratio [CFR] 0.6%). Household contacts for 5,702 (80.7%) of 7,064 cases were successfully traced. A total of 41,452 contacts were identified, of whom 40,364 (97.4%) consented to begin chemoprophylaxis; adherence was 55.0% (N = 22,218) at 3-day follow-up. Unvaccinated household contacts were vaccinated with 3 doses (at 4-week intervals), while a booster dose was administered if the primary vaccination schedule had been completed. The proportion of contacts vaccinated was 64.7% overall. Three MVC rounds were conducted, with administrative coverage varying between 88.5% and 110.4%. Pentavalent vaccine was administered to those aged 6 weeks to 6 years, while tetanus and diphtheria (Td) vaccine was administered to those aged 7 years and older.
Lack of adequate diagnostic capacity to confirm cases was the main limitation, with a majority of cases unconfirmed and the proportion of true diphtheria cases unknown.
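The 4.4-day doubling time reported above maps onto an exponential growth rate via the identity r = ln(2) / Td, and case counts during unchecked exponential growth scale as c(t) = c0 · 2^(t/Td). A small sketch (the case numbers are hypothetical, for illustration only):

```python
import math

def growth_rate_from_doubling(td_days: float) -> float:
    """Exponential growth rate r (per day) from doubling time: r = ln(2) / Td."""
    return math.log(2.0) / td_days

def cases_after(c0: float, td_days: float, t_days: float) -> float:
    """Projected cases after t days of unchecked exponential growth."""
    return c0 * 2.0 ** (t_days / td_days)

r = growth_rate_from_doubling(4.4)        # growth rate implied by Td = 4.4 days
print(round(r, 3))
print(round(cases_after(100, 4.4, 8.8)))  # two doublings of a hypothetical 100 cases
```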
Conclusions
To our knowledge, this is the largest reported diphtheria outbreak in refugee settings. We observed that high population density, poor living conditions, and fast growth rate were associated with explosive expansion of the outbreak during the initial exponential growth phase. Three rounds of mass vaccination targeting those aged 6 weeks to 14 years were associated with only modestly reduced transmission, and additional public health measures were necessary to end the outbreak. The outbreak had a long-lasting tail, with Rt oscillating around 1 for an extended period. An adequate global DAT stockpile needs to be maintained. All populations must have access to health services and routine vaccination, and this access must be maintained during humanitarian crises.



PLoS Med: 30 Mar 2021; 18:e1003587 | PMID: 33793554
Abstract

Health information technology interventions and engagement in HIV care and achievement of viral suppression in publicly funded settings in the US: A cost-effectiveness analysis.

Shade SB, Marseille E, Kirby V, Chakravarty D, ... Cajina A, Myers JJ
Background
The US National HIV/AIDS Strategy (NHAS) emphasizes the use of technology to facilitate coordination of comprehensive care for people with HIV. We examined cost-effectiveness from the health system perspective of 6 health information technology (HIT) interventions implemented during 2008 to 2012 in a Ryan White HIV/AIDS Program (RWHAP) Special Projects of National Significance (SPNS) Program demonstration project.
Methods/findings
HIT interventions were implemented at 6 sites: Bronx, New York; Durham, North Carolina; Long Beach, California; New Orleans, Louisiana; New York, New York (2 sites); and Paterson, New Jersey. These interventions included: (1) use of HIV surveillance data to identify out-of-care individuals; (2) extension of access to electronic health records (EHRs) to support service providers; (3) use of electronic laboratory ordering and prescribing; and (4) development of a patient portal. We employed standard microcosting techniques to estimate costs (in 2018 US dollars) associated with intervention implementation. Data from a sample of electronic patient records from each demonstration site were analyzed to compare prescription of antiretroviral therapy (ART), CD4 cell counts, and suppression of viral load, before and after implementation of interventions. Markov models were used to estimate additional healthcare costs and quality-adjusted life-years saved as a result of each intervention. Overall, demonstration site interventions cost $3,913,313 (range = $287,682 to $998,201) among 3,110 individuals (range = 258 to 1,181) over 3 years. Changes in the proportion of patients prescribed ART ranged from a decrease from 87.0% to 72.7% at Site 4 to an increase from 74.6% to 94.2% at Site 6; changes in the proportion of patients with 0 to 200 CD4 cells/mm3 ranged from a decrease from 20.2% to 11.0% at Site 6 to an increase from 16.7% to 30.2% at Site 2; and changes in the proportion of patients with undetectable viral load ranged from a decrease from 84.6% to 46.0% at Site 1 to an increase from 67.0% to 69.9% at Site 5. Four of the 6 interventions (including use of HIV surveillance data to identify out-of-care individuals, use of electronic laboratory ordering and prescribing, and development of a patient portal) were not only cost-effective but also cost saving ($6.87 to $14.91 saved per dollar invested).
In contrast, the 2 interventions that extended access to EHRs to support service providers were not effective and, therefore, not cost-effective. Most interventions remained either cost-saving or not cost-effective under all sensitivity analysis scenarios. The intervention that used HIV surveillance data to identify out-of-care individuals was no longer cost-saving when the effect of HIV on an individual's health status was reduced and when the natural progression of HIV was increased. The results of this study are limited in that we did not have contemporaneous controls for each intervention; thus, we are only able to assess sites against themselves at baseline and not against standard of care during the same time period.
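The Markov modelling step described above can be sketched in miniature. The following is an illustrative sketch only: the states, transition probabilities, costs, utility weights, and 3% discount rate are all invented for demonstration (the abstract does not report the study's model parameters). A cohort is propagated through yearly cycles, accruing discounted costs and quality-adjusted life-years (QALYs) per state.

```python
# Illustrative 3-state Markov cohort model with hypothetical numbers.
# States: 0 = virally suppressed, 1 = unsuppressed, 2 = dead (absorbing).

# Transition matrix: rows = from-state, columns = to-state.
P = [
    [0.90, 0.08, 0.02],
    [0.25, 0.70, 0.05],
    [0.00, 0.00, 1.00],
]
annual_cost = [8000.0, 12000.0, 0.0]  # hypothetical US$ per person-year
utility = [0.90, 0.75, 0.0]           # hypothetical QALY weights
discount = 0.03                       # hypothetical 3% annual discount rate

def run_model(start, years):
    """Return (discounted cost, discounted QALYs) per person."""
    dist = list(start)
    total_cost = total_qaly = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t
        total_cost += d * sum(p * c for p, c in zip(dist, annual_cost))
        total_qaly += d * sum(p * u for p, u in zip(dist, utility))
        # Redistribute the cohort for the next cycle.
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return total_cost, total_qaly

# Compare a cohort that starts suppressed (e.g., reached by an intervention)
# against one that starts unsuppressed, over a 10-year horizon.
c1, q1 = run_model([1.0, 0.0, 0.0], 10)
c0, q0 = run_model([0.0, 1.0, 0.0], 10)
icer = (c1 - c0) / (q1 - q0)  # incremental cost per QALY gained
```

With these invented inputs the "intervention" arm both gains QALYs and saves money (a negative incremental cost-effectiveness ratio), which is the cost-saving pattern the study reports for 4 of its 6 interventions.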
Conclusions
These results provide additional support for the use of HIT as a tool to enhance rapid and effective treatment of HIV to achieve sustained viral suppression. HIT has the potential to increase utilization of services, improve health outcomes, and reduce subsequent transmission of HIV.



Shade SB, Marseille E, Kirby V, Chakravarty D, ... Cajina A, Myers JJ
PLoS Med: 30 Mar 2021; 18:e1003389 | PMID: 33826617
Abstract

Housing environment and early childhood development in sub-Saharan Africa: A cross-sectional analysis.

Gao Y, Zhang L, Kc A, Wang Y, ... Mi X, Zhou H
Background
The influence of the safety and security of environments on early childhood development (ECD) has been under-explored. Although housing might be linked to ECD by affecting a child's health and a parent's ability to provide adequate care, only a few studies have examined this factor. We hypothesized that housing environment is associated with ECD in sub-Saharan Africa (SSA).
Methods and findings
From 92,433 children aged 36 to 59 months who participated in the Multiple Indicator Cluster Survey (MICS) in 20 SSA countries, 88,271 were tested for cognitive and social-emotional development using the Early Childhood Development Index (ECDI) questionnaire and were thus included in this cross-sectional analysis. Children's mean age was 47.2 months, and 49.8% were girls. Children were considered developmentally on track in a certain domain if they failed no more than 1 ECDI item in that domain. In each country, we used conditional logistic regression models to estimate the association between improved housing (housing with finished building materials, improved drinking water, improved sanitation facilities, and sufficient living area) and children's cognitive and social-emotional development, accounting for contextual effects and socioeconomic factors. Estimates from each country were pooled using random-effects meta-analyses. Subgroup analyses were conducted by the child's gender, maternal education, and household wealth quintiles. On-track cognitive development was associated with improved housing (odds ratio [OR] = 1.15, 95% CI 1.06 to 1.24, p < 0.001), improved drinking water (OR = 1.07, 95% CI 1.00 to 1.14, p = 0.046), improved sanitation facilities (OR = 1.15, 95% CI 1.03 to 1.28, p = 0.014), and sufficient living area (OR = 1.06, 95% CI 1.01 to 1.10, p = 0.018). On-track social-emotional development was associated with improved housing only in girls (OR = 1.14, 95% CI 1.04 to 1.25, p = 0.006). The main limitations of this study included the cross-sectional nature of the datasets and the use of the ECDI, which lacks sensitivity to measure ECD outcomes.
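The pooling step, in which per-country estimates are combined by random-effects meta-analysis, can be illustrated with the standard DerSimonian-Laird computation. The country-level odds ratios and confidence intervals below are invented for illustration; only the method, not the data, reflects the study.

```python
import math

# Hypothetical per-country estimates: (OR, 95% CI lower, 95% CI upper)
# for improved housing vs on-track cognitive development.
countries = [
    (1.25, 1.05, 1.49),
    (1.10, 0.92, 1.32),
    (1.35, 1.08, 1.69),
    (0.98, 0.80, 1.20),
    (1.18, 1.01, 1.38),
]

# Work on the log scale; recover each standard error from the CI width,
# since CI = estimate +/- 1.96 * SE on the log-odds scale.
y = [math.log(o) for o, lo, hi in countries]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for o, lo, hi in countries]

# DerSimonian-Laird estimate of the between-country variance tau^2.
w = [1 / s**2 for s in se]
fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q_stat = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
c_factor = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q_stat - (len(y) - 1)) / c_factor)

# Random-effects pooled OR and 95% CI: re-weight by 1 / (SE^2 + tau^2).
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se_pooled), math.exp(pooled + 1.96 * se_pooled))
```

The random-effects weights shrink toward equality as tau^2 grows, so heterogeneous countries pull the pooled OR less toward the most precise single estimate than a fixed-effect analysis would.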
Conclusions
In this study, we observed that improved housing was associated with on-track cognitive development and with on-track social-emotional development in girls. These findings suggest that housing improvement in SSA may be associated not only with benefits for children\'s physical health but also with broader aspects of healthy child development.



PLoS Med: 30 Mar 2021; 18:e1003578 | PMID: 33872322
Abstract

Glucose-6-phosphate dehydrogenase activity in individuals with and without malaria: Analysis of clinical trial, cross-sectional and case-control data from Bangladesh.

Ley B, Alam MS, Kibria MG, Marfurt J, ... Khan WA, Price RN
Background
Glucose-6-phosphate dehydrogenase (G6PD) activity is dependent upon G6PD genotype and age of the red blood cell (RBC) population, with younger RBCs having higher activity. Peripheral parasitemia with Plasmodium spp. induces hemolysis, replacing older RBCs with younger cells with higher G6PD activity. This study aimed to assess whether G6PD activity varies between individuals with and without malaria or a history of malaria.
Methods and findings
Individuals living in the Chittagong Hill Tracts of Bangladesh were enrolled into 3 complementary studies: (i) a prospective, single-arm clinical efficacy trial of patients (n = 175) with uncomplicated malaria done between 2014 and 2015, (ii) a cross-sectional survey done between 2015 and 2016 (n = 999), and (iii) a matched case-control study of aparasitemic individuals with and without a history of malaria done in 2020 (n = 506). G6PD activity was compared between individuals with and without malaria diagnosed by microscopy, rapid diagnostic test (RDT), or polymerase chain reaction (PCR), and in aparasitemic participants with and without a history of malaria. In the cross-sectional survey and clinical trial, 15.5% (182/1,174) of participants had peripheral parasitemia detected by microscopy or RDT, 3.1% (36/1,174) were positive by PCR only, and 81.4% (956/1,174) were aparasitemic. Aparasitemic individuals had significantly lower G6PD activity (median 6.9 U/g Hb, IQR 5.2-8.6) than those with peripheral parasitemia detected by microscopy or RDT (7.9 U/g Hb, IQR 6.6-9.8, p < 0.001), but G6PD activity similar to those with parasitemia detected by PCR alone (submicroscopic parasitemia) (6.1 U/g Hb, IQR 4.8-8.6, p = 0.312). In total, 7.7% (14/182) of patients with malaria had G6PD activity < 70% compared to 25.0% (248/992) of participants with submicroscopic or no parasitemia (odds ratio [OR] 0.25, 95% CI 0.14-0.44, p < 0.001). In the case-control study, the median G6PD activity was 10.3 U/g Hb (IQR 8.8-12.2) in 253 patients with a history of malaria and 10.2 U/g Hb (IQR 8.7-11.8) in 253 individuals without a history of malaria (p = 0.323). The proportion of individuals with G6PD activity < 70% was 11.5% (29/253) in the cases and 15.4% (39/253) in the controls (OR 0.7, 95% CI 0.41-1.23, p = 0.192). Limitations of the study included the non-contemporaneous nature of the clinical trial and cross-sectional survey.
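The reported odds ratio for low G6PD activity can be reproduced directly from the counts given above (14/182 patients with malaria vs 248/992 participants with submicroscopic or no parasitemia). A standard Wald interval on the log-odds scale, which is an assumption about the method used, recovers the reported 95% CI of 0.14 to 0.44:

```python
import math

# 2x2 counts from the abstract: G6PD activity < 70% (low) vs not (normal).
a, b = 14, 182 - 14    # malaria patients: low activity, normal activity
c, d = 248, 992 - 248  # submicroscopic/no parasitemia: low, normal

odds_ratio = (a * d) / (b * c)  # 0.25, as reported

# Wald 95% CI on the log-odds scale (assumed method; matches 0.14-0.44).
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
log_or = math.log(odds_ratio)
ci = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
```

That the interval excludes 1 is what supports the abstract's claim that low G6PD activity was significantly less common among patients with patent malaria.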
Conclusions
Patients with acute malaria had significantly higher G6PD activity than individuals without malaria, and this could not be accounted for by a protective effect of G6PD deficiency. G6PD-deficient patients with malaria may have higher than expected G6PD enzyme activity and an attenuated risk of primaquine-induced hemolysis compared to the risk when not infected.



PLoS Med: 30 Mar 2021; 18:e1003576 | PMID: 33891581
Abstract

Anomalously warm weather and acute care visits in patients with multiple sclerosis: A retrospective study of privately insured individuals in the US.

Elser H, Parks RM, Moghavem N, Kiang MV, ... Rehkopf DH, Casey JA
Background
As the global climate changes in response to anthropogenic greenhouse gas emissions, weather and temperature are expected to become increasingly variable. Although heat sensitivity is a recognized clinical feature of multiple sclerosis (MS), a chronic demyelinating disorder of the central nervous system, few studies have examined the implications of climate change for patients with this disease.
Methods and findings
We conducted a retrospective cohort study of individuals with MS ages 18-64 years in a nationwide United States patient-level commercial and Medicare Advantage claims database from 2003 to 2017. We defined anomalously warm weather as any month in which local average temperatures exceeded the long-term average by ≥1.5°C. We estimated the association between anomalously warm weather and MS-related inpatient, outpatient, and emergency department visits using generalized log-linear models. From 75,395,334 individuals, we identified 106,225 with MS. The majority were women (76.6%) aged 36-55 years (59.0%). Anomalously warm weather was associated with increased risk for emergency department visits (risk ratio [RR] = 1.043, 95% CI: 1.025-1.063) and inpatient visits (RR = 1.032, 95% CI: 1.010-1.054). There was limited evidence of an association between anomalously warm weather and MS-related outpatient visits (RR = 1.010, 95% CI: 1.005-1.015). Estimates were similar for men and women, strongest among older individuals, and exhibited substantial variation by season, region, and climate zone. Limitations of the present study include the absence of key individual-level measures of socioeconomic position (i.e., race/ethnicity, occupational status, and housing quality) that may determine where individuals live, and therefore the extent of their exposure to anomalously warm weather, as well as their propensity to seek treatment for neurologic symptoms.
Conclusions
Our findings suggest that as global temperatures rise, individuals with MS may represent a particularly susceptible subpopulation, a finding with implications for both healthcare providers and systems.



PLoS Med: 30 Mar 2021; 18:e1003580 | PMID: 33901187
Abstract

Dynamics of sputum conversion during effective tuberculosis treatment: A systematic review and meta-analysis.

Calderwood CJ, Wilson JP, Fielding KL, Harris RC, ... Theodorou H, Moore DAJ
Background
Two weeks' isolation is widely recommended for people commencing treatment for pulmonary tuberculosis (TB). The evidence that this corresponds to clearance of potentially infectious tuberculous mycobacteria in sputum is not well established. This World Health Organization-commissioned review investigated sputum sterilisation dynamics during TB treatment.
Methods and findings
For the main analysis, 2 systematic literature searches of OvidSP MEDLINE, Embase, and Global Health, and EBSCO CINAHL Plus were conducted to identify studies with data on TB infectiousness (all studies to search date, 1 December 2017) and all randomised controlled trials (RCTs) for drug-susceptible TB (from 1 January 1990 to search date, 20 February 2018). Included articles reported on patients receiving effective treatment for culture-confirmed drug-susceptible pulmonary TB. The outcome of interest was sputum bacteriological conversion: the proportion of patients having converted by a defined time point or a summary measure of time to conversion, assessed by smear or culture. Any study design with 10 or more participants was considered. Record sifting and data extraction were performed in duplicate. Random effects meta-analyses were performed. A narrative summary additionally describes the results of a systematic search for data evaluating infectiousness from humans to experimental animals (PubMed, all studies to 27 March 2018). Other evidence on duration of infectiousness, including studies reporting on cough dynamics, human tuberculin skin test conversion, or early bactericidal activity of TB treatments, was outside the scope of this review. The literature search was repeated on 22 November 2020, at the request of the editors, to identify studies published after the previous censor date. Four small studies reporting 3 different outcome measures were identified, which included no data that would alter the findings of the review; they are not included in the meta-analyses. Of 5,290 identified records, 44 were included. Twenty-seven (61%) were RCTs and 17 (39%) were cohort studies. Thirteen studies (30%) reported data from Africa, 12 (27%) from Asia, 6 (14%) from South America, 5 (11%) from North America, and 4 (9%) from Europe. Four studies reported data from multiple continents.
Summary estimates suggested smear conversion in 9% of patients at 2 weeks (95% CI 3%-24%, N = 1), and 82% of patients at 2 months of treatment (95% CI 78%-86%, N = 10). Among baseline smear-positive patients, solid culture conversion occurred by 2 weeks in 5% (95% CI 0%-14%, N = 2), increasing to 88% at 2 months (95% CI 84%-92%, N = 20). At equivalent time points, liquid culture conversion was achieved in 3% (95% CI 1%-16%, N = 1) and 59% (95% CI 47%-70%, N = 8). Significant heterogeneity was observed. Further interrogation of the data to explain this heterogeneity was limited by the lack of disaggregation of results, including by factors such as HIV status, baseline smear status, and the presence or absence of lung cavitation.
Conclusions
This systematic review found that most patients remained culture positive at 2 weeks of TB treatment, challenging the view that individuals are not infectious after this interval. Culture positivity is, however, only 1 component of infectiousness, with reduced cough frequency and aerosol generation after TB treatment initiation likely to also be important. Studies that integrate our findings with data on cough dynamics could provide a more complete perspective on potential transmission of Mycobacterium tuberculosis by individuals on treatment.
Trial registration
Systematic review registration: PROSPERO 85226.



PLoS Med: 30 Mar 2021; 18:e1003566 | PMID: 33901173
