PLoS Medicine

PLOS Medicine: New Articles
A Peer-Reviewed Open-Access Journal

The cardiovascular effects of amodiaquine and structurally related antimalarials: An individual patient data meta-analysis

Tue, 07/09/2021 - 16:00

by Xin Hui S. Chan, Ilsa L. Haeusler, Yan Naung Win, James Pike, Borimas Hanboonkunupakarn, Maryam Hanafiah, Sue J. Lee, Abdoulaye Djimdé, Caterina I. Fanello, Jean-René Kiechel, Marcus VG Lacerda, Bernhards Ogutu, Marie A. Onyamboko, André M. Siqueira, Elizabeth A. Ashley, Walter RJ Taylor, Nicholas J. White

Background

Amodiaquine is a 4-aminoquinoline antimalarial similar to chloroquine that is used extensively for the treatment and prevention of malaria. Data on the cardiovascular effects of amodiaquine are scarce, although transient effects on cardiac electrophysiology (electrocardiographic QT interval prolongation and sinus bradycardia) have been observed. We conducted an individual patient data meta-analysis to characterise the cardiovascular effects of amodiaquine and thereby support development of risk minimisation measures to improve the safety of this important antimalarial.

Methods and findings

Studies of amodiaquine for the treatment or prevention of malaria were identified from a systematic review. Heart rates and QT intervals with study-specific heart rate correction (QTcS) were compared within studies, and individual patient data were pooled for multivariable linear mixed effects regression. The meta-analysis included 2,681 patients from 4 randomised controlled trials evaluating artemisinin-based combination therapies (ACTs) containing amodiaquine (n = 725), lumefantrine (n = 499), piperaquine (n = 716), and pyronaridine (n = 566), as well as monotherapy with chloroquine (n = 175) for uncomplicated malaria. Amodiaquine prolonged QTcS (mean = 16.9 ms, 95% CI: 15.0 to 18.8) less than chloroquine (21.9 ms, 18.3 to 25.6, p = 0.0069) and piperaquine (19.2 ms, 15.8 to 20.5, p = 0.0495), but more than lumefantrine (5.6 ms, 2.9 to 8.2, p < 0.001) and pyronaridine (−1.2 ms, −3.6 to +1.3, p < 0.001). In individuals aged ≥12 years, amodiaquine reduced heart rate (mean reduction = 15.2 beats per minute [bpm], 95% CI: 13.4 to 17.0) more than piperaquine (10.5 bpm, 7.7 to 13.3, p = 0.0013), lumefantrine (9.3 bpm, 6.4 to 12.2, p < 0.001), pyronaridine (6.6 bpm, 4.0 to 9.3, p < 0.001), and chloroquine (5.9 bpm, 3.2 to 8.5, p < 0.001) and was associated with a higher risk of potentially symptomatic sinus bradycardia (≤50 bpm) than lumefantrine (risk difference: 14.8%, 95% CI: 5.4 to 24.3, p = 0.0021) and chloroquine (risk difference: 8.0%, 95% CI: 4.0 to 12.0, p < 0.001). The effect of amodiaquine on the heart rate of children aged <12 years, compared with other antimalarials, was not clinically significant. Study limitations include the unavailability of individual patient-level adverse event data for most included participants, but no serious complications were documented.
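
The abstract names a study-specific heart rate correction (QTcS) without defining it; a common construction fits the exponent α of QTc = QT/RR^α by regressing log QT on log RR within each study's drug-free baseline recordings. A minimal sketch of that idea, with entirely hypothetical data:

```python
import numpy as np

def qtcs_exponent(qt_ms, rr_s):
    """Fit the exponent alpha in QT = beta * RR**alpha by least squares
    on the log-log scale (one fit per study)."""
    alpha, _log_beta = np.polyfit(np.log(rr_s), np.log(qt_ms), 1)
    return alpha

def qtcs(qt_ms, rr_s, alpha):
    """Study-specific heart-rate-corrected QT: QTcS = QT / RR**alpha."""
    return qt_ms / rr_s**alpha

# Hypothetical drug-free baseline recordings: QT in ms, RR interval in seconds.
qt = np.array([360.0, 380.0, 400.0, 420.0])
rr = np.array([0.70, 0.80, 0.90, 1.00])
a = qtcs_exponent(qt, rr)
print(a)                   # fitted exponent for this toy study
print(qtcs(qt, rr, a))     # corrected QT, roughly constant across heart rates
```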

Conclusions

While caution is advised when using amodiaquine in patients aged ≥12 years who take concomitant heart rate–reducing medications or who have serious cardiac conduction disorders or risk factors for torsade de pointes, no serious cardiovascular events have been reported with amodiaquine in widespread use over 7 decades. Amodiaquine and structurally related antimalarials at the World Health Organization (WHO)-recommended doses alone or in ACTs are safe for the treatment and prevention of malaria.

Call for emergency action to limit global temperature increases, restore biodiversity, and protect health

Tue, 07/09/2021 - 16:00

by Lukoye Atwoli, Abdullah H. Baqui, Thomas Benfield, Raffaella Bosurgi, Fiona Godlee, Stephen Hancocks, Richard Horton, Laurie Laybourn-Langton, Carlos Augusto Monteiro, Ian Norman, Kirsten Patrick, Nigel Praities, Marcel GM Olde Rikkert, Eric J. Rubin, Peush Sahni, Richard Smith, Nick Talley, Sue Turale, Damián Vázquez

Derivation and external validation of a risk score for predicting HIV-associated tuberculosis to support case finding and preventive therapy scale-up: A cohort study

Tue, 07/09/2021 - 16:00

by Andrew F. Auld, Andrew D. Kerkhoff, Yasmeen Hanifa, Robin Wood, Salome Charalambous, Yuliang Liu, Tefera Agizew, Anikie Mathoma, Rosanna Boyd, Anand Date, Ray W. Shiraishi, George Bicego, Unami Mathebula-Modongo, Heather Alexander, Christopher Serumola, Goabaone Rankgoane-Pono, Pontsho Pono, Alyssa Finlay, James C. Shepherd, Tedd V. Ellerbrock, Alison D. Grant, Katherine Fielding

Background

Among people living with HIV (PLHIV), more flexible and sensitive tuberculosis (TB) screening tools capable of detecting both symptomatic and subclinical active TB are needed to (1) reduce morbidity and mortality from undiagnosed TB; (2) facilitate scale-up of tuberculosis preventive therapy (TPT) while reducing inappropriate prescription of TPT to PLHIV with subclinical active TB; and (3) allow for differentiated HIV–TB care.

Methods and findings

We used Botswana XPRES trial data for adult HIV clinic enrollees collected during 2012 to 2015 to develop a parsimonious multivariable prognostic model for active prevalent TB using both logistic regression and random forest machine learning approaches. A clinical score was derived by rescaling the final model coefficients. The clinical score was developed using southern Botswana XPRES data, and its accuracy was validated internally using northern Botswana data and externally using 3 diverse cohorts of antiretroviral therapy (ART)-naive and ART-experienced PLHIV enrolled in the XPHACTOR, TB Fast Track (TBFT), and Gugulethu studies from South Africa (SA). Predictive accuracy of the clinical score was compared with the World Health Organization (WHO) 4-symptom TB screen. Among 5,418 XPRES enrollees, 2,771 were included in the derivation dataset; 67% were female, median age was 34 years, median CD4 was 240 cells/μL, 189 (7%) had undiagnosed prevalent TB, and characteristics were similar between internal derivation and validation datasets. Among the XPHACTOR, TBFT, and Gugulethu cohorts, median CD4 was 400, 73, and 167 cells/μL, and prevalence of TB was 5%, 10%, and 18%, respectively. Factors predictive of TB in the derivation dataset and selected for the clinical score included male sex (1 point), ≥1 WHO TB symptom (7 points), smoking history (1 point), temperature >37.5°C (6 points), body mass index (BMI) <18.5 kg/m2 (2 points), and severe anemia (hemoglobin <8 g/dL) (3 points). Sensitivity of the WHO 4-symptom TB screen was 73%, 80%, 94%, and 94% in the XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively, but increased to 88%, 87%, 97%, and 97% when a clinical score of ≥2 was used. Negative predictive value (NPV) also increased by 1%, 0.3%, 1.6%, and 1.7% in the XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively, when the clinical score of ≥2 replaced the WHO 4-symptom TB screen. Categorizing risk scores into low- (<2), moderate- (2 to 10), and high-risk (>10) categories yielded TB prevalence of 1%, 1%, 2%, and 6% in the lowest risk group and 33%, 22%, 26%, and 32% in the highest risk group for the XPRES, XPHACTOR, TBFT, and Gugulethu cohorts, respectively. At a clinical score of ≥2, the number needed to screen (NNS) ranged from 5.0 in Gugulethu to 11.0 in XPHACTOR. Limitations include that the risk score has not been validated in resource-rich settings and needs further evaluation and validation in contemporary cohorts in Africa and other resource-constrained settings.
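
The published point values make the score directly computable. A small sketch encoding the abstract's predictors and risk categories (argument names are hypothetical):

```python
def tb_risk_score(male, any_who_symptom, smoking_history,
                  temp_c, bmi, hemoglobin_g_dl):
    """Clinical score using the point values reported in the abstract."""
    score = 0
    score += 1 if male else 0
    score += 7 if any_who_symptom else 0       # >=1 WHO TB symptom
    score += 1 if smoking_history else 0
    score += 6 if temp_c > 37.5 else 0
    score += 2 if bmi < 18.5 else 0
    score += 3 if hemoglobin_g_dl < 8 else 0   # severe anemia
    return score

def risk_category(score):
    """Low (<2), moderate (2 to 10), high (>10), per the abstract's cutoffs."""
    if score < 2:
        return "low"
    return "moderate" if score <= 10 else "high"

s = tb_risk_score(male=True, any_who_symptom=True, smoking_history=False,
                  temp_c=36.8, bmi=17.9, hemoglobin_g_dl=10.2)
print(s, risk_category(s))   # 10 moderate -> would screen at the >=2 cutoff
```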

Conclusions

The simple and feasible clinical score allowed for prioritization of sensitivity and NPV, which could facilitate reductions in mortality from undiagnosed TB and safer administration of TPT during proposed global scale-up efforts. Differentiation of risk by clinical score cutoff allows flexibility in designing differentiated HIV–TB care to maximize impact of available resources.

Altering product placement to create a healthier layout in supermarkets: Outcomes on store sales, customer purchasing, and diet in a prospective matched controlled cluster study

Tue, 07/09/2021 - 16:00

by Christina Vogel, Sarah Crozier, Daniel Penn-Newman, Kylie Ball, Graham Moon, Joanne Lord, Cyrus Cooper, Janis Baird

Background

Previous product placement trials in supermarkets are limited in scope and in the outcome data collected. This study assessed the effects of a healthier supermarket layout on store-level sales, household-level purchasing, and dietary behaviours.

Methods and findings

This is a prospective matched controlled cluster trial with 2 intervention components: (i) new fresh fruit and vegetable sections near store entrances (replacing smaller displays at the back) and frozen vegetables repositioned to the entrance aisle, plus (ii) the removal of confectionery from checkouts and the aisle ends opposite them. In this pilot study, the intervention was implemented for 6 months in 3 discount supermarkets in England. Three control stores were matched on store sales, customer profiles, and neighbourhood deprivation. Women customers aged 18 to 45 years, with loyalty cards, were assigned to the intervention (n = 62) or control group (n = 88) of their primary store. The trial registration number is NCT03518151. Interrupted time series analysis showed that increases in store-level sales of fruits and vegetables were greater in intervention stores than predicted at 3 (1.71 standard deviations (SDs) (95% CI 0.45, 2.96), P = 0.01) and 6 months of follow-up (2.42 SDs (0.22, 4.62), P = 0.03), equivalent to approximately 6,170 and approximately 9,820 extra portions per store, per week, respectively. The proportion of weekly purchasing attributable to fruits and vegetables rose among intervention participants at 3 and 6 months compared to control participants (0.2% versus −3.0%, P = 0.22; 1.7% versus −3.5%, P = 0.05, respectively). Store sales of confectionery were lower in intervention stores than predicted at 3 (−1.05 SDs (−1.98, −0.12), P = 0.03) and 6 months (−1.37 SDs (−2.95, 0.22), P = 0.09), equivalent to approximately 1,359 and approximately 1,575 fewer portions per store, per week, respectively; no differences were observed for confectionery purchasing. Changes in dietary variables were predominantly in the expected direction for health benefit. Intervention implementation was not within the control of the research team, and stores could not be randomised. As a pilot study, the trial was not powered to detect an effect.
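
The abstract reports interrupted time series results without the model specification; a textbook segmented regression includes a pre-intervention trend, a level change, and a post-intervention slope change. A sketch on simulated weekly sales (illustrative only, not the study's model or data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
weeks = np.arange(52)                       # 26 pre- and 26 post-intervention weeks
post = (weeks >= 26).astype(float)          # intervention indicator
time_since = np.where(post == 1, weeks - 26, 0)

# Simulated standardised fruit & vegetable sales with a post-intervention jump.
sales = 0.02 * weeks + 1.5 * post + 0.03 * time_since + rng.normal(0, 0.5, 52)

X = sm.add_constant(np.column_stack([weeks, post, time_since]))
fit = sm.OLS(sales, X).fit()
print(fit.params)   # [intercept, pre-trend, level change, slope change]
```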

Conclusions

Healthier supermarket layouts can improve the nutrition profile of store sales and likely improve household purchasing and dietary quality. Placing fruits and vegetables near store entrances should be considered alongside policies to limit prominent placement of unhealthy foods.

Trial registration

ClinicalTrials.gov NCT03518151 (pre-results)

Assisted reproduction technology and long-term cardiometabolic health in the offspring

Tue, 07/09/2021 - 16:00

by Ronald C. W. Ma, Noel Y. H. Ng, Lai Ping Cheung

Ronald Ma and co-authors discuss Emma Norrman and colleagues’ accompanying research study on the health of children born with assisted reproductive technology.

Cardiovascular disease, obesity, and type 2 diabetes in children born after assisted reproductive technology: A population-based cohort study

Tue, 07/09/2021 - 16:00

by Emma Norrman, Max Petzold, Mika Gissler, Anne Lærke Spangmose, Signe Opdahl, Anna-Karina Henningsen, Anja Pinborg, Aila Tiitinen, Annika Rosengren, Liv Bente Romundstad, Ulla-Britt Wennerholm, Christina Bergh

Background

Some earlier studies have found indications of significant changes in cardiometabolic risk factors in children born after assisted reproductive technology (ART). Most of these studies are based on small cohorts with high risk of selection bias. In this study, we compared the risk of cardiovascular disease, obesity, and type 2 diabetes between singleton children born after ART and singleton children born after spontaneous conception (SC).

Methods and findings

This was a large population-based cohort study of individuals born in Norway, Sweden, Finland, and Denmark between 1984 and 2015. Data were obtained from national ART and medical birth registers and cross-linked with data from national patient registers and other population-based registers in the respective countries. In total, 122,429 children born after ART and 7,574,685 children born after SC were included. Mean (SD) maternal age was 33.9 (4.3) years for ART and 29.7 (5.2) for SC, 67.7% versus 41.8% were primiparous, and 45.2% versus 32.1% had more than 12 years of education. Preterm birth (<37 weeks 0 days) occurred in 7.9% of children born after ART and 4.8% of children born after SC, and 5.7% versus 3.3% had a low birth weight (<2,500 g). Mean (SD) follow-up time was 8.6 (6.2) years for children born after ART and 14.0 (8.6) years for children born after SC. In total, 135 (0.11%), 645 (0.65%), and 18 (0.01%) children born after ART were diagnosed with cardiovascular disease (ischemic heart disease, cardiomyopathy, heart failure, or cerebrovascular disease), obesity, or type 2 diabetes, respectively. The corresponding values were 10,702 (0.14%), 30,308 (0.74%), and 2,919 (0.04%) for children born after SC. In the unadjusted analysis, children born after ART had a significantly higher risk of any cardiovascular disease (hazard ratio [HR] 1.24; 95% CI 1.04–1.48; p = 0.02), obesity (HR 1.13; 95% CI 1.05–1.23; p = 0.002), and type 2 diabetes (HR 1.71; 95% CI 1.08–2.73; p = 0.02). After adjustment, there was no significant difference between children born after ART and children born after SC for any cardiovascular disease (adjusted HR [aHR] 1.02; 95% CI 0.86–1.22; p = 0.80) or type 2 diabetes (aHR 1.31; 95% CI 0.82–2.09; p = 0.25). For any cardiovascular disease, the 95% CI was reasonably narrow, excluding effects of a substantial magnitude, while the 95% CI for type 2 diabetes was wide, not excluding clinically meaningful effects. For obesity, there was a small but significant increased risk among children born after ART (aHR 1.14; 95% CI 1.06–1.23; p = 0.001). Important limitations of the study were the relatively short follow-up time, the limited number of events for some outcomes, and that the outcome obesity is often not considered as a disease and therefore not caught by registers, likely leading to an underestimation of obesity in both children born after ART and children born after SC.

Conclusions

In this study, we observed no difference in the risk of cardiovascular disease or type 2 diabetes between children born after ART and children born after SC. For obesity, there was a small but significant increased risk for children born after ART.

Trial registration number

ISRCTN11780826.

The latent tuberculosis cascade-of-care among people living with HIV: A systematic review and meta-analysis

Tue, 07/09/2021 - 16:00

by Mayara Lisboa Bastos, Luca Melnychuk, Jonathon R. Campbell, Olivia Oxlade, Dick Menzies

Background

Tuberculosis preventive therapy (TPT) reduces TB-related morbidity and mortality in people living with HIV (PLHIV). Cascade-of-care analyses help identify gaps and barriers in care and develop targeted solutions. A previous latent tuberculosis infection (LTBI) cascade-of-care analysis showed only 18% of persons in at-risk populations complete TPT, but a similar analysis for TPT among PLHIV has not been completed. We conducted a meta-analysis to provide this evidence.

Methods and findings

We first screened potential articles from an LTBI cascade-of-care systematic review published in 2016. From this study, we included cohorts that reported a minimum of 25 PLHIV. To identify new cohorts, we used a similar search strategy restricted to PLHIV. The search was conducted in Medline, Embase, Health Star, and LILACS, from January 2014 to February 2021. Two authors independently screened titles and full text and assessed risk of bias using the Newcastle–Ottawa Scale for cohorts and the Cochrane Risk of Bias tool for cluster randomized trials. We meta-analyzed the proportion of PLHIV completing each step of the LTBI cascade-of-care and estimated the cumulative proportion retained. These results were stratified based on cascades-of-care that used or did not use LTBI testing to determine eligibility for TPT. We also performed a narrative synthesis of enablers and barriers of the cascade-of-care identified at different steps of the cascade. A total of 71 cohorts were included, and 70 were meta-analyzed, comprising 94,011 PLHIV. Among the PLHIV included, 35.3% (33,139/94,011) were from the Americas and 29.2% (27,460/94,011) from Africa. Overall, 49.9% (46,903/94,011) were from low- and middle-income countries, median age was 38.0 years [interquartile range (IQR) 34.0 to 43.6], 65.9% (46,328/70,297) were men, 43.6% (29,629/67,947) were treated with antiretroviral therapy (ART), and the median CD4 count was 390 cells/mm3 (IQR 312 to 458). Among the cohorts that did not use LTBI tests, the cumulative proportions of PLHIV starting and completing TPT were 40.9% (95% CI: 39.3% to 42.7%) and 33.2% (95% CI: 31.6% to 34.9%), respectively. Among cohorts that used LTBI tests, the cumulative proportions of PLHIV starting and completing TPT were 60.4% (95% CI: 58.1% to 62.6%) and 41.9% (95% CI: 39.6% to 44.2%), respectively. Completion of TPT was not significantly different in high- compared to low- and middle-income countries. Regardless of LTBI test use, substantial losses in the cascade-of-care occurred before treatment initiation. The integration of HIV and TB care was considered an enabler of the cascade-of-care in multiple cohorts. Key limitations of this systematic review are the observational nature of the included studies, potential selection bias in the population selection, the fact that only 14 cohorts reported all steps of the cascade-of-care, and that barriers/facilitators were not systematically reported in all cohorts.
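
The pooling method is not specified in the abstract; one common choice for proportions is a DerSimonian–Laird random-effects model on the logit scale. A sketch with made-up cohort counts:

```python
import numpy as np

def pooled_proportion(events, totals):
    """DerSimonian-Laird random-effects pool of proportions, logit scale."""
    events, totals = np.asarray(events, float), np.asarray(totals, float)
    p = events / totals
    y = np.log(p / (1 - p))                    # logit-transformed proportions
    v = 1 / events + 1 / (totals - events)     # approximate within-study variance
    w = 1 / v
    q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
    w_star = 1 / (v + tau2)                    # weights with between-study variance
    y_bar = np.sum(w_star * y) / w_star.sum()
    return 1 / (1 + np.exp(-y_bar))            # back-transform to a proportion

# Hypothetical numbers of PLHIV completing TPT in 4 cohorts.
print(pooled_proportion([40, 120, 75, 300], [100, 350, 180, 700]))
```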

Conclusions

Although substantial losses were seen in multiple stages of the cascade-of-care, the cumulative proportion of PLHIV completing TPT was higher than previously reported among other at-risk populations. The use of LTBI testing in PLHIV in low- and middle-income countries was associated with a higher proportion of cohorts initiating TPT and with similar rates of TPT completion.

Changes in maternal risk factors and their association with changes in cesarean sections in Norway between 1999 and 2016: A descriptive population-based registry study

Fri, 03/09/2021 - 16:00

by Ingvild Hersoug Nedberg, Marzia Lazzerini, Ilaria Mariani, Kajsa Møllersen, Emanuelle Pessa Valente, Erik Eik Anda, Finn Egil Skjeldestad

Background

Increases in the proportion of the population with increased likelihood of cesarean section (CS) have been postulated as a driving force behind the rise in CS rates worldwide. The aim of the study was to assess whether changes in selected maternal risk factors for CS are associated with changes in CS births from 1999 to 2016 in Norway.

Methods and findings

This national population-based registry study used data from 1,055,006 births registered in the Norwegian Medical Birth Registry from 1999 to 2016. The following maternal risk factors for CS were included: nulliparous/≥35 years, multiparous/≥35 years, pregestational diabetes, gestational diabetes, hypertensive disorders, previous CS, assisted reproductive technology, and multiple births. The proportion of CS births in 1999 was used to predict the number of CS births in 2016. The observed and predicted numbers of CS births were compared to determine the number of excess CS births, before and after considering the selected risk factors, for all births, and for births stratified by 0, 1, or >1 of the selected risk factors. The proportion of CS births increased from 12.9% to 16.1% (+24.8%) during the study period. The proportion of births with 1 selected risk factor increased from 21.3% to 26.3% (+23.5%), while the proportion with >1 risk factor increased from 4.5% to 8.8% (+95.6%). Stratification by the presence of selected risk factors reduced the number of excess CS births observed in 2016 compared to 1999 by 67.9%. Study limitations include lack of access to other important maternal risk factors and only comparing the first and the last year of the study period.
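
The predicted-versus-observed comparison described here is simple arithmetic once births are stratified by risk-factor count: apply each stratum's 1999 CS proportion to its 2016 births and subtract the total from the observed count. All numbers below are illustrative:

```python
def excess_cs(strata):
    """strata: list of (births_2016, cs_rate_1999, observed_cs_2016) per
    risk-factor stratum (0, 1, or >1 of the selected risk factors)."""
    predicted = sum(n * rate for n, rate, _ in strata)
    observed = sum(obs for _, _, obs in strata)
    return observed - predicted

# Hypothetical 2016 strata: (births, 1999 CS proportion, observed 2016 CS births)
strata = [(40_000, 0.10, 4_400),   # no selected risk factor
          (15_000, 0.18, 2_900),   # exactly 1 risk factor
          (5_000, 0.30, 1_700)]    # more than 1 risk factor
print(excess_cs(strata))           # excess CS births after stratification
```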

Conclusions

In this study, we observed that after an initial increase, the proportion of CS births remained stable from 2005 to 2016. Instead, both the size of the risk population and the mean number of risk factors per birth continued to increase. We observed a possible association between the increase in the size of the risk population and the additional CS births observed in 2016 compared to 1999. The increase in the size of the risk population and the stable CS rate from 2005 onward may indicate consistent adherence to evidence-based obstetric practice in Norway.

Corporate political activity in the context of unhealthy food advertising restrictions across Transport for London: A qualitative case study

Thu, 02/09/2021 - 16:00

by Kathrin Lauber, Daniel Hunt, Anna B. Gilmore, Harry Rutter

Background

Diets with high proportions of foods high in fat, sugar, and/or salt (HFSS) contribute to malnutrition and rising rates of childhood obesity, with effects throughout the life course. Given compelling evidence on the detrimental impact HFSS advertising has on children’s diets, the World Health Organization unequivocally supports the adoption of restrictions on HFSS marketing and advertising. In February 2019, the Greater London Authority introduced novel restrictions on HFSS advertising across Transport for London (TfL), one of the most valuable out-of-home advertising estates. In this study, we examined whether and how commercial actors attempted to influence the development of these advertising restrictions.

Methods and findings

Using requests under the Freedom of Information Act, we obtained industry responses to the London Food Strategy consultation, correspondence between officials and key industry actors, and information on meetings. We used an existing model of corporate political activity, the Policy Dystopia Model, to systematically analyse arguments and activities used to counter the policy. The majority of food and advertising industry consultation respondents opposed the proposed advertising restrictions, many promoting voluntary approaches instead. Industry actors who supported the policy were predominantly smaller businesses. To oppose the policy, industry respondents deployed a range of strategies. They exaggerated potential costs and underplayed potential benefits of the policy, for instance, warning of negative economic consequences and questioning the evidence underlying the proposal. Despite challenging the evidence for the policy, they offered little evidence in support of their own claims. Commercial actors had significant access to the policy process and officials through the consultation and numerous meetings, yet attempted to increase access further, for example, by applying to join the London Child Obesity Taskforce and inviting its members to events. They also employed coalition management, engaging directly and through business associations to amplify their arguments. Some advertising industry actors also raised the potential of legal challenges. The key limitation of this study is that our data focused on industry–policymaker interactions; thus, our findings cannot present a comprehensive picture of political activity.

Conclusions

In this study, we identified substantial opposition from food and advertising industry actors to the TfL advertising restrictions. We mapped arguments and activities used to oppose the policy, which might help other public authorities anticipate industry efforts to prevent similar restrictions in HFSS advertising. Given the potential consequences of commercial influence in these kinds of policy spaces, public bodies should consider how they engage with industry actors.

Risk of a permanent work-related disability pension after incident venous thromboembolism in Denmark: A population-based cohort study

Tue, 31/08/2021 - 16:00

by Helle Jørgensen, Erzsébet Horváth-Puhó, Kristina Laugesen, Sigrid Brækkan, John-Bjarne Hansen, Henrik Toft Sørensen

Background

Long-term complications of venous thromboembolism (VTE) hamper physical function and impair quality of life; still, it remains unclear whether VTE is associated with risk of permanent work-related disability. We aimed to assess the association between VTE and the risk of receiving a permanent work-related disability pension and to assess whether this association was explained by comorbidities such as cancer and arterial cardiovascular disease.

Methods and findings

A Danish nationwide population-based cohort study consisting of 43,769 individuals aged 25 to 66 years with incident VTE during 1995 to 2016 and 218,845 birth year-, sex-, and calendar year-matched individuals from the general population, among whom 45.9% (N = 120,540) were women, was established using Danish national registries. The cohorts were followed throughout 2016, with permanent work-related disability pension as the outcome. Hazard ratios (HRs) with 95% confidence intervals (CIs) for disability pension were computed and stratified by sex and age group (25 to 34, 35 to 44, 45 to 54, and 55 to 66 years of age) and adjusted for comorbidities and socioeconomic variables. Permanent work-related disability pensions were granted to 4,415 individuals with VTE and 9,237 comparison cohort members (incidence rates = 17.8 and 6.2 per 1,000 person-years, respectively). VTE was associated with a 3-fold (HR 3.0, 95% CI: 2.8 to 3.1) higher risk of receiving a disability pension. Adjustment for socioeconomic status and comorbidities such as cancer and cardiovascular diseases reduced the estimate (HR 2.3, 95% CI: 2.2 to 2.4). The risk of disability pension receipt was slightly higher in men than in women (HR 2.5, 95% CI: 2.3 to 2.6 versus HR 2.1, 95% CI: 2.0 to 2.3). As this study is based on medical and administrative registers, information on post-VTE care, individual health behavior, and workplace factors linked to disability pension in the general population is lacking. Furthermore, as disability pension schemes vary, our results might not be directly generalizable to other countries or time periods.
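
The reported incidence rates and the crude rate ratio can be sanity-checked from the event counts; the person-years below are back-calculated from the published rates, so this is a consistency check rather than a reanalysis:

```python
def incidence_rate(events, person_years, per=1_000):
    """Events per `per` person-years of follow-up."""
    return events / person_years * per

# Person-years implied by the reported events and rates (back-calculated).
py_vte = 4_415 / 17.8 * 1_000          # ~248,000 person-years, VTE cohort
py_cmp = 9_237 / 6.2 * 1_000           # ~1.49 million, comparison cohort
print(incidence_rate(4_415, py_vte))   # 17.8 per 1,000 person-years
print(incidence_rate(9_237, py_cmp))   # 6.2 per 1,000 person-years
print(17.8 / 6.2)                      # crude rate ratio ~2.9, near the HR of 3.0
```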

Conclusions

In this study, incident VTE was associated with increased risk of subsequent permanent work-related disability, and this association was still observed after accounting for comorbidities such as cancer and cardiovascular diseases. Our results emphasize the social consequences of VTE and may help occupational and healthcare professionals to identify vulnerable individuals at risk of permanent exclusion from the labor market after a VTE event.

Building global health research capacity to address research imperatives following the COVID-19 pandemic

Tue, 31/08/2021 - 16:00

by Peter H. Kilmarx, Roger I. Glass

Peter Kilmarx and Roger Glass discuss strengthening health research capabilities as a response to the COVID-19 pandemic.

Utility of ctDNA in predicting response to neoadjuvant chemoradiotherapy and prognosis assessment in locally advanced rectal cancer: A prospective cohort study

Tue, 31/08/2021 - 16:00

by Yaqi Wang, Lifeng Yang, Hua Bao, Xiaojun Fan, Fan Xia, Juefeng Wan, Lijun Shen, Yun Guan, Hairong Bao, Xue Wu, Yang Xu, Yang Shao, Yiqun Sun, Tong Tong, Xinxiang Li, Ye Xu, Sanjun Cai, Ji Zhu, Zhen Zhang

Background

For locally advanced rectal cancer (LARC) patients who receive neoadjuvant chemoradiotherapy (nCRT), there are no reliable indicators to accurately predict pathological complete response (pCR) before surgery. For patients with clinical complete response (cCR), a “Watch and Wait” (W&W) approach can be adopted to improve quality of life. However, the W&W approach may increase the recurrence risk in patients who are judged to be cCR but have minimal residual disease (MRD). Magnetic resonance imaging (MRI) is a major tool for evaluating response to nCRT; however, its ability to predict pCR needs to be improved. In this prospective cohort study, we explored the value of circulating tumor DNA (ctDNA) in combination with MRI in the prediction of pCR before surgery and investigated the utility of ctDNA in risk stratification and prognostic prediction for patients undergoing nCRT and total mesorectal excision (TME).

Methods and findings

We recruited 119 Chinese LARC patients (cT3-4/N0-2/M0; median age of 57; 85 males) who were treated with nCRT plus TME at Fudan University Shanghai Cancer Center (China) from February 7, 2016 to October 31, 2017. Plasma samples at baseline, during nCRT, and after surgery were collected. A total of 531 plasma samples were collected and subjected to deep targeted panel sequencing of 422 cancer-related genes. The associations among ctDNA status, treatment response, and prognosis were analyzed. The performance of ctDNA alone, MRI alone, and ctDNA combined with MRI was evaluated for the ability to predict pCR/non-pCR. Ranging from complete tumor regression (pathological tumor regression grade 0; pTRG0) to poor regression (pTRG3), the ctDNA clearance rate during nCRT showed a significant decreasing trend (95.7%, 77.8%, 71.1%, and 66.7% in the pTRG 0, 1, 2, and 3 groups, respectively, P = 0.008), while the detection rate of acquired mutations in ctDNA showed an increasing trend (3.8%, 8.3%, 19.2%, and 23.1% in the pTRG 0, 1, 2, and 3 groups, respectively, P = 0.02). Univariable logistic regression showed that ctDNA clearance was associated with a low probability of non-pCR (odds ratio = 0.11, 95% confidence interval [95% CI] = 0.01 to 0.6, P = 0.04). A risk score predictive model, which incorporated both ctDNA (i.e., features of baseline ctDNA, ctDNA clearance, and acquired mutation status) and MRI tumor regression grade (mrTRG), was developed and demonstrated improved performance in predicting pCR/non-pCR (area under the curve [AUC] = 0.886, 95% CI = 0.810 to 0.962) compared with models derived from only ctDNA (AUC = 0.818, 95% CI = 0.725 to 0.912) or only mrTRG (AUC = 0.729, 95% CI = 0.641 to 0.816). The detection of potential colorectal cancer (CRC) driver genes in ctDNA after nCRT indicated a significantly worse recurrence-free survival (RFS) (hazard ratio [HR] = 9.29, 95% CI = 3.74 to 23.10, P < 0.001). Patients with detectable driver mutations and a positive high-risk feature (HR_feature) after surgery had the highest recurrence risk (HR = 90.29, 95% CI = 17.01 to 479.26, P < 0.001). Limitations include the relatively small sample size, lack of independent external validation, no serial ctDNA testing after surgery, and a relatively short follow-up period.
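
As a toy illustration of the combined-model idea, the sketch below fits a logistic regression on simulated ctDNA features plus mrTRG and computes an in-sample AUC; the study's actual features, coefficients, and validation scheme are not given in the abstract:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 119
# Hypothetical per-patient features echoing the abstract's predictors:
ctdna_clearance = rng.integers(0, 2, n)       # ctDNA cleared during nCRT (1 = yes)
acquired_mut = rng.integers(0, 2, n)          # acquired mutation detected (1 = yes)
mrtrg = rng.integers(1, 6, n)                 # MRI tumor regression grade 1-5
logit = -1.5 * ctdna_clearance + 1.2 * acquired_mut + 0.6 * mrtrg - 1.0
non_pcr = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # simulated outcome

X = np.column_stack([ctdna_clearance, acquired_mut, mrtrg])
model = LogisticRegression().fit(X, non_pcr)
risk = model.predict_proba(X)[:, 1]           # combined ctDNA + mrTRG risk score
print(round(roc_auc_score(non_pcr, risk), 3)) # in-sample AUC of the combined model
```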

Conclusions

The model combining ctDNA and MRI improved the predictive performance compared with the models derived from individual information, and combining ctDNA with HR_feature can stratify patients with a high risk of recurrence. Therefore, ctDNA can supplement MRI to better predict nCRT response, and it could potentially help patient selection for nonoperative management and guide the treatment strategy for those with different recurrence risks.

Cell-free DNA ultra-low-pass whole genome sequencing to distinguish malignant peripheral nerve sheath tumor (MPNST) from its benign precursor lesion: A cross-sectional study

Tue, 31/08/2021 - 16:00

by Jeffrey J. Szymanski, R. Taylor Sundby, Paul A. Jones, Divya Srihari, Noah Earland, Peter K. Harris, Wenjia Feng, Faridi Qaium, Haiyan Lei, David Roberts, Michele Landeau, Jamie Bell, Yi Huang, Leah Hoffman, Melissa Spencer, Matthew B. Spraker, Li Ding, Brigitte C. Widemann, Jack F. Shern, Angela C. Hirbe, Aadel A. Chaudhuri

Background

The leading cause of mortality for patients with the neurofibromatosis type 1 (NF1) cancer predisposition syndrome is the development of malignant peripheral nerve sheath tumor (MPNST), an aggressive soft tissue sarcoma. In the setting of NF1, this cancer type frequently arises from within its common and benign precursor, plexiform neurofibroma (PN). Transformation from PN to MPNST is challenging to diagnose due to difficulties in distinguishing cross-sectional imaging results and intralesional heterogeneity resulting in biopsy sampling errors.

Methods and findings

This multi-institutional study from the National Cancer Institute and Washington University in St. Louis used fragment size analysis and ultra-low-pass whole genome sequencing (ULP-WGS) of plasma cell-free DNA (cfDNA) to distinguish between MPNST and PN in patients with NF1. Following in silico enrichment for short cfDNA fragments and copy number analysis to estimate the fraction of plasma cfDNA originating from tumor (tumor fraction), we developed a noninvasive classifier that differentiates MPNST from PN with 86% pretreatment accuracy (91% specificity, 75% sensitivity) and 89% accuracy on serial analysis (91% specificity, 83% sensitivity). Healthy controls without NF1 (participants = 16, plasma samples = 16), PN (participants = 23, plasma samples = 23), and MPNST (participants = 14, plasma samples = 46) cohorts showed significant differences in tumor fraction in plasma (P = 0.001) as well as cfDNA fragment length (P < 0.001), with MPNST samples harboring shorter fragments and being enriched for tumor-derived cfDNA relative to PN and healthy controls. No other covariates were significant on multivariate logistic regression. Mutational analysis demonstrated focal NF1 copy number loss in PN and MPNST patient plasma but not in healthy controls. Greater genomic instability, including alterations associated with malignant transformation (focal copy number gains in chromosome arms 1q, 7p, 8q, 9q, and 17q; focal copy number losses in SUZ12, SMARCA2, CDKN2A/B, and chromosome arms 6p and 9p), was more prominently observed in MPNST plasma. Furthermore, the sum of longest tumor diameters (SLD) visualized by cross-sectional imaging correlated significantly with paired tumor fractions in plasma from MPNST patients (r = 0.39, P = 0.024). On serial analysis, tumor fraction levels in plasma dynamically correlated with treatment response to therapy and minimal residual disease (MRD) detection before relapse. Study limitations include a modest MPNST sample size, despite accrual from 2 major referral centers for this rare malignancy, and a lack of uniform treatment and imaging protocols, reflecting a real-world cohort.
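
The "in silico enrichment for short cfDNA fragments" is commonly implemented as a fragment-length filter applied before copy number analysis, since tumor-derived cfDNA tends to be shorter than the mononucleosomal background. A minimal sketch; the size window is an assumption, not necessarily the study's:

```python
import numpy as np

def short_fragment_enrichment(fragment_lengths, low=90, high=150):
    """Keep cfDNA fragments in a short-size window (~90-150 bp is a commonly
    used enrichment window; the bounds here are an assumption). Returns the
    retained lengths and the retained fraction."""
    lengths = np.asarray(fragment_lengths)
    kept = lengths[(lengths >= low) & (lengths <= high)]
    return kept, len(kept) / len(lengths)

# Hypothetical fragment sizes (bp) from paired-end sequencing of plasma cfDNA.
rng = np.random.default_rng(1)
sizes = rng.normal(167, 20, 10_000).astype(int)   # mononucleosomal peak ~167 bp
kept, frac = short_fragment_enrichment(sizes)
print(f"retained {frac:.1%} of fragments for copy number analysis")
```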

Conclusions

Tumor fraction levels derived from cfDNA fragment size and copy number alteration analysis of plasma cfDNA using ULP-WGS significantly correlated with MPNST tumor burden, accurately distinguished MPNST from its benign PN precursor, and dynamically correlated with treatment response. In the future, our findings could form the basis for improved early cancer detection and monitoring in high-risk cancer-predisposed populations.

Urine tumor DNA detection of minimal residual disease in muscle-invasive bladder cancer treated with curative-intent radical cystectomy: A cohort study

Tue, 31/08/2021 - 16:00

by Pradeep S. Chauhan, Kevin Chen, Ramandeep K. Babbra, Wenjia Feng, Nadja Pejovic, Armaan Nallicheri, Peter K. Harris, Katherine Dienstbach, Andrew Atkocius, Lenon Maguire, Faridi Qaium, Jeffrey J. Szymanski, Brian C. Baumann, Li Ding, Dengfeng Cao, Melissa A. Reimers, Eric H. Kim, Zachary L. Smith, Vivek K. Arora, Aadel A. Chaudhuri

Background

The standard of care treatment for muscle-invasive bladder cancer (MIBC) is radical cystectomy, which is typically preceded by neoadjuvant chemotherapy. However, the inability to assess minimal residual disease (MRD) noninvasively limits our ability to offer bladder-sparing treatment. Here, we sought to develop a liquid biopsy solution via urine tumor DNA (utDNA) analysis.

Methods and findings

We applied urine Cancer Personalized Profiling by Deep Sequencing (uCAPP-Seq), a targeted next-generation sequencing (NGS) method for detecting utDNA, to urine cell-free DNA (cfDNA) samples acquired between April 2019 and November 2020 on the day of curative-intent radical cystectomy from 42 patients with localized bladder cancer. The average age of patients was 69 years (range: 50 to 86), of whom 76% (32/42) were male, 64% (27/42) were smokers, and 76% (32/42) had a confirmed diagnosis of MIBC. Among MIBC patients, 59% (19/32) received neoadjuvant chemotherapy. utDNA variant calling was performed noninvasively without prior sequencing of tumor tissue. The overall utDNA level for each patient was represented by the non-silent mutation with the highest variant allele fraction after removing germline variants. Urine was similarly analyzed from 15 healthy adults. utDNA analysis revealed a median utDNA level of 0% in healthy adults and 2.4% in bladder cancer patients. When patients were classified as those who had residual disease detected in their surgical sample (n = 16) compared to those who achieved a pathologic complete response (pCR; n = 26), median utDNA levels were 4.3% vs. 0%, respectively (p = 0.002). Using an optimal utDNA threshold to define MRD detection, positive utDNA MRD detection was highly correlated with the absence of pCR (p < 0.001) with a sensitivity of 81% and specificity of 81%. Leave-one-out cross-validation applied to the prediction of pathologic response based on utDNA MRD detection in our cohort yielded a highly significant accuracy of 81% (p = 0.007). Moreover, utDNA MRD–positive patients exhibited significantly worse progression-free survival (PFS; HR = 7.4; 95% CI: 1.4–38.9; p = 0.02) compared to utDNA MRD–negative patients. Concordance between urine- and tumor-derived mutations, determined in 5 MIBC patients, was 85%. Tumor mutational burden (TMB) in utDNA MRD–positive patients was inferred from the number of non-silent mutations detected in urine cfDNA by applying a linear relationship derived from The Cancer Genome Atlas (TCGA) whole exome sequencing of 409 MIBC tumors. We suggest that about 58% of these patients with high inferred TMB might have been candidates for treatment with early immune checkpoint blockade. Study limitations included an analysis restricted to single-nucleotide variants (SNVs), survival differences diminished by surgery, and a low number of DNA damage response (DDR) mutations detected after neoadjuvant chemotherapy at the MRD time point.
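
The abstract's utDNA definition, the highest variant allele fraction among non-silent mutations after germline removal, translates directly into code. A sketch with hypothetical variant records (the MRD threshold value is an assumption; the study's optimal cutoff is not reported here):

```python
def utdna_level(variants):
    """Per the abstract: the overall utDNA level is the highest variant
    allele fraction (VAF) among non-silent mutations after removing
    germline variants. `variants` is a list of dicts (keys hypothetical)."""
    somatic = [v for v in variants if not v["germline"] and not v["silent"]]
    return max((v["vaf"] for v in somatic), default=0.0)

def mrd_positive(variants, threshold):
    """MRD call at a chosen utDNA threshold (value is an assumption)."""
    return utdna_level(variants) >= threshold

calls = [{"vaf": 0.031, "germline": False, "silent": False},
         {"vaf": 0.450, "germline": True,  "silent": False},   # germline: dropped
         {"vaf": 0.012, "germline": False, "silent": True}]    # silent: dropped
print(utdna_level(calls), mrd_positive(calls, threshold=0.024))
```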

Conclusions

utDNA MRD detection prior to curative-intent radical cystectomy for bladder cancer correlated significantly with pathologic response, which may help select patients for bladder-sparing treatment. utDNA MRD detection also correlated significantly with PFS. Furthermore, utDNA can be used to noninvasively infer TMB, which could facilitate personalized immunotherapy for bladder cancer in the future.

Combining simple blood tests to identify primary care patients with unexpected weight loss for cancer investigation: Clinical risk score development, internal validation, and net benefit analysis

Tue, 31/08/2021 - 16:00

by Brian D. Nicholson, Paul Aveyard, Constantinos Koshiaris, Rafael Perera, Willie Hamilton, Jason Oke, F. D. Richard Hobbs

Background

Unexpected weight loss (UWL) is a presenting feature of cancer in primary care. Existing research proposes simple combinations of clinical features (risk factors, symptoms, signs, and blood test data) that, when present, warrant cancer investigation. More complex combinations may modify cancer risk sufficiently to rule out the need for investigation. We aimed to identify which clinical features can be used together to stratify patients with UWL based on their risk of cancer.

Methods and findings

We used data from 63,973 adults (age: mean 59 years, standard deviation 21 years; 42% male) to predict cancer in patients with UWL recorded in a large, representative United Kingdom primary care electronic health record between January 1, 2000 and December 31, 2012. We derived 3 clinical prediction models using logistic regression and backwards stepwise covariate selection: Sm, a symptoms-only model; STm, a symptoms and tests model; and Tm, a tests-only model. Fifty imputations replaced missing data. Estimates of discrimination and calibration were derived using 10-fold internal cross-validation. Simple clinical risk scores are presented for the models with the greatest clinical utility in decision curve analysis. The STm and Tm showed improved discrimination (area under the curve ≥ 0.91), calibration, and greater clinical utility than the Sm. The Tm was simplest, including age group, sex, albumin, alkaline phosphatase, liver enzymes, C-reactive protein, haemoglobin, platelets, and total white cell count. A Tm score of 5 balanced ruling in (sensitivity 84.0%, positive likelihood ratio 5.36) and ruling out (specificity 84.3%, negative likelihood ratio 0.19) further cancer investigation. A Tm score of 1 prioritised ruling out (sensitivity 97.5%). At this threshold, 35 people presenting with UWL in primary care would be referred for investigation for each person with cancer referred, and 1,730 people would be spared referral for each person with cancer not referred. Study limitations include using a retrospective routinely collected dataset, a reliance on coding to identify UWL, and missing data for some predictors.
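
The reported likelihood ratios follow directly from the sensitivity and specificity at the score-5 threshold, which makes for a quick consistency check:

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

lr_pos, lr_neg = likelihood_ratios(0.840, 0.843)   # Tm score threshold of 5
print(round(lr_pos, 2), round(lr_neg, 2))          # ~5.35 and 0.19, matching
                                                   # the abstract to rounding
```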

Conclusions

Our findings suggest that combinations of simple blood test abnormalities could be used to identify patients with UWL who warrant referral for investigation, while people with combinations of normal results could be exempted from referral.

Obesity and the relation between joint exposure to ambient air pollutants and incident type 2 diabetes: A cohort study in UK Biobank

Mon, 30/08/2021 - 16:00

by Xiang Li, Mengying Wang, Yongze Song, Hao Ma, Tao Zhou, Zhaoxia Liang, Lu Qi

Background

Air pollution has been related to incidence of type 2 diabetes (T2D). We assessed the joint association of various air pollutants with the risk of T2D and examined potential modification by obesity status and genetic susceptibility on the relationship.

Methods and findings

A total of 449,006 participants from UK Biobank who were free of T2D at baseline were included. Of the study population, 90.9% were white and 45.7% were male. The participants had a mean age of 56.6 (SD 8.1) years and a mean body mass index (BMI) of 27.4 (SD 4.8) kg/m2. Ambient air pollutants, including particulate matter (PM) with diameters ≤2.5 μm (PM2.5) and between 2.5 μm and 10 μm (PM2.5–10), nitrogen dioxide (NO2), and nitric oxide (NO), were measured. An air pollution score was created to assess the joint exposure to the 4 air pollutants. During a median of 11 years of follow-up, we documented 18,239 incident T2D cases. The air pollution score was significantly associated with a higher risk of T2D. Compared to the lowest quintile of the air pollution score, the hazard ratio (HR) (95% confidence interval [CI]) for T2D was 1.05 (0.99 to 1.10, p = 0.11), 1.06 (1.00 to 1.11, p = 0.051), 1.09 (1.03 to 1.15, p = 0.002), and 1.12 (1.06 to 1.19, p < 0.001) for the second to fifth quintiles, respectively, after adjustment for sociodemographic characteristics, lifestyle factors, genetic factors, and other covariates. In addition, we found a significant interaction between the air pollution score and obesity status on the risk of T2D (p-interaction < 0.001). The observed association was more pronounced among overweight and obese participants than among normal-weight people. Genetic risk score (GRS) for T2D or obesity did not modify the relationship between air pollution and risk of T2D. Key study limitations include unavailable data on other potential T2D-related air pollutants and single-time measurement of air pollutants.
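
The abstract does not describe how the air pollution score was built; one common construction weights each standardised pollutant concentration by an effect-size estimate and sums. The sketch below is purely illustrative of that approach and is not the authors' method:

```python
import numpy as np

def air_pollution_score(conc, coef):
    """Weighted sum of standardised pollutant concentrations. This
    construction is an assumption for illustration; the abstract does
    not specify how the study's score was derived."""
    z = (conc - conc.mean(axis=0)) / conc.std(axis=0)   # standardise per pollutant
    return z @ coef

# Columns: PM2.5, PM2.5-10, NO2, NO (hypothetical concentrations and weights).
conc = np.array([[10.1, 6.2, 28.0, 16.5],
                 [12.3, 7.0, 35.2, 22.1],
                 [ 9.4, 5.8, 24.1, 14.0]])
coef = np.array([0.4, 0.1, 0.3, 0.2])     # e.g., single-pollutant effect sizes
print(air_pollution_score(conc, coef))    # higher = greater joint exposure
```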

Conclusions

We found that the air pollutants PM2.5, PM2.5–10, NO2, and NO, individually or jointly, were associated with an increased risk of T2D in this population. The stratified analyses indicate that these associations were stronger among those with higher adiposity.

Public preferences for delayed or immediate antibiotic prescriptions in UK primary care: A choice experiment

Mon, 30/08/2021 - 16:00

by Liz Morrell, James Buchanan, Laurence S. J. Roope, Koen B. Pouwels, Christopher C. Butler, Benedict Hayhoe, Sarah Tonkin-Crine, Monsey McLeod, Julie V. Robotham, Alison Holmes, A. Sarah Walker, Sarah Wordsworth, STEPUP team

Background

Delayed (or “backup”) antibiotic prescription, where the patient is given a prescription but advised to delay initiating antibiotics, has been shown to be effective in reducing antibiotic use in primary care. However, this strategy is not widely used in the United Kingdom. This study aimed to identify factors influencing preferences among the UK public for delayed prescription, and understand their relative importance, to help increase appropriate use of this prescribing option.

Methods and findings

We conducted an online choice experiment in 2 UK general population samples: adults and parents of children under 18 years. Respondents were presented with 12 scenarios in which they, or their child, might need antibiotics for a respiratory tract infection (RTI) and asked to choose either an immediate or a delayed prescription. Scenarios were described by 7 attributes. Data were collected between November 2018 and February 2019. Respondent preferences were modelled using mixed-effects logistic regression. The survey was completed by 802 adults and 801 parents (75% of those who opened the survey). The samples reflected the UK population in age, sex, ethnicity, and country of residence. The most important determinant of respondent choice was symptom severity, especially for cough-related symptoms. In the adult sample, the probability of choosing delayed prescription was 0.53 (95% confidence interval (CI) 0.50 to 0.56, p < 0.001) for a chesty cough and runny nose compared to 0.30 (0.28 to 0.33, p < 0.001) for a chesty cough with fever, 0.47 (0.44 to 0.50, p < 0.001) for sore throat with swollen glands, and 0.37 (0.34 to 0.39, p < 0.001) for sore throat, swollen glands, and fever. Respondents were less likely to choose delayed prescription with increasing duration of illness (odds ratio (OR) 0.94 (0.92 to 0.96, p < 0.001)). Probabilities of choosing delayed prescription were similar for parents considering treatment for a child (44% of choices versus 42% for adults, p = 0.04). However, parents differed from the adult sample in showing a more marked reduction in choice of the delayed prescription with increasing duration of illness (OR 0.83 (0.80 to 0.87) versus 0.94 (0.92 to 0.96) for adults, p for heterogeneity < 0.001) and a smaller effect of disruption of usual activities (OR 0.96 (0.95 to 0.97) versus 0.93 (0.92 to 0.94) for adults, p for heterogeneity < 0.001). Females were more likely to choose a delayed prescription than males for minor symptoms, particularly minor cough (probability 0.62 (0.58 to 0.66, p < 0.001) for females and 0.45 (0.41 to 0.48, p < 0.001) for males). Older people, those with a good understanding of antibiotics, and those who had not used antibiotics recently showed similar patterns of preferences. Study limitations include its hypothetical nature, which may not reflect real-life behaviour; the absence of a “no prescription” option; and the possibility that study respondents may not represent the views of population groups who are typically underrepresented in online surveys.
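
The odds ratios reported here can be translated into choice probabilities; for instance, applying the adult OR of 0.94 per extra day of illness to the 0.53 baseline for a chesty cough and runny nose. This is back-of-envelope arithmetic on the published estimates, not the fitted mixed logit:

```python
def prob_after_or(p0, odds_ratio, delta):
    """Shift a baseline probability by an odds ratio applied `delta` times."""
    odds = p0 / (1 - p0) * odds_ratio**delta
    return odds / (1 + odds)

p0 = 0.53            # delayed-prescription probability, chesty cough + runny nose
for extra_days in (0, 3, 7):
    print(extra_days, round(prob_after_or(p0, 0.94, extra_days), 2))
# 0 -> 0.53, 3 -> 0.48, 7 -> 0.42: longer illness, less appetite for delay
```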

Conclusions

This study found that delayed prescription appears to be an acceptable approach to reducing antibiotic consumption. Certain groups appear to be more amenable to delayed prescription, suggesting particular opportunities for increased use of this strategy. Prescribing choices for sore throat may need additional explanation to ensure patient acceptance, and parents in particular may benefit from reassurance about the usual duration of these illnesses.

Long-term cost-effectiveness of interventions for obesity: A mendelian randomisation study

Fri, 27/08/2021 - 16:00

by Sean Harrison, Padraig Dixon, Hayley E. Jones, Alisha R. Davies, Laura D. Howe, Neil M. Davies

Background

The prevalence of obesity has increased in the United Kingdom, and reliably measuring the impact on quality of life and the total healthcare cost from obesity is key to informing the cost-effectiveness of interventions that target obesity, and determining healthcare funding. Current methods for estimating cost-effectiveness of interventions for obesity may be subject to confounding and reverse causation. The aim of this study is to apply a new approach using mendelian randomisation for estimating the cost-effectiveness of interventions that target body mass index (BMI), which may be less affected by confounding and reverse causation than previous approaches.

Methods and findings

We estimated health-related quality-adjusted life years (QALYs) and both primary and secondary healthcare costs for 310,913 men and women of white British ancestry aged between 39 and 72 years in UK Biobank between recruitment (2006 to 2010) and 31 March 2017. We then estimated the causal effect of differences in BMI on QALYs and total healthcare costs using mendelian randomisation. For this, we used instrumental variable regression with a polygenic risk score (PRS) for BMI, derived using a genome-wide association study (GWAS) of BMI, with age, sex, recruitment centre, and 40 genetic principal components as covariables to estimate the effect of a unit increase in BMI on QALYs and total healthcare costs. Finally, we used simulations to estimate the likely effect on BMI of policy-relevant interventions for BMI, then used the mendelian randomisation estimates to estimate the cost-effectiveness of these interventions. A unit increase in BMI decreased QALYs by 0.65% of a QALY (95% confidence interval [CI]: 0.49% to 0.81%) per year and increased annual total healthcare costs by £42.23 (95% CI: £32.95 to £51.51) per person. When considering only health conditions usually considered in previous cost-effectiveness modelling studies (cancer, cardiovascular disease, cerebrovascular disease, and type 2 diabetes), we estimated that a unit increase in BMI decreased QALYs by only 0.16% of a QALY (95% CI: 0.10% to 0.22%) per year. We estimated that both laparoscopic bariatric surgery among individuals with BMI greater than 35 kg/m2, and restricting volume promotions for high fat, salt, and sugar products, would increase QALYs and decrease total healthcare costs, with net monetary benefits (at £20,000 per QALY) of £13,936 (95% CI: £8,112 to £20,658) per person over 20 years, and £546 million (95% CI: £435 million to £671 million) in total per year, respectively. The main limitations of this approach are that mendelian randomisation relies on assumptions that cannot be proven, including the absence of directional pleiotropy and the independence of genotypes from confounders.
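
The headline per-unit estimates imply a simple net monetary benefit per person-year of avoided BMI; the arithmetic below ignores discounting and effect heterogeneity, so it will not reproduce the paper's intervention-level figures (which also reflect larger BMI changes):

```python
def annual_nmb_per_unit_bmi(wtp_per_qaly=20_000,
                            qaly_loss_per_unit=0.0065,   # 0.65% of a QALY/year
                            cost_per_unit=42.23):        # GBP/year, per person
    """Net monetary benefit per person-year of a 1-unit BMI reduction,
    using the abstract's mendelian randomisation estimates."""
    return wtp_per_qaly * qaly_loss_per_unit + cost_per_unit

nmb_year = annual_nmb_per_unit_bmi()
print(nmb_year)            # ~172.2 GBP per person-year
print(nmb_year * 20)       # ~3,445 GBP over 20 years, undiscounted
```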

Conclusions

Mendelian randomisation can be used to estimate the impact of interventions on quality of life and healthcare costs. We observed that the effect of increasing BMI on health-related quality of life is much larger when accounting for 240 chronic health conditions, compared with only a limited selection. This means that previous cost-effectiveness studies have likely underestimated the effect of BMI on quality of life and, therefore, the potential cost-effectiveness of interventions to reduce BMI.

Importance of attributes and willingness to pay for oral anticoagulant therapy in patients with atrial fibrillation in China: A discrete choice experiment

Thu, 26/08/2021 - 16:00

by Jiaxi Zhao, Hao Wang, Xue Li, Yang Hu, Vincent K. C. Yan, Carlos K. H. Wong, Yutao Guo, Marco K. H. Cheung, Gregory Y. H. Lip, Chung-Wah Siu, Hung-Fat Tse, Esther W. Chan

Background

Adherence to oral anticoagulant therapy in patients with atrial fibrillation (AF) in China is low. Patient preference, one of the main reasons for discontinuation of oral anticoagulant therapy, is an unfamiliar concept in China.

Methods and findings

A discrete choice experiment (DCE) was conducted to quantify patient preferences for 7 attributes of oral anticoagulant therapy: antidote (yes/no), food–drug interaction (yes/no), frequency of blood monitoring (no need, every 6/3/1 month[s]), risk of nonfatal major bleeding (0.7/3.1/5.5/7.8[%]), risk of nonfatal stroke (ischemic/hemorrhagic) or systemic embolism (0.6/3.2/5.8/8.4[%]), risk of nonfatal acute myocardial infarction (AMI) (0.2/1.0/1.8/2.5[%]), and monthly out-of-pocket cost (0/120/240/360 RMB) (0 to 56 USD). A total of 16 scenarios were generated using a D-efficient design and randomly divided into 2 blocks. Eligible patients were recruited and interviewed from outpatient and inpatient settings of 2 public hospitals in Beijing and Shenzhen, respectively. Patients were presented with 8 scenarios and asked to select 1 of 3 options: 2 unlabeled hypothetical treatments and 1 opt-out option. A mixed logit regression model was used to estimate patients’ preferences for attributes of oral anticoagulants and willingness to pay (WTP), with adjustments for age, sex, education level, income level, city, self-evaluated health score, histories of cardiovascular disease/other vascular disease/any stroke/any bleeding, and use of anticoagulant/antiplatelet therapy. A total of 506 patients were recruited between May 2018 and December 2019 (mean age 70.3 years, 42.1% women). Patients were mainly concerned about the risks of AMI (β: −1.03; 95% CI: −1.31, −0.75; p < 0.001), stroke or systemic embolism (β: −0.81; 95% CI: −0.90, −0.73; p < 0.001), and major bleeding (β: −0.69; 95% CI: −0.78, −0.60; p < 0.001), and were willing to pay correspondingly more to avoid them, from up to 798 RMB down to 536 RMB (124 to 83 USD) monthly. The least concerning attribute was frequency of blood monitoring (β: −0.31; 95% CI: −0.39, −0.24; p < 0.001). Patients weighted food–drug interaction even more heavily than the 3 risks if they had a history of stroke or bleeding (β: −2.47; 95% CI: −3.92, −1.02; p < 0.001), were recruited in Beijing (β: −1.82; 95% CI: −2.56, −1.07; p < 0.001), or were men (β: −0.96; 95% CI: −1.36, −0.56; p < 0.001). Patients with lower educational attainment or lower income weighted all attributes lower, and their WTP for incremental efficacy and safety was minimal. Since the patients were recruited from 2 major hospitals in developed cities in China, further studies with more representative samples are needed.
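
In a DCE, marginal willingness to pay is typically the attribute coefficient divided by the marginal utility of money. The cost coefficient below is hypothetical, chosen so the example lands near the abstract's ~798 RMB figure for avoiding AMI risk; take the formula, not the number, from this sketch:

```python
def wtp_to_avoid(beta_attribute, beta_cost_per_rmb):
    """Marginal willingness to pay (RMB) to avoid a disliked attribute:
    the attribute coefficient divided by the marginal utility of money
    (both negative here, so the ratio is a positive payment)."""
    return beta_attribute / beta_cost_per_rmb

# Cost entered the design in 120-RMB steps (0/120/240/360); the cost beta
# below is hypothetical, rescaled to a per-RMB marginal utility.
beta_cost = -0.155 / 120
print(round(wtp_to_avoid(-1.03, beta_cost)))   # ~797 RMB per month
```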

Conclusions

Patients with AF in China were mainly concerned about the safety and effectiveness of oral anticoagulant therapy. The preference weighting on food–drug interaction varied widely. Patients with lower educational attainment or income levels and less experience of bleeding or stroke had more reservations about paying for oral anticoagulant therapies with superior efficacy, safety, and convenience of use.