Publication
Journal: PLoS Medicine
February 1, 2011
Abstract
BACKGROUND
Pharmaceutical companies spent $57.5 billion on pharmaceutical promotion in the United States in 2004. The industry claims that promotion provides scientific and educational information to physicians. While some evidence indicates that promotion may adversely influence prescribing, physicians hold a wide range of views about pharmaceutical promotion. The objective of this review is to examine the relationship between exposure to information from pharmaceutical companies and the quality, quantity, and cost of physicians' prescribing.
RESULTS
We searched for studies of physicians with prescribing rights who were exposed to information from pharmaceutical companies (promotional or otherwise). Exposures included pharmaceutical sales representative visits, journal advertisements, attendance at pharmaceutical sponsored meetings, mailed information, prescribing software, and participation in sponsored clinical trials. The outcomes measured were quality, quantity, and cost of physicians' prescribing. We searched Medline (1966 to February 2008), International Pharmaceutical Abstracts (1970 to February 2008), Embase (1997 to February 2008), Current Contents (2001 to 2008), and Central (The Cochrane Library Issue 3, 2007) using search terms developed with an expert librarian. Additionally, we reviewed reference lists and contacted experts and pharmaceutical companies for information. Randomized and observational studies evaluating information from pharmaceutical companies and measures of physicians' prescribing were independently appraised for methodological quality by two authors. Studies were excluded where insufficient study information precluded appraisal. The full text of 255 articles was retrieved from electronic databases (7,185 studies) and other sources (138 studies). Articles were then excluded because they did not fulfil inclusion criteria (179) or quality appraisal criteria (18), leaving 58 included studies with 87 distinct analyses. Data were extracted independently by two authors and a narrative synthesis performed following the MOOSE guidelines. Of the set of studies examining prescribing quality outcomes, five found associations between exposure to pharmaceutical company information and lower quality prescribing, four did not detect an association, and one found associations with lower and higher quality prescribing. Thirty-eight included studies found associations between exposure and higher frequency of prescribing, and 13 did not detect an association.
Five included studies found evidence for association with higher costs, four found no association, and one found an association with lower costs. The narrative synthesis finding of variable results was supported by a meta-analysis of studies of prescribing frequency that found significant heterogeneity. The observational nature of most included studies is the main limitation of this review.
CONCLUSIONS
With rare exceptions, studies of exposure to information provided directly by pharmaceutical companies have found associations with higher prescribing frequency, higher costs, or lower prescribing quality or have not found significant associations. We did not find evidence of net improvements in prescribing, but the available literature does not exclude the possibility that prescribing may sometimes be improved. Still, we recommend that practitioners follow the precautionary principle and thus avoid exposure to information from pharmaceutical companies. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
May 6, 2010
Abstract
BACKGROUND
With the rapid expansion of antiretroviral therapy (ART) services in sub-Saharan Africa, there is growing recognition of the importance of fertility and childbearing among HIV-infected women. However, there are few data on whether ART initiation influences pregnancy rates.
RESULTS
We analyzed data from the Mother-to-Child Transmission-Plus (MTCT-Plus) Initiative, a multicountry HIV care and treatment program for women, children, and families. From 11 programs in seven African countries, women were enrolled into care regardless of HIV disease stage and followed at regular intervals; ART was initiated according to national guidelines on the basis of immunological and/or clinical criteria. Standardized forms were used to collect sociodemographic and clinical data, including incident pregnancies. Overall 589 incident pregnancies were observed among the 4,531 women included in this analysis (pregnancy incidence, 7.8/100 person-years [PY]). The rate of new pregnancies was significantly higher among women receiving ART (9.0/100 PY) compared to women not on ART (6.5/100 PY) (adjusted hazard ratio, 1.74; 95% confidence interval, 1.19-2.54). Other factors independently associated with increased risk of incident pregnancy included younger age, lower educational attainment, being married or cohabiting, having a male partner enrolled into the program, failure to use nonbarrier contraception, and higher CD4 cell counts.
CONCLUSIONS
ART use is associated with significantly higher pregnancy rates among HIV-infected women in sub-Saharan Africa. While the possible behavioral or biomedical mechanisms that may underlie this association require further investigation, these data highlight the importance of pregnancy planning and management as a critical but neglected component of HIV care and treatment services. Please see later in the article for the Editors' Summary.
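The incidence figures quoted above follow the standard events-per-person-time definition. A minimal sketch (function name is illustrative; total person-years is not stated in the abstract, so the follow-up time below is back-calculated from the reported rate):

```python
def incidence_per_100_py(events, person_years):
    """Crude incidence rate per 100 person-years (PY) of follow-up."""
    return 100.0 * events / person_years

# 589 incident pregnancies at a reported rate of 7.8/100 PY implies
# roughly 589 / 0.078 ~ 7,551 person-years of observation.
rate = incidence_per_100_py(589, 589 / 0.078)  # ≈ 7.8 per 100 PY
```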
Publication
Journal: PLoS Medicine
October 8, 2015
Abstract
BACKGROUND
Chronic kidney disease (CKD) is a frequent, under-recognized condition and a risk factor for renal failure and cardiovascular disease. Increasing evidence connects non-alcoholic fatty liver disease (NAFLD) to CKD. We conducted a meta-analysis to determine whether the presence and severity of NAFLD are associated with the presence and severity of CKD.
RESULTS
English and non-English articles from international online databases from 1980 through January 31, 2014 were searched. Observational studies assessing NAFLD by histology, imaging, or biochemistry and defining CKD as either estimated glomerular filtration rate (eGFR) <60 ml/min/1.73 m2 or proteinuria were included. Two reviewers extracted studies independently and in duplicate. Individual participant data (IPD) were solicited from all selected studies. Studies providing IPD were combined with studies providing only aggregate data with the two-stage method. Main outcomes were pooled using random-effects models. Sensitivity and subgroup analyses were used to explore sources of heterogeneity and the effect of potential confounders. The influences of age, whole-body/abdominal obesity, homeostasis model of insulin resistance (HOMA-IR), and duration of follow-up on effect estimates were assessed by meta-regression. Thirty-three studies (63,902 participants, 16 population-based and 17 hospital-based, 20 cross-sectional, and 13 longitudinal) were included. For 20 studies (61% of included studies, 11 cross-sectional and nine longitudinal, 29,282 participants), we obtained IPD. NAFLD was associated with an increased risk of prevalent (odds ratio [OR] 2.12, 95% CI 1.69-2.66) and incident (hazard ratio [HR] 1.79, 95% CI 1.65-1.95) CKD. Non-alcoholic steatohepatitis (NASH) was associated with a higher prevalence (OR 2.53, 95% CI 1.58-4.05) and incidence (HR 2.12, 95% CI 1.42-3.17) of CKD than simple steatosis. Advanced fibrosis was associated with a higher prevalence (OR 5.20, 95% CI 3.14-8.61) and incidence (HR 3.29, 95% CI 2.30-4.71) of CKD than non-advanced fibrosis. In all analyses, the magnitude and direction of effects remained unaffected by diabetes status, after adjustment for other risk factors, and in other subgroup and meta-regression analyses. In cross-sectional and longitudinal studies, the severity of NAFLD was positively associated with CKD stages. 
Limitations of analysis are the relatively small size of studies utilizing liver histology and the suboptimal sensitivity of ultrasound and biochemistry for NAFLD detection in population-based studies.
CONCLUSIONS
The presence and severity of NAFLD are associated with an increased risk and severity of CKD. Please see later in the article for the Editors' Summary.
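The pooled odds and hazard ratios above come from random-effects models. For readers unfamiliar with the mechanics, here is a minimal DerSimonian-Laird sketch operating on log effect sizes; the values in the usage example are illustrative only, not the study's data, and the authors' exact estimator may differ:

```python
import math

def dersimonian_laird(log_effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) from per-study
    log effect sizes (e.g. log odds ratios) and their variances."""
    w = [1.0 / v for v in variances]            # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, log_effects)) / sw
    # Cochran's Q measures between-study heterogeneity
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(log_effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se

# Illustrative values: three hypothetical studies reporting ORs of
# 2.0, 2.4, and 1.8 with variances of the log-ORs of 0.04, 0.09, 0.06.
pooled_log_or, se = dersimonian_laird(
    [math.log(2.0), math.log(2.4), math.log(1.8)], [0.04, 0.09, 0.06]
)
pooled_or = math.exp(pooled_log_or)
```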
Publication
Journal: PLoS Medicine
May 22, 2011
Abstract
BACKGROUND
We have previously shown that multiple genetic loci identified by genome-wide association studies (GWAS) increase the susceptibility to obesity in a cumulative manner. It is, however, not known whether and to what extent this genetic susceptibility may be attenuated by a physically active lifestyle. We aimed to assess the influence of a physically active lifestyle on the genetic predisposition to obesity in a large population-based study.
RESULTS
We genotyped 12 SNPs in obesity-susceptibility loci in a population-based sample of 20,430 individuals (aged 39-79 y) from the European Prospective Investigation of Cancer (EPIC)-Norfolk cohort with an average follow-up period of 3.6 y. A genetic predisposition score was calculated for each individual by adding the body mass index (BMI)-increasing alleles across the 12 SNPs. Physical activity was assessed using a self-administered questionnaire. Linear and logistic regression models were used to examine main effects of the genetic predisposition score and its interaction with physical activity on BMI/obesity risk and BMI change over time, assuming an additive effect for each additional BMI-increasing allele carried. Each additional BMI-increasing allele was associated with a 0.154 (standard error [SE] 0.012) kg/m² (p = 6.73 × 10⁻³⁷) increase in BMI (equivalent to 445 g in body weight for a person 1.70 m tall). This association was significantly (p for interaction = 0.005) more pronounced in inactive people (0.205 [SE 0.024] kg/m² [p = 3.62 × 10⁻¹⁸; 592 g in weight]) than in active people (0.131 [SE 0.014] kg/m² [p = 7.97 × 10⁻²¹; 379 g in weight]). Similarly, each additional BMI-increasing allele increased the risk of obesity 1.116-fold (95% confidence interval [CI] 1.093-1.139, p = 3.37 × 10⁻²⁶) in the whole population, but significantly (p for interaction = 0.015) more in inactive individuals (odds ratio [OR] = 1.158 [95% CI 1.118-1.199; p = 1.93 × 10⁻¹⁶]) than in active individuals (OR = 1.095 [95% CI 1.068-1.123; p = 1.15 × 10⁻¹²]). Consistent with the cross-sectional observations, physical activity modified the association between the genetic predisposition score and change in BMI during follow-up (p for interaction = 0.028).
CONCLUSIONS
Our study shows that living a physically active lifestyle is associated with a 40% reduction in the genetic predisposition to common obesity, as estimated by the number of risk alleles carried for any of the 12 recently GWAS-identified loci. Please see later in the article for the Editors' Summary.
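Two computations in this abstract are simple enough to sketch: the unweighted risk-allele count used as the genetic predisposition score, and the conversion of a per-allele BMI effect into grams of body weight via weight = BMI × height². Function names are illustrative, not from the study:

```python
def predisposition_score(genotypes):
    """Unweighted risk-allele count: genotypes is a list of 12 values,
    each 0, 1, or 2 BMI-increasing alleles at one SNP."""
    assert len(genotypes) == 12 and all(g in (0, 1, 2) for g in genotypes)
    return sum(genotypes)

def bmi_effect_to_grams(beta_kg_per_m2, height_m):
    """Express a BMI effect (kg/m^2) as body weight in grams for a
    person of the given height, using weight = BMI * height^2."""
    return beta_kg_per_m2 * height_m ** 2 * 1000

# Reported per-allele effect of 0.154 kg/m^2 at 1.70 m:
grams = bmi_effect_to_grams(0.154, 1.70)  # ≈ 445 g, matching the abstract
```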
Publication
Journal: PLoS Medicine
March 14, 2011
Abstract
BACKGROUND
Recruitment of participants into randomised controlled trials (RCTs) is critical for successful trial conduct. Although there have been two previous systematic reviews on related topics, the results (which identified specific interventions) were inconclusive and not generalizable. The aim of our study was to evaluate the relative effectiveness of recruitment strategies for participation in RCTs.
RESULTS
We conducted a systematic review, reported following the PRISMA guideline, of studies that compared methods of recruiting individual participants into an actual or mock RCT. We searched MEDLINE, Embase, The Cochrane Library, and reference lists of relevant studies. From over 16,000 titles or abstracts reviewed, 396 papers were retrieved and 37 studies were included, in which 18,812 of at least 59,354 people approached agreed to participate in a clinical RCT. Recruitment strategies were broadly divided into four groups: novel trial designs (eight studies), recruiter differences (eight studies), incentives (two studies), and provision of trial information (19 studies). Strategies that increased people's awareness of the health problem being studied (e.g., an interactive computer program [relative risk (RR) 1.48, 95% confidence interval (CI) 1.00-2.18], attendance at an education session [RR 1.14, 95% CI 1.01-1.28], addition of a health questionnaire [RR 1.37, 95% CI 1.14-1.66], or a video about the health condition [RR 1.75, 95% CI 1.11-2.74]), and also monetary incentives (RR 1.39, 95% CI 1.13-1.64 to RR 1.53, 95% CI 1.28-1.84), improved recruitment. Increasing patients' understanding of the trial process, recruiter differences, and various methods of randomisation and consent design did not show a difference in recruitment. Consent rates were also higher for a nonblinded trial design, but differential loss to follow-up between groups may jeopardise the study findings. The study's main limitation was the necessity of modifying the search strategy with subsequent search updates because of changes in MEDLINE definitions. The abstracts of previous versions of this systematic review were published in 2002 and 2007.
CONCLUSIONS
Recruitment strategies that focus on increasing potential participants' awareness of the health problem being studied, its potential impact on their health, and their engagement in the learning process appeared to increase recruitment to clinical studies. Further trials of recruitment strategies that target engaging participants to increase their awareness of the health problems being studied and the potential impact on their health may confirm this hypothesis. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
September 14, 2011
Abstract
BACKGROUND
We investigated the effect of the 7-valent pneumococcal conjugate vaccine (PCV7) programme in England on serotype-specific carriage and invasive disease to help understand its role in serotype replacement and predict the impact of higher valency vaccines.
RESULTS
Nasopharyngeal swabs were taken from children <5 y old and family members (n=400) 2 y after introduction of PCV7 into routine immunization programs. Proportions carrying Streptococcus pneumoniae and serotype distribution among carried isolates were compared with a similar population prior to PCV7 introduction. Serotype-specific case carrier ratios (CCRs) were estimated using national data on invasive disease. In vaccinated children and their contacts vaccine-type (VT) carriage decreased, but was offset by an increase in non-VT carriage, with no significant overall change in carriage prevalence, odds ratio 1.06 (95% confidence interval 0.76-1.49). The lower CCRs of the replacing serotypes resulted in a net reduction in invasive disease in children. The additional serotypes covered by higher valency vaccines had low carriage but high disease prevalence. Serotype 11C emerged as predominant in carriage but caused no invasive disease whereas 8, 12F, and 22F emerged in disease but had very low carriage prevalence.
CONCLUSIONS
Because the additional serotypes included in PCV10/13 have high CCRs but low carriage prevalence, vaccinating against them is likely to significantly reduce invasive disease with less risk of serotype replacement. However, a few serotypes with high CCRs could mitigate the benefits of higher valency vaccines. Assessment of the effect of PCV on carriage as well as invasive disease should be part of enhanced surveillance activities for PCVs. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
September 16, 2009
Abstract
BACKGROUND
The antibody response to HIV-1 does not appear in the plasma until approximately 2-5 weeks after transmission, and neutralizing antibodies to autologous HIV-1 generally do not become detectable until 12 weeks or more after transmission. Moreover, levels of HIV-1-specific antibodies decline on antiretroviral treatment. The mechanisms of this delay in the appearance of anti-HIV-1 antibodies and of their subsequent rapid decline are not known. While the effect of HIV-1 on depletion of gut CD4(+) T cells in acute HIV-1 infection is well described, we studied blood and tissue B cells soon after infection to determine the effect of early HIV-1 on these cells.
RESULTS
In human participants, we analyzed B cells in blood as early as 17 days after HIV-1 infection, and in terminal ileum inductive and effector microenvironments beginning at 47 days after infection. We found that HIV-1 infection rapidly induced polyclonal activation and terminal differentiation of B cells in blood and in gut-associated lymphoid tissue (GALT). The specificities of antibodies produced by GALT memory B cells in acute HIV-1 infection (AHI) included not only HIV-1-specific antibodies, but also influenza-specific and autoreactive antibodies, indicating very early onset of HIV-1-induced polyclonal B cell activation. Follicular damage or germinal center loss in terminal ileum Peyer's patches was seen, with 88% of follicles exhibiting B or T cell apoptosis and follicular lysis.
CONCLUSIONS
Early induction of polyclonal B cell differentiation, coupled with follicular damage and germinal center loss soon after HIV-1 infection, may explain both the high rate of decline in HIV-1-induced antibody responses and the delay in plasma antibody responses to HIV-1. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
September 1, 2011
Abstract
BACKGROUND
Mexico's local and national authorities initiated an intense public health response during the early stages of the 2009 A/H1N1 pandemic. In this study we analyzed the epidemiological patterns of the pandemic during April-December 2009 in Mexico and evaluated the impact of nonmedical interventions, school cycles, and demographic factors on influenza transmission.
RESULTS
We used influenza surveillance data compiled by the Mexican Institute for Social Security, representing 40% of the population, to study patterns in influenza-like illness (ILI) hospitalizations, deaths, and case-fatality rate by pandemic wave and geographical region. We also estimated the reproduction number (R) on the basis of the growth rate of daily cases, and used a transmission model to evaluate the effectiveness of mitigation strategies initiated during the spring pandemic wave. A total of 117,626 ILI cases were identified during April-December 2009, of which 30.6% were tested for influenza, and 23.3% were positive for the influenza A/H1N1 pandemic virus. A three-wave pandemic profile was identified, with an initial wave in April-May (Mexico City area), a second wave in June-July (southeastern states), and a geographically widespread third wave in August-December. The median age of laboratory-confirmed ILI cases was ∼18 years overall and increased to ∼31 years during autumn (p<0.0001). The case-fatality ratio among ILI cases was 1.2% overall, and highest (5.5%) among people over 60 years. The regional R estimates were 1.8-2.1, 1.6-1.9, and 1.2-1.3 for the spring, summer, and fall waves, respectively. We estimate that the 18-day period of mandatory school closures and other social distancing measures implemented in the greater Mexico City area was associated with a 29%-37% reduction in influenza transmission in spring 2009. In addition, an increase in R was observed in late May and early June in the southeastern states, after classes resumed following the mandatory school suspension and before summer vacation started. State-specific fall pandemic waves began 2-5 weeks after school reopened for the fall term, coinciding with an age shift in influenza cases.
CONCLUSIONS
We documented three spatially heterogeneous waves of the 2009 A/H1N1 pandemic virus in Mexico, which were characterized by a relatively young age distribution of cases. Our study highlights the importance of school cycles on the transmission dynamics of this pandemic influenza strain and suggests that school closure and other mitigation measures could be useful to mitigate future influenza pandemics. Please see later in the article for the Editors' Summary.
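The abstract estimates R from the growth rate of daily cases. One common back-of-envelope mapping, assuming an exponentially distributed generation interval (a simplification; the study's actual method may differ), is R = 1 + r·Tg, where r is the exponential growth rate:

```python
import math

def reproduction_number(doubling_time_days, mean_generation_days):
    """R from the epidemic growth rate r = ln(2) / doubling time,
    assuming an exponentially distributed generation interval,
    so that R = 1 + r * Tg."""
    r = math.log(2) / doubling_time_days
    return 1.0 + r * mean_generation_days

# Illustrative values (not the study's estimates): a 4-day doubling
# time with a 3-day mean generation interval.
R = reproduction_number(4, 3)  # ≈ 1.52
```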
Publication
Journal: PLoS Medicine
July 14, 2013
Abstract
BACKGROUND
The World Health Organization initiative to eliminate mother-to-child transmission of syphilis aims for ≥ 90% of pregnant women to be tested for syphilis and ≥ 90% to receive treatment by 2015. We calculated global and regional estimates of syphilis in pregnancy and associated adverse outcomes for 2008, as well as antenatal care (ANC) coverage for women with syphilis.
RESULTS
Estimates were based upon a health service delivery model. National syphilis seropositivity data from 97 of 193 countries and ANC coverage from 147 countries were obtained from World Health Organization databases. Proportions of adverse outcomes and effectiveness of screening and treatment were from published literature. Regional estimates of ANC syphilis testing and treatment were examined through sensitivity analysis. In 2008, approximately 1.36 million (range: 1.16 to 1.56 million) pregnant women globally were estimated to have probable active syphilis; of these, 80% had attended ANC. Globally, 520,905 (best case: 425,847; worst case: 615,963) adverse outcomes were estimated to be caused by maternal syphilis, including approximately 212,327 (174,938; 249,716) stillbirths (>28 wk) or early fetal deaths (22 to 28 wk), 91,764 (76,141; 107,397) neonatal deaths, 65,267 (56,929; 73,605) preterm or low birth weight infants, and 151,547 (117,848; 185,245) infected newborns. Approximately 66% of adverse outcomes occurred in ANC attendees who were not tested or were not treated for syphilis. In 2008, based on the middle case scenario, clinical services likely averted 26% of all adverse outcomes. Limitations include missing syphilis seropositivity data for many countries in Europe, the Mediterranean, and North America, and use of estimates for the proportion of syphilis that was "probable active," and for testing and treatment coverage.
CONCLUSIONS
Syphilis continues to affect large numbers of pregnant women, causing substantial perinatal morbidity and mortality that could be prevented by early testing and treatment. In this analysis, most adverse outcomes occurred among women who attended ANC but were not tested or treated for syphilis, highlighting the need to improve the quality of ANC as well as ANC coverage. In addition, improved ANC data on syphilis testing coverage, positivity, and treatment are needed. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
May 8, 2014
Abstract
BACKGROUND
A major impediment to tuberculosis control in Africa is the difficulty in diagnosing active tuberculosis (TB), particularly in the context of HIV infection. We hypothesized that a unique host blood RNA transcriptional signature would distinguish TB from other diseases (OD) in HIV-infected and -uninfected patients, and that this could be the basis of a simple diagnostic test.
RESULTS
Adult case-control cohorts were established in South Africa and Malawi of HIV-infected or -uninfected individuals, consisting of 584 patients with either TB (confirmed by culture of Mycobacterium tuberculosis [M.TB] from a sputum or tissue sample in a patient under investigation for TB), OD (i.e., TB was considered in the differential diagnosis but then excluded), or healthy individuals with latent TB infection (LTBI). Individuals were randomized into training (80%) and test (20%) cohorts. Blood transcriptional profiles were assessed and minimal sets of significantly differentially expressed transcripts distinguishing TB from LTBI and OD were identified in the training cohort. A 27-transcript signature distinguished TB from LTBI, and a 44-transcript signature distinguished TB from OD. To evaluate our signatures, we used a novel computational method to calculate a disease risk score (DRS) for each patient. The classification based on this score was first evaluated in the test cohort, and then validated in an independent publicly available dataset (GSE19491). In our test cohort, the DRS classified TB from LTBI (sensitivity 95%, 95% CI [87-100]; specificity 90%, 95% CI [80-97]) and TB from OD (sensitivity 93%, 95% CI [83-100]; specificity 88%, 95% CI [74-97]). In the independent validation cohort, TB patients were distinguished both from LTBI individuals (sensitivity 95%, 95% CI [85-100]; specificity 94%, 95% CI [84-100]) and OD patients (sensitivity 100%, 95% CI [100-100]; specificity 96%, 95% CI [93-100]). Limitations of our study include the use of only culture-confirmed TB patients, and the potential that TB may have been misdiagnosed in a small proportion of OD patients despite the extensive clinical investigation used to assign each patient to their diagnostic group.
CONCLUSIONS
In our study, blood transcriptional signatures distinguished TB from other conditions prevalent in HIV-infected and -uninfected African adults. Our DRS, based on these signatures, could be developed as a test for TB suitable for use in HIV endemic countries. Further evaluation of the performance of the signatures and DRS in prospective populations of patients with symptoms consistent with TB will be needed to define their clinical value under operational conditions. Please see later in the article for the Editors' Summary.
Publication
Journal: Diabetes Care
December 9, 2020
Abstract
The American Diabetes Association (ADA) "Standards of Medical Care in Diabetes" includes the ADA's current clinical practice recommendations and is intended to provide the components of diabetes care, general treatment goals and guidelines, and tools to evaluate quality of care. Members of the ADA Professional Practice Committee, a multidisciplinary expert committee (https://doi.org/10.2337/dc21-SPPC), are responsible for updating the Standards of Care annually, or more frequently as warranted. For a detailed description of ADA standards, statements, and reports, as well as the evidence-grading system for ADA's clinical practice recommendations, please refer to the Standards of Care Introduction (https://doi.org/10.2337/dc21-SINT). Readers who wish to comment on the Standards of Care are invited to do so at professional.diabetes.org/SOC.
Publication
Journal: PLoS Medicine
September 25, 2013
Abstract
BACKGROUND
The Shoklo Malaria Research Unit has been working on the Thai-Myanmar border for 25 y providing early diagnosis and treatment (EDT) of malaria. Transmission of Plasmodium falciparum has declined, but resistance to artesunate has emerged. We expanded malaria activities through EDT and evaluated the impact over a 12-y period.
RESULTS
Between 1 October 1999 and 30 September 2011, the Shoklo Malaria Research Unit increased the number of cross-border (Myanmar side) health facilities from two to 11 and recorded the number of malaria consultations. Changes in malaria incidence were estimated from a cohort of pregnant women, and prevalence from cross-sectional surveys. In vivo and in vitro antimalarial drug efficacy were monitored. Over this period, the number of malaria cases detected increased initially, but then declined rapidly. In children under 5 y, the percentage of consultations due to malaria declined from 78% (95% CI 76-80) (1,048/1,344 consultations) to 7% (95% CI 6.2-7.1) (767/11,542 consultations), p<0.001. The ratio of P. falciparum/P. vivax declined from 1.4 (95% CI 1.3-1.4) to 0.7 (95% CI 0.7-0.8). The case fatality rate was low (39/75,126; 0.05% [95% CI 0.04-0.07]). The incidence of malaria declined from 1.1 to 0.1 episodes per pregnant woman-year. The cumulative proportion of P. falciparum decreased significantly from 24.3% (95% CI 21.0-28.0) (143/588 pregnant women) to 3.4% (95% CI 2.8-4.3) (76/2,207 pregnant women), p<0.001. The in vivo efficacy of mefloquine-artesunate declined steadily, with a sharp drop in 2011 (day-42 PCR-adjusted cure rate 42% [95% CI 20-62]). The proportion of patients still slide-positive for malaria at day 3 rose from 0% in 2000 to 28% (95% CI 13-45) (8/29 patients) in 2011.
CONCLUSIONS
Despite the emergence of resistance to artesunate in P. falciparum, the strategy of EDT with artemisinin-based combination treatments has been associated with a reduction in malaria in the migrant population living on the Thai-Myanmar border. Although limited by its observational nature, this study provides useful data on malaria burden in a strategically crucial geographical area. Alternative fixed combination treatments are needed urgently to replace the failing first-line regimen of mefloquine and artesunate. Please see later in the article for the Editors' Summary.
Publication
Journal: Journal of Urology
November 12, 2013
Abstract
OBJECTIVE
The purpose of this guideline is to provide a clinical framework for the use of radiotherapy after radical prostatectomy as adjuvant or salvage therapy.
METHODS
A systematic literature review using the PubMed®, Embase, and Cochrane databases was conducted to identify peer-reviewed publications relevant to the use of radiotherapy after prostatectomy. The review yielded 294 articles; these publications were used to create the evidence-based guideline statements. Where evidence was insufficient, additional guidance is provided as Clinical Principles.
RESULTS
Guideline statements are provided for patient counseling, the use of radiotherapy in the adjuvant and salvage contexts, defining biochemical recurrence, and conducting a re-staging evaluation.
CONCLUSIONS
Physicians should offer adjuvant radiotherapy to patients with adverse pathologic findings at prostatectomy (i.e., seminal vesicle invasion, positive surgical margins, extraprostatic extension) and should offer salvage radiotherapy to patients with prostatic specific antigen or local recurrence after prostatectomy in whom there is no evidence of distant metastatic disease. The offer of radiotherapy should be made in the context of a thoughtful discussion of possible short- and long-term side effects of radiotherapy as well as the potential benefits of preventing recurrence. The decision to administer radiotherapy should be made by the patient and the multi-disciplinary treatment team with full consideration of the patient's history, values, preferences, quality of life, and functional status. Please visit the ASTRO and AUA websites (http://www.redjournal.org/webfiles/images/journals/rob/RAP%20Guideline.pdf and http://www.auanet.org/education/guidelines/radiation-after-prostatectomy.cfm) to view this guideline in its entirety, including the full literature review.
Publication
Journal: PLoS Medicine
February 16, 2010
Abstract
BACKGROUND
Hepatitis C virus (HCV) is estimated to affect 130-180 million people worldwide. Although its origin is unknown, patterns of viral diversity suggest that HCV genotype 1 probably originated from West Africa. Previous attempts to estimate the spatiotemporal parameters of the virus, both globally and regionally, have suggested that epidemic HCV transmission began in 1900 and grew steadily until the late 1980s. However, epidemiological data suggest that the expansion of HCV may have occurred after the Second World War. The aim of our study was to elucidate the timescale and route of the global spread of HCV.
RESULTS
We show that the rarely sequenced HCV region (E2P7NS2) is more informative for molecular epidemiology studies than the more commonly used NS5B region. We applied phylodynamic methods to a substantial set of new E2P7NS2 and NS5B sequences, together with all available global HCV sequences with information in both of these genomic regions, in order to estimate the timescale and nature of the global expansion of the most prevalent HCV subtypes, 1a and 1b. We showed that transmission of subtypes 1a and 1b "exploded" between 1940 and 1980, with the spread of 1b preceding that of 1a by at least 16 y (95% confidence interval 15-17). Phylogeographic analysis of all available NS5B sequences suggests that HCV subtypes 1a and 1b disseminated from the developed world to the developing countries.
CONCLUSIONS
The evolutionary rate of HCV appears faster than previously suggested. The global spread of HCV coincided with the widespread use of transfused blood and blood products and with the expansion of intravenous drug use, but slowed before the wide implementation of anti-HCV screening. Differences in the transmission routes associated with subtypes 1a and 1b provide an explanation for the relatively earlier expansion of 1b. Our data show that the most plausible route of HCV dispersal was from developed countries to the developing world. Please see later in the article for the Editors' Summary.
Publication
Journal: Neuropsychologia
March/18/2003
Abstract
Previous research has shown that negative stimuli elicit more attention than do positive stimuli. However, this research has relied on response-based measures to assess attention. The current research uses the P1 component of the event-related brain potential (ERP) as a proximal index of attention allocation to valenced stimuli. In two studies, P1 amplitude was measured while participants evaluated positive and negative pictures. In both studies, principal components analysis showed that P1 amplitudes to frequent stimuli and to rare negative stimuli were larger than P1 amplitudes to rare positive stimuli. This is (a) evidence for the extremely rapid (<120 ms) differentiation of positive and negative stimuli and (b) process-based evidence for a negativity bias in attention allocation.
Publication
Journal: PLoS Medicine
February/3/2013
Abstract
BACKGROUND
Increased mortality among men on antiretroviral therapy (ART) has been documented but remains poorly understood. We examined the magnitude of and risk factors for gender differences in mortality on ART.
RESULTS
Analyses included 46,201 ART-naïve adults starting ART between January 2002 and December 2009 in eight ART programmes across South Africa (SA). Patients were followed from initiation of ART to outcome or analysis closure. The primary outcome was mortality; secondary outcomes were loss to follow-up (LTF), virologic suppression, and CD4+ cell count responses. Survival analyses were used to examine the hazard of death on ART by gender. Sensitivity analyses were limited to patients who were virologically suppressed and patients whose CD4+ cell count reached >200 cells/µl. We compared gender differences in mortality among HIV+ patients on ART with mortality in an age-standardised HIV-negative population. Among 46,201 adults (65% female, median age 35 years), during 77,578 person-years of follow-up, men had lower median CD4+ cell counts than women (85 versus 110 cells/µl, p<0.001), were more likely to be classified WHO stage III/IV (86% versus 77%, p<0.001), and had higher mortality in crude (8.5 versus 5.7 deaths/100 person-years, p<0.001) and adjusted analyses (adjusted hazard ratio [AHR] 1.31, 95% CI 1.22-1.41). After 36 months on ART, men were more likely than women to be truly LTF (AHR 1.20, 95% CI 1.12-1.28) but not to die after LTF (AHR 1.04, 95% CI 0.86-1.25). Findings were consistent across all eight programmes. Virologic suppression was similar by gender; women had slightly better immunologic responses than men. Notably, the observed gender differences in mortality on ART were smaller than gender differences in age-standardised death rates in the HIV-negative South African population. Over time, non-HIV mortality appeared to account for an increasing proportion of observed mortality. The analysis was limited by missing data on baseline HIV disease characteristics, and we did not directly observe mortality in the HIV-negative populations where the participating cohorts were located.
CONCLUSIONS
HIV-infected men have higher mortality on ART than women in South African programmes, but these differences are only partly explained by more advanced HIV disease at the time of ART initiation, differential LTF and subsequent mortality, and differences in responses to treatment. The observed differences in mortality on ART may be best explained by background differences in mortality between men and women in the South African population unrelated to the HIV/AIDS epidemic. Please see later in the article for the Editors' Summary.
Publication
Journal: BMC Medicine
November/6/2011
Abstract
BACKGROUND
The spread of infectious diseases crucially depends on the pattern of contacts between individuals. Knowledge of these patterns is thus essential to inform models and computational efforts. However, there are few empirical studies available that provide estimates of the number and duration of contacts between social groups. Moreover, their space and time resolutions are limited, so that data are not explicit at the person-to-person level, and the dynamic nature of the contacts is disregarded. In this study, we aimed to assess the role of data-driven dynamic contact patterns between individuals, and in particular of their temporal aspects, in shaping the spread of a simulated epidemic in the population.
METHODS
We considered high-resolution data about face-to-face interactions between the attendees at a conference, obtained from the deployment of an infrastructure based on radiofrequency identification (RFID) devices that assessed mutual face-to-face proximity. The spread of epidemics along these interactions was simulated using an SEIR (Susceptible, Exposed, Infectious, Recovered) model, using both the dynamic network of contacts defined by the collected data, and two aggregated versions of such networks, to assess the role of the data temporal aspects.
RESULTS
We show that, on the timescales considered, an aggregated network taking into account the daily duration of contacts is a good approximation to the full resolution network, whereas a homogeneous representation that retains only the topology of the contact network fails to reproduce the size of the epidemic.
CONCLUSIONS
These results have important implications for understanding the level of detail needed to correctly inform computational models for the study and management of real epidemics. Please see related article BMC Medicine, 2011, 9:88.
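The simulation approach described above can be illustrated with a minimal sketch: a discrete-time SEIR process run over a weighted contact graph, where each edge weight stands in for daily contact duration. This is not the authors' code; the function name, the graph representation, and the use of per-contact transmission probability 1-(1-beta)^w are all assumptions made for illustration.

```python
import random

def simulate_seir(contacts, beta, sigma, gamma, seed_node, steps, rng):
    """Discrete-time SEIR on a weighted contact graph.

    contacts: dict mapping node -> list of (neighbour, weight) pairs,
    where weight stands in for daily contact duration (the study used
    RFID-measured face-to-face proximity; this is a simplification).
    beta: per-unit-contact transmission probability, sigma: E->I rate,
    gamma: I->R rate (all per time step).
    """
    state = {n: "S" for n in contacts}
    state[seed_node] = "I"
    for _ in range(steps):
        new_state = dict(state)
        for node, st in state.items():
            if st == "I":
                # Infectious nodes expose susceptible neighbours with a
                # probability that grows with contact weight (duration).
                for nbr, w in contacts[node]:
                    if state[nbr] == "S" and rng.random() < 1 - (1 - beta) ** w:
                        new_state[nbr] = "E"
                if rng.random() < gamma:
                    new_state[node] = "R"
            elif st == "E" and rng.random() < sigma:
                new_state[node] = "I"
        state = new_state
    return state
```

Comparing runs on the full dynamic network against runs on a weight-aggregated static version of the same graph is, in essence, the comparison the study performs.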
Publication
Journal: Annual Review of Virology
March/20/2020
Abstract
The seasonal cycle of respiratory viral diseases has been widely recognized for thousands of years, as annual epidemics of the common cold and influenza disease hit the human population like clockwork in the winter season in temperate regions. Moreover, epidemics caused by viruses such as severe acute respiratory syndrome coronavirus (SARS-CoV) and the newly emerging SARS-CoV-2 occur during the winter months. The mechanisms underlying the seasonal nature of respiratory viral infections have been examined and debated for many years. The two major contributing factors are the changes in environmental parameters and human behavior. Studies have revealed the effect of temperature and humidity on respiratory virus stability and transmission rates. More recent research highlights the importance of the environmental factors, especially temperature and humidity, in modulating host intrinsic, innate, and adaptive immune responses to viral infections in the respiratory tract. Here we review evidence of how outdoor and indoor climates are linked to the seasonality of viral respiratory infections. We further discuss determinants of host response in the seasonality of respiratory viruses by highlighting recent studies in the field. Expected final online publication date for the Annual Review of Virology, Volume 7 is September 29, 2020. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
Publication
Journal: PLoS Medicine
October/14/2015
Abstract
BACKGROUND
Poor sanitation is thought to be a major cause of enteric infections among young children. However, there are no previously published randomized trials to measure the health impacts of large-scale sanitation programs. India's Total Sanitation Campaign (TSC) is one such program that seeks to end the practice of open defecation by changing social norms and behaviors, and providing technical support and financial subsidies. The objective of this study was to measure the effect of the TSC implemented with capacity building support from the World Bank's Water and Sanitation Program in Madhya Pradesh on availability of individual household latrines (IHLs), defecation behaviors, and child health (diarrhea, highly credible gastrointestinal illness [HCGI], parasitic infections, anemia, growth).
RESULTS
We conducted a cluster-randomized, controlled trial in 80 rural villages. Field staff collected baseline measures of sanitation conditions, behaviors, and child health (May-July 2009), and revisited households 21 months later (February-April 2011) after the program was delivered. The study enrolled a random sample of 5,209 children <5 years old from 3,039 households that had at least one child <24 months at the beginning of the study. A random subsample of 1,150 children <24 months at enrollment were tested for soil-transmitted helminth and protozoan infections in stool. The randomization successfully balanced the intervention and control groups, and we estimated differences between groups in an intention-to-treat analysis. The intervention increased the percentage of households in a village with improved sanitation facilities, as defined by the WHO/UNICEF Joint Monitoring Programme, by an average of 19% (95% CI for difference: 12%-26%; group means: 22% control versus 41% intervention) and decreased open defecation among adults by an average of 10% (95% CI for difference: 4%-15%; group means: 73% intervention versus 84% control). However, the intervention did not improve child health measured in terms of multiple health outcomes (diarrhea, HCGI, helminth infections, anemia, growth). Limitations of the study included a relatively short follow-up period following implementation, evidence of contamination in ten of the 40 control villages, and possible bias in self-reported outcomes for diarrhea, HCGI, and open defecation behaviors.
CONCLUSIONS
The intervention led to modest increases in availability of IHLs and even more modest reductions in open defecation. These improvements were insufficient to improve child health outcomes (diarrhea, HCGI, parasite infection, anemia, growth). The results underscore the difficulty of achieving adequately large improvements in sanitation levels to deliver expected health benefits within large-scale rural sanitation programs.
TRIAL REGISTRATION
ClinicalTrials.gov NCT01465204. Please see later in the article for the Editors' Summary.
Publication
Journal: PLoS Medicine
February/21/2010
Abstract
BACKGROUND
Prospective studies have indicated that elevated blood glucose levels may be linked with increased cancer risk, but the strength of the association is unclear. We examined the association between blood glucose and cancer risk in a prospective study of six European cohorts.
RESULTS
The Metabolic syndrome and Cancer project (Me-Can) includes cohorts from Norway, Austria, and Sweden; the current study included 274,126 men and 275,818 women. Mean age at baseline was 44.8 years and mean follow-up time was 10.4 years. Excluding the first year of follow-up, 18,621 men and 11,664 women were diagnosed with cancer, and 6,973 men and 3,088 women died of cancer. We used Cox regression models to calculate relative risks (RRs) for glucose levels, adjusting for body mass index (BMI) and smoking status. RRs were corrected for the regression dilution ratio of glucose. The RR (95% confidence interval) per 1 mmol/l increment of glucose for overall incident cancer was 1.05 (1.01-1.10) in men and 1.11 (1.05-1.16) in women, and the corresponding RRs for fatal cancer were 1.15 (1.07-1.22) and 1.21 (1.11-1.33), respectively. Significant increases in risk among men were found for incident and fatal cancer of the liver, gallbladder, and respiratory tract, for incident thyroid cancer and multiple myeloma, and for fatal rectal cancer. In women, significant associations were found for incident and fatal cancer of the pancreas, for incident urinary bladder cancer, and for fatal cancer of the uterine corpus, cervix uteri, and stomach.
CONCLUSIONS
Data from our study indicate that abnormal glucose metabolism, independent of BMI, is associated with an increased risk of cancer overall and at several cancer sites. Our data showed stronger associations among women than among men, and for fatal cancer compared to incident cancer. Please see later in the article for the Editors' Summary.
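Two pieces of arithmetic implicit in the abstract above are worth making explicit: under a log-linear Cox model, the RR per 1 mmol/l compounds multiplicatively for larger increments, and correction for regression dilution rescales the observed log RR by the regression dilution ratio (RDR). The sketch below is illustrative only; the study's actual RDR value is not given in the abstract, and the function names are invented for this example.

```python
import math

def rr_for_increment(rr_per_unit, increment):
    # Under a log-linear Cox model, risk multiplies per unit of
    # exposure, so a k-unit increment carries RR = (RR per unit)^k.
    return rr_per_unit ** increment

def corrected_rr(observed_rr, rdr):
    # Within-person variability attenuates the observed log RR toward
    # the null; dividing the log RR by the regression dilution ratio
    # (RDR, between 0 and 1) recovers the usual-level association.
    return math.exp(math.log(observed_rr) / rdr)
```

For example, the men's RR of 1.05 per 1 mmol/l implies roughly 1.05^2 ≈ 1.10 for a 2 mmol/l increment; an RDR of 0.5 (a purely hypothetical value) would turn an observed RR of 1.05 into a corrected RR of about 1.10.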
Publication
Journal: PLoS Medicine
February/13/2012
Abstract
BACKGROUND
Although patient attrition is recognized as a threat to the long-term success of antiretroviral therapy programs worldwide, there is no universal definition for classifying patients as lost to follow-up (LTFU). We analyzed data from health facilities across Africa, Asia, and Latin America to empirically determine a standard LTFU definition.
RESULTS
At a set "status classification" date, patients were categorized as either "active" or "LTFU" according to different intervals from time of last clinic encounter. For each threshold, we looked forward 365 d to assess the performance and accuracy of this initial classification. The best-performing definition for LTFU had the lowest proportion of patients misclassified as active or LTFU. Observational data from 111 health facilities-representing 180,718 patients from 19 countries-were included in this study. In the primary analysis, for which data from all facilities were pooled, an interval of 180 d (95% confidence interval [CI]: 173-181 d) since last patient encounter resulted in the fewest misclassifications (7.7%, 95% CI: 7.6%-7.8%). A secondary analysis that gave equal weight to cohorts and to regions generated a similar result (175 d); however, an alternate approach that used inverse weighting for cohorts based on variance and equal weighting for regions produced a slightly lower summary measure (150 d). When examined at the facility level, the best-performing definition varied from 58 to 383 d (mean=150 d), but when a standard definition of 180 d was applied to each facility, only slight increases in misclassification (mean=1.2%, 95% CI: 1.0%-1.5%) were observed. Using this definition, the proportion of patients classified as LTFU by facility ranged from 3.1% to 45.1% (mean=19.9%, 95% CI: 19.1%-21.7%).
CONCLUSIONS
Based on this evaluation, we recommend the adoption of ≥180 d since the last clinic visit as a standard LTFU definition. Such standardization is an important step to understanding the reasons that underlie patient attrition and establishing more reliable and comparable program evaluation worldwide. Please see later in the article for the Editors' Summary.
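The threshold-selection procedure described above reduces to a simple search: classify each patient as active or LTFU at the status date for a candidate interval, check the call against whether the patient actually returned within the next 365 d, and pick the interval with the fewest misclassifications. A minimal sketch, assuming a flattened representation of the data (days since last visit, plus a boolean for return within 365 d), with invented function names:

```python
def misclassification(last_visit_days_ago, returned_within_365d, threshold):
    """Fraction of patients misclassified for a given LTFU threshold.

    A patient is called LTFU if their last visit was more than
    `threshold` days before the status date; the call counts as wrong
    if an 'LTFU' patient later returned, or an 'active' patient did not.
    """
    errors = 0
    for gap, returned in zip(last_visit_days_ago, returned_within_365d):
        classified_ltfu = gap > threshold
        truly_ltfu = not returned
        if classified_ltfu != truly_ltfu:
            errors += 1
    return errors / len(last_visit_days_ago)

def best_threshold(last_visit_days_ago, returned_within_365d, candidates):
    # Pick the candidate interval that minimizes misclassification.
    return min(candidates, key=lambda t: misclassification(
        last_visit_days_ago, returned_within_365d, t))
```

The study runs this kind of evaluation pooled across facilities and again per facility, which is how the 180 d pooled optimum and the 58-383 d facility-level spread arise.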
Publication
Journal: Diabetes Care
December/18/2018
Abstract
The American Diabetes Association (ADA) "Standards of Medical Care in Diabetes" includes ADA's current clinical practice recommendations and is intended to provide the components of diabetes care, general treatment goals and guidelines, and tools to evaluate quality of care. Members of the ADA Professional Practice Committee, a multidisciplinary expert committee, are responsible for updating the Standards of Care annually, or more frequently as warranted. For a detailed description of ADA standards, statements, and reports, as well as the evidence-grading system for ADA's clinical practice recommendations, please refer to the Standards of Care Introduction. Readers who wish to comment on the Standards of Care are invited to do so at professional.diabetes.org/SOC.
Publication
Journal: Diabetes Care
April/4/2018
Abstract
The American Diabetes Association (ADA) "Standards of Medical Care in Diabetes" includes ADA's current clinical practice recommendations and is intended to provide the components of diabetes care, general treatment goals and guidelines, and tools to evaluate quality of care. Members of the ADA Professional Practice Committee, a multidisciplinary expert committee, are responsible for updating the Standards of Care annually, or more frequently as warranted. For a detailed description of ADA standards, statements, and reports, as well as the evidence-grading system for ADA's clinical practice recommendations, please refer to the Standards of Care Introduction. Readers who wish to comment on the Standards of Care are invited to do so at professional.diabetes.org/SOC.
Publication
Journal: PLoS Medicine
March/7/2012
Abstract
BACKGROUND
Antiretrovirals have substantial promise for HIV-1 prevention, either as antiretroviral treatment (ART) for HIV-1-infected persons to reduce infectiousness, or as pre-exposure prophylaxis (PrEP) for HIV-1-uninfected persons to reduce the possibility of infection with HIV-1. HIV-1 serodiscordant couples in long-term partnerships (one member is infected and the other is uninfected) are a priority for prevention interventions. Earlier ART and PrEP might both reduce HIV-1 transmission in this group, but the merits and synergies of these different approaches have not been analyzed.
RESULTS
We constructed a mathematical model to examine the impact and cost-effectiveness of different strategies, including earlier initiation of ART and/or PrEP, for HIV-1 prevention for serodiscordant couples. Although the cost of PrEP is high, the cost per infection averted is significantly offset by future savings in lifelong treatment, especially among couples with multiple partners, low condom use, and a high risk of transmission. In some situations, highly effective PrEP could be cost-saving overall. To keep couples alive and without a new infection, providing PrEP to the uninfected partner could be at least as cost-effective as initiating ART earlier in the infected partner, if the annual cost of PrEP is <40% of the annual cost of ART and PrEP is >70% effective.
CONCLUSIONS
Strategic use of PrEP and ART could substantially and cost-effectively reduce HIV-1 transmission in HIV-1 serodiscordant couples. New and forthcoming data on the efficacy of PrEP, the cost of delivery of ART and PrEP, and couples' behaviours and preferences will be critical for optimizing the use of antiretrovirals for HIV-1 prevention. Please see later in the article for the Editors' Summary.
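The abstract's headline decision rule (PrEP for the uninfected partner is at least as cost-effective as earlier ART when its annual cost is below 40% of ART's annual cost and its effectiveness exceeds 70%) can be encoded directly. The thresholds come from the abstract; the function itself is an illustrative sketch, not the authors' model, which is a full mathematical transmission model.

```python
def prep_at_least_as_cost_effective(prep_annual_cost, art_annual_cost,
                                    prep_effectiveness):
    # Decision rule quoted in the abstract: both conditions must hold.
    # prep_effectiveness is a fraction in [0, 1].
    return (prep_annual_cost < 0.40 * art_annual_cost
            and prep_effectiveness > 0.70)
```

So, at a hypothetical ART cost of $300/year, PrEP at $100/year with 75% effectiveness passes the rule, while PrEP at $150/year does not on cost alone.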