Citations
Publication
Journal: Genome Research
December/21/1997
Abstract
Genes differentially expressed in different tissues, during development, or during specific pathologies are of foremost interest to both basic and pharmaceutical research. "Transcript profiles" or "digital Northerns" are generated routinely by partially sequencing thousands of randomly selected clones from relevant cDNA libraries. Differentially expressed genes can then be detected from variations in the counts of their cognate sequence tags. Here we present the first systematic study on the influence of random fluctuations and sampling size on the reliability of this kind of data. We establish a rigorous significance test and demonstrate its use on publicly available transcript profiles. The theory links the threshold of selection of putatively regulated genes (e.g., the number of pharmaceutical leads) to the fraction of false positive clones one is willing to risk. Our results delineate more precisely and extend the limits within which digital Northern data can be used.
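The abstract does not spell out the test itself; as a rough stand-in, a Fisher exact test on the two libraries' tag counts illustrates the kind of calculation involved. A minimal sketch, with hypothetical counts and library sizes:

```python
# A minimal sketch of a significance test for "digital Northern" tag counts.
# Fisher's exact test is used here as a stand-in for the paper's own
# statistic, which the abstract does not reproduce. All numbers are
# hypothetical.
from scipy.stats import fisher_exact

def tag_count_pvalue(x, n1, y, n2):
    """P-value for differential expression of one gene, given x tags out of
    n1 total in library 1 and y tags out of n2 total in library 2."""
    _, p = fisher_exact([[x, n1 - x], [y, n2 - y]])
    return p

# e.g., 4 tags among 10,000 sequenced clones vs 20 tags among 12,000
print(tag_count_pvalue(4, 10_000, 20, 12_000))  # small p -> candidate gene
```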
Publication
Journal: American Journal of Public Health
September/21/1999
Abstract
Progress in public health and community-based interventions has been hampered by the lack of a comprehensive evaluation framework appropriate to such programs. Multilevel interventions that incorporate policy, environmental, and individual components should be evaluated with measurements suited to their settings, goals, and purpose. In this commentary, the authors propose a model (termed the RE-AIM model) for evaluating public health interventions that assesses 5 dimensions: reach, efficacy, adoption, implementation, and maintenance. These dimensions occur at multiple levels (e.g., individual, clinic or organization, community) and interact to determine the public health or population-based impact of a program or policy. The authors discuss issues in evaluating each of these dimensions and combining them to determine overall public health impact. Failure to adequately evaluate programs on all 5 dimensions can lead to a waste of resources, discontinuities between stages of research, and failure to improve public health to the limits of our capacity. The authors summarize strengths and limitations of the RE-AIM model and recommend areas for future research and application.
Publication
Journal: Nature
June/20/2005
Abstract
MicroRNAs (miRNAs) are 21-23 nucleotide RNA molecules that regulate the stability or translational efficiency of target messenger RNAs. miRNAs have diverse functions, including the regulation of cellular differentiation, proliferation and apoptosis. Although strict tissue- and developmental-stage-specific expression is critical for appropriate miRNA function, mammalian transcription factors that regulate miRNAs have not yet been identified. The proto-oncogene c-MYC encodes a transcription factor that regulates cell proliferation, growth and apoptosis. Dysregulated expression or function of c-Myc is one of the most common abnormalities in human malignancy. Here we show that c-Myc activates expression of a cluster of six miRNAs on human chromosome 13. Chromatin immunoprecipitation experiments show that c-Myc binds directly to this locus. The transcription factor E2F1 is an additional target of c-Myc that promotes cell cycle progression. We find that expression of E2F1 is negatively regulated by two miRNAs in this cluster, miR-17-5p and miR-20a. These findings expand the known classes of transcripts within the c-Myc target gene network, and reveal a mechanism through which c-Myc simultaneously activates E2F1 transcription and limits its translation, allowing a tightly controlled proliferative signal.
Publication
Journal: New England Journal of Medicine
July/24/2013
Abstract
BACKGROUND
The programmed death 1 (PD-1) receptor is a negative regulator of T-cell effector mechanisms that limits immune responses against cancer. We tested the anti-PD-1 antibody lambrolizumab (previously known as MK-3475) in patients with advanced melanoma.
METHODS
We administered lambrolizumab intravenously at a dose of 10 mg per kilogram of body weight every 2 or 3 weeks or 2 mg per kilogram every 3 weeks in patients with advanced melanoma, both those who had received prior treatment with the immune checkpoint inhibitor ipilimumab and those who had not. Tumor responses were assessed every 12 weeks.
RESULTS
A total of 135 patients with advanced melanoma were treated. Common adverse events attributed to treatment were fatigue, rash, pruritus, and diarrhea; most of the adverse events were low grade. The confirmed response rate across all dose cohorts, evaluated by central radiologic review according to the Response Evaluation Criteria in Solid Tumors (RECIST), version 1.1, was 38% (95% confidence interval [CI], 25 to 44), with the highest confirmed response rate observed in the cohort that received 10 mg per kilogram every 2 weeks (52%; 95% CI, 38 to 66). The response rate did not differ significantly between patients who had received prior ipilimumab treatment and those who had not (confirmed response rate, 38% [95% CI, 23 to 55] and 37% [95% CI, 26 to 49], respectively). Responses were durable in the majority of patients (median follow-up, 11 months among patients who had a response); 81% of the patients who had a response (42 of 52) were still receiving treatment at the time of analysis in March 2013. The overall median progression-free survival among the 135 patients was longer than 7 months.
CONCLUSIONS
In patients with advanced melanoma, including those who had had disease progression while they had been receiving ipilimumab, treatment with lambrolizumab resulted in a high rate of sustained tumor regression, with mainly grade 1 or 2 toxic effects. (Funded by Merck Sharp and Dohme; ClinicalTrials.gov number, NCT01295827.).
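Intervals like these can be re-derived from the raw counts. The sketch below assumes 52 confirmed responders among all 135 patients, a figure inferred from the "42 of 52" statement above rather than stated directly; the paper's central-review denominator may differ, so this is only a rough cross-check.

```python
# A minimal sketch: exact (Clopper-Pearson) CI for a response rate, assuming
# 52 responders out of 135 patients (inferred, not stated, in the abstract).
from scipy.stats import binomtest

res = binomtest(k=52, n=135)
ci = res.proportion_ci(confidence_level=0.95, method="exact")
print(f"rate = {52/135:.0%}, 95% CI = ({ci.low:.0%}, {ci.high:.0%})")
```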
Publication
Journal: Journal of Clinical Epidemiology
January/12/1997
Abstract
We performed a Monte Carlo study to evaluate the effect of the number of events per variable (EPV) analyzed in logistic regression analysis. The simulations were based on data from a cardiac trial of 673 patients in which 252 deaths occurred and seven variables were cogent predictors of mortality; the number of events per predictive variable was (252/7 =) 36 for the full sample. For the simulations, at values of EPV = 2, 5, 10, 15, 20, and 25, we randomly generated 500 samples of the 673 patients, chosen with replacement, according to a logistic model derived from the full sample. Simulation results for the regression coefficients for each variable in each group of 500 samples were compared for bias, precision, and significance testing against the results of the model fitted to the original sample. For EPV values of 10 or greater, no major problems occurred. For EPV values less than 10, however, the regression coefficients were biased in both positive and negative directions; the large sample variance estimates from the logistic model both overestimated and underestimated the sample variance of the regression coefficients; the 90% confidence limits about the estimated values did not have proper coverage; the Wald statistic was conservative under the null hypothesis; and paradoxical associations (significance in the wrong direction) were increased. Although other factors (such as the total number of events, or sample size) may influence the validity of the logistic model, our findings indicate that low EPV can lead to major problems.
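The design translates directly into code. The sketch below re-creates the experiment qualitatively, using synthetic standard-normal predictors and an arbitrary "true" logistic model rather than the paper's cardiac-trial data; the coefficient values, event rate, and simulation counts are all illustrative assumptions.

```python
# A minimal sketch of the events-per-variable (EPV) simulation, under
# simplified assumptions rather than the paper's cardiac-trial data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
p = 7                      # seven predictors, as in the paper
beta = np.full(p, 0.4)     # hypothetical true coefficients
b0 = -1.5                  # intercept -> marginal event rate near 0.2

def mean_fitted_coef(epv, n_sims=200):
    n = int(round(epv * p / 0.2))          # sample size for the target EPV
    fits = []
    for _ in range(n_sims):
        X = rng.standard_normal((n, p))
        prob = 1 / (1 + np.exp(-(b0 + X @ beta)))
        y = rng.binomial(1, prob)
        try:
            res = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
            fits.append(res.params[1:])    # drop the intercept
        except Exception:                  # e.g., perfect separation at low n
            pass
    return float(np.mean(fits))

for epv in (2, 5, 10, 25):
    print(f"EPV={epv:>2}: mean fitted coefficient ~ {mean_fitted_coef(epv):.3f}"
          f" (true value {beta[0]})")
```

At low EPV the averaged coefficients drift away from the true value, echoing the bias the paper reports.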
Publication
Journal: Clinical Chemistry
May/13/1993
Abstract
The clinical performance of a laboratory test can be described in terms of diagnostic accuracy, or the ability to correctly classify subjects into clinically relevant subgroups. Diagnostic accuracy refers to the quality of the information provided by the classification device and should be distinguished from the usefulness, or actual practical value, of the information. Receiver-operating characteristic (ROC) plots provide a pure index of accuracy by demonstrating the limits of a test's ability to discriminate between alternative states of health over the complete spectrum of operating conditions. Furthermore, ROC plots occupy a central or unifying position in the process of assessing and using diagnostic tools. Once the plot is generated, a user can readily go on to many other activities such as performing quantitative ROC analysis and comparisons of tests, using likelihood ratio to revise the probability of disease in individual subjects, selecting decision thresholds, using logistic-regression analysis, using discriminant-function analysis, or incorporating the tool into a clinical strategy by using decision analysis.
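For readers who want to generate such a plot, the sketch below builds an ROC curve for a hypothetical quantitative test using scikit-learn; the score distributions are invented for illustration.

```python
# A minimal sketch: ROC curve for a hypothetical test in which diseased
# subjects score higher on average. All data here are simulated.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
y_true = np.r_[np.zeros(200), np.ones(200)]       # 0 = healthy, 1 = diseased
y_score = np.r_[rng.normal(1.0, 1.0, 200),        # healthy test values
                rng.normal(2.0, 1.0, 200)]        # diseased test values

fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("AUC =", round(roc_auc_score(y_true, y_score), 3))
# each (fpr, tpr) pair is one decision threshold; plotting tpr against fpr
# traces the full spectrum of operating conditions the abstract describes
```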
Publication
Journal: The Lancet
April/12/2009
Abstract
BACKGROUND
The main associations of body-mass index (BMI) with overall and cause-specific mortality can best be assessed by long-term prospective follow-up of large numbers of people. The Prospective Studies Collaboration aimed to investigate these associations by sharing data from many studies.
METHODS
Collaborative analyses were undertaken of baseline BMI versus mortality in 57 prospective studies with 894 576 participants, mostly in western Europe and North America (61% [n=541 452] male, mean recruitment age 46 [SD 11] years, median recruitment year 1979 [IQR 1975-85], mean BMI 25 [SD 4] kg/m²). The analyses were adjusted for age, sex, smoking status, and study. To limit reverse causality, the first 5 years of follow-up were excluded, leaving 66 552 deaths of known cause during a mean of 8 (SD 6) further years of follow-up (mean age at death 67 [SD 10] years): 30 416 vascular; 2070 diabetic, renal or hepatic; 22 592 neoplastic; 3770 respiratory; 7704 other.
RESULTS
In both sexes, mortality was lowest at about 22.5-25 kg/m². Above this range, positive associations were recorded for several specific causes and inverse associations for none, the absolute excess risks for higher BMI and smoking were roughly additive, and each 5 kg/m² higher BMI was on average associated with about 30% higher overall mortality (hazard ratio per 5 kg/m² [HR] 1.29 [95% CI 1.27-1.32]): 40% for vascular mortality (HR 1.41 [1.37-1.45]); 60-120% for diabetic, renal, and hepatic mortality (HRs 2.16 [1.89-2.46], 1.59 [1.27-1.99], and 1.82 [1.59-2.09], respectively); 10% for neoplastic mortality (HR 1.10 [1.06-1.15]); and 20% for respiratory and for all other mortality (HRs 1.20 [1.07-1.34] and 1.20 [1.16-1.25], respectively). Below the range 22.5-25 kg/m², BMI was associated inversely with overall mortality, mainly because of strong inverse associations with respiratory disease and lung cancer. These inverse associations were much stronger for smokers than for non-smokers, despite cigarette consumption per smoker varying little with BMI.
CONCLUSIONS
Although other anthropometric measures (eg, waist circumference, waist-to-hip ratio) could well add extra information to BMI, and BMI to them, BMI is in itself a strong predictor of overall mortality both above and below the apparent optimum of about 22.5-25 kg/m². The progressive excess mortality above this range is due mainly to vascular disease and is probably largely causal. At 30-35 kg/m², median survival is reduced by 2-4 years; at 40-45 kg/m², it is reduced by 8-10 years (which is comparable with the effects of smoking). The definite excess mortality below 22.5 kg/m² is due mainly to smoking-related diseases, and is not fully explained.
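Because hazard ratios combine multiplicatively on the log scale, the reported HR of 1.29 per 5 kg/m² can be rescaled to other BMI increments, as this short sketch shows:

```python
# Rescaling the reported hazard ratio of 1.29 per 5 kg/m^2 of BMI.
hr_per_5 = 1.29
print(hr_per_5 ** (1 / 5))   # per 1 kg/m^2  -> ~1.05
print(hr_per_5 ** 2)         # per 10 kg/m^2 -> ~1.66 (e.g., BMI 35 vs 25)
```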
Publication
Journal: Acta Crystallographica Section D: Biological Crystallography
June/18/2012
Abstract
phenix.refine is a program within the PHENIX package that supports crystallographic structure refinement against experimental data with a wide range of upper resolution limits using a large repertoire of model parameterizations. It has several automation features and is also highly flexible. Several hundred parameters enable extensive customizations for complex use cases. Multiple user-defined refinement strategies can be applied to specific parts of the model in a single refinement run. An intuitive graphical user interface is available to guide novice users and to assist advanced users in managing refinement projects. X-ray or neutron diffraction data can be used separately or jointly in refinement. phenix.refine is tightly integrated into the PHENIX suite, where it serves as a critical component in automated model building, final structure refinement, structure validation and deposition to the wwPDB. This paper presents an overview of the major phenix.refine features, with extensive literature references for readers interested in more detailed discussions of the methods.
Publication
Journal: Nature Reviews Cancer
October/6/2002
Abstract
Tissue homeostasis is regulated by apoptosis, the cell-suicide programme that is executed by proteases called caspases. The Bcl2 family of intracellular proteins is the central regulator of caspase activation, and its opposing factions of anti- and pro-apoptotic members arbitrate the life-or-death decision. Apoptosis is often impaired in cancer and can limit conventional therapy. A better understanding of how the Bcl2 family controls caspase activation should result in new, more effective therapeutic approaches.
Publication
Journal: Science
October/25/1989
Abstract
Electrospray ionization has recently emerged as a powerful technique for producing intact ions in vacuo from large and complex species in solution. To an extent greater than has previously been possible with the more familiar "soft" ionization methods, this technique makes the power and elegance of mass spectrometric analysis applicable to the large and fragile polar molecules that play such vital roles in biological systems. The distinguishing features of electrospray spectra for large molecules are coherent sequences of peaks whose component ions are multiply charged, the ions of each peak differing by one charge from those of adjacent neighbors in the sequence. Spectra have been obtained for biopolymers including oligonucleotides and proteins, the latter having molecular weights up to 130,000, with as yet no evidence of an upper limit.
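The multiply charged peak series makes the molecular weight directly computable: two adjacent peaks, differing by one charge, give two equations in two unknowns. The sketch below works through hypothetical peaks for a protein of roughly 17 kDa.

```python
# A minimal sketch of charge-state deconvolution. For a protein carrying z
# protons, m/z = (M + z*mH)/z; adjacent peaks (charges z and z+1) therefore
# determine both z and the neutral mass M. Peak values below are hypothetical.
M_H = 1.00728                                 # proton mass, Da

def charge_and_mass(mz_high, mz_low):
    """mz_high, mz_low: adjacent peaks; the lower m/z carries one more charge."""
    z = round((mz_low - M_H) / (mz_high - mz_low))   # charge at mz_high
    return z, z * (mz_high - M_H)                    # (charge, neutral mass M)

print(charge_and_mass(998.15, 942.76))        # -> (17, ~16951 Da)
```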
Publication
Journal: Neural Computation
November/29/1995
Abstract
We derive a new self-organizing learning algorithm that maximizes the information transferred in a network of nonlinear units. The algorithm does not assume any knowledge of the input distributions, and is defined here for the zero-noise limit. Under these conditions, information maximization has extra properties not found in the linear case (Linsker 1989). The nonlinearities in the transfer function are able to pick up higher-order moments of the input distributions and perform something akin to true redundancy reduction between units in the output representation. This enables the network to separate statistically independent components in the inputs: a higher-order generalization of principal components analysis. We apply the network to the source separation (or cocktail party) problem, successfully separating unknown mixtures of up to 10 speakers. We also show that a variant on the network architecture is able to perform blind deconvolution (cancellation of unknown echoes and reverberation in a speech signal). Finally, we derive dependencies of information transfer on time delays. We suggest that information maximization provides a unifying framework for problems in "blind" signal processing.
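A compact version of the learning rule fits in a few lines. The sketch below separates two synthetic super-Gaussian sources; it uses the natural-gradient form of the infomax update, a later refinement that avoids the matrix inverse appearing in the rule as originally published, and all sources, mixing weights, and learning schedules are illustrative choices.

```python
# A minimal infomax ICA sketch on a toy two-source "cocktail party" problem.
# Update: W <- W + lr * (I + (1 - 2y) u^T) W with u = W x, y = sigmoid(u).
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 20_000
S = rng.laplace(0, 1, (n, T))            # super-Gaussian sources
A = rng.standard_normal((n, n))          # unknown mixing matrix
X = A @ S                                # observed mixtures

W, lr, batch = np.eye(n), 0.01, 100
for epoch in range(50):
    for t in range(0, T, batch):
        x = X[:, t:t + batch]
        u = W @ x
        y = 1 / (1 + np.exp(-u))         # sigmoidal transfer function
        W += lr * (np.eye(n) + (1 - 2 * y) @ u.T / batch) @ W

print(np.round(W @ A, 2))  # ~ scaled permutation matrix if separation worked
```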
Publication
Journal: Nature Immunology
May/12/2008
Abstract
Natural killer (NK) cells are effector lymphocytes of the innate immune system that control several types of tumors and microbial infections by limiting their spread and subsequent tissue damage. Recent research highlights the fact that NK cells are also regulatory cells engaged in reciprocal interactions with dendritic cells, macrophages, T cells and endothelial cells. NK cells can thus limit or exacerbate immune responses. Although NK cells might appear to be redundant in several conditions of immune challenge in humans, NK cell manipulation seems to hold promise in efforts to improve hematopoietic and solid organ transplantation, promote antitumor immunotherapy and control inflammatory and autoimmune disorders.
Publication
Journal: Medical Care
May/22/2003
Abstract
BACKGROUND
A number of studies have computed the minimally important difference (MID) for health-related quality of life instruments.
OBJECTIVE
To determine whether there is consistency in the magnitude of MID estimates from different instruments.
METHODS
We conducted a systematic review of the literature to identify studies that computed an MID and contained sufficient information to compute an effect size (ES). Thirty-eight studies fulfilled the criteria, resulting in 62 ESs.
RESULTS
For all but 6 studies, the MID estimates were close to one half a SD (mean = 0.495, SD = 0.155). There was no consistent relationship with factors such as disease-specific or generic instrument or the number of response options. Negative changes were not associated with larger ESs. Population-based estimation procedures and brief follow-up were associated with smaller ESs, and acute conditions with larger ESs. An explanation for this consistency is that research in psychology has shown that the limit of people's ability to discriminate over a wide range of tasks is approximately 1 part in 7, which is very close to half a SD.
CONCLUSIONS
In most circumstances, the threshold of discrimination for changes in health-related quality of life for chronic diseases appears to be approximately half a SD.
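The effect-size arithmetic behind the half-SD conclusion is simple to demonstrate; the instrument scores and MID below are hypothetical.

```python
# A minimal sketch: expressing a minimally important difference as an effect
# size (MID divided by the SD of scores). All numbers are hypothetical.
import numpy as np

scores = np.array([62, 55, 71, 48, 66, 59, 73, 50, 64, 58], dtype=float)
mid = 5.0                                  # hypothetical MID on the raw scale
print(f"ES = {mid / scores.std(ddof=1):.2f}")   # compare with ~0.5
```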
Publication
Journal: Stroke
June/19/2013
Abstract
OBJECTIVE
The authors present an overview of the current evidence and management recommendations for evaluation and treatment of adults with acute ischemic stroke. The intended audiences are prehospital care providers, physicians, allied health professionals, and hospital administrators responsible for the care of acute ischemic stroke patients within the first 48 hours from stroke onset. These guidelines supersede the prior 2007 guidelines and 2009 updates.
METHODS
Members of the writing committee were appointed by the American Stroke Association Stroke Council's Scientific Statement Oversight Committee, representing various areas of medical expertise. Strict adherence to the American Heart Association conflict of interest policy was maintained throughout the consensus process. Panel members were assigned topics relevant to their areas of expertise, reviewed the stroke literature with emphasis on publications since the prior guidelines, and drafted recommendations in accordance with the American Heart Association Stroke Council's Level of Evidence grading algorithm.
RESULTS
The goal of these guidelines is to limit the morbidity and mortality associated with stroke. The guidelines support the overarching concept of stroke systems of care and detail aspects of stroke care from patient recognition; emergency medical services activation, transport, and triage; through the initial hours in the emergency department and stroke unit. The guidelines discuss early stroke evaluation and general medical care, as well as ischemic stroke, specific interventions such as reperfusion strategies, and general physiological optimization for cerebral resuscitation.
CONCLUSIONS
Because many of the recommendations are based on limited data, additional research on treatment of acute ischemic stroke remains urgently needed.
Publication
Journal: Clinical Neurophysiology
December/20/2009
Abstract
This article is based on a consensus conference, which took place in Certosa di Pontignano, Siena (Italy) on March 7-9, 2008, intended to update the previous safety guidelines for the application of transcranial magnetic stimulation (TMS) in research and clinical settings. Over the past decade the scientific and medical community has had the opportunity to evaluate the safety record of research studies and clinical applications of TMS and repetitive TMS (rTMS). In these years the number of applications of conventional TMS has grown impressively, new paradigms of stimulation have been developed (e.g., patterned repetitive TMS) and technical advances have led to new device designs and to the real-time integration of TMS with electroencephalography (EEG), positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). Thousands of healthy subjects and patients with various neurological and psychiatric diseases have undergone TMS allowing a better assessment of relative risks. The occurrence of seizures (i.e., the most serious TMS-related acute adverse effect) has been extremely rare, with most of the few new cases receiving rTMS exceeding previous guidelines, often in patients under treatment with drugs which potentially lower the seizure threshold. The present updated guidelines review issues of risk and safety of conventional TMS protocols, address the undesired effects and risks of emerging TMS interventions, the applications of TMS in patients with implanted electrodes in the central nervous system, and safety aspects of TMS in neuroimaging environments. We cover recommended limits of stimulation parameters and other important precautions, monitoring of subjects, expertise of the rTMS team, and ethical issues. While all the recommendations here are expert based, they utilize published data to the extent possible.
Publication
Journal: Behavioral and Brain Sciences
January/23/2002
Abstract
Miller (1956) summarized evidence that people can remember about seven chunks in short-term memory (STM) tasks. However, that number was meant more as a rough estimate and a rhetorical device than as a real capacity limit. Others have since suggested that there is a more precise capacity limit, but that it is only three to five chunks. The present target article brings together a wide variety of data on capacity limits suggesting that the smaller capacity limit is real. Capacity limits will be useful in analyses of information processing only if the boundary conditions for observing them can be carefully described. Four basic conditions in which chunks can be identified and capacity limits can accordingly be observed are: (1) when information overload limits chunks to individual stimulus items, (2) when other steps are taken specifically to block the recoding of stimulus items into larger chunks, (3) in performance discontinuities caused by the capacity limit, and (4) in various indirect effects of the capacity limit. Under these conditions, rehearsal and long-term memory cannot be used to combine stimulus items into chunks of an unknown size; nor can storage mechanisms that are not capacity-limited, such as sensory memory, allow the capacity-limited storage mechanism to be refilled during recall. A single, central capacity limit averaging about four chunks is implicated along with other, noncapacity-limited sources. The pure STM capacity limit expressed in chunks is distinguished from compound STM limits obtained when the number of separately held chunks is unclear. Reasons why pure capacity estimates fall within a narrow range are discussed and a capacity limit for the focus of attention is proposed.
Publication
Journal: Biology of Blood and Marrow Transplantation
June/6/2006
Abstract
This consensus document is intended to serve 3 functions. First, it standardizes the criteria for diagnosis of chronic graft-versus-host disease (GVHD). Second, it proposes a new clinical scoring system (0-3) that describes the extent and severity of chronic GVHD for each organ or site at any given time, taking functional impact into account. Third, it proposes new guidelines for global assessment of chronic GVHD severity that are based on the number of organs or sites involved and the degree of involvement in affected organs (mild, moderate, or severe). Diagnosis of chronic GVHD requires the presence of at least 1 diagnostic clinical sign of chronic GVHD (e.g., poikiloderma or esophageal web) or the presence of at least 1 distinctive manifestation (e.g., keratoconjunctivitis sicca) confirmed by pertinent biopsy or other relevant tests (e.g., Schirmer test) in the same or another organ. Furthermore, other possible diagnoses for clinical symptoms must be excluded. No time limit is set for the diagnosis of chronic GVHD. The Working Group recognized 2 main categories of GVHD, each with 2 subcategories. The acute GVHD category is defined in the absence of diagnostic or distinctive features of chronic GVHD and includes (1) classic acute GVHD occurring within 100 days after transplantation and (2) persistent, recurrent, or late acute GVHD (features of acute GVHD occurring beyond 100 days, often during withdrawal of immune suppression). The broad category of chronic GVHD includes (1) classic chronic GVHD (without features or characteristics of acute GVHD) and (2) an overlap syndrome in which diagnostic or distinctive features of chronic GVHD and acute GVHD appear together. It is currently recommended that systemic therapy be considered for patients who meet criteria for chronic GVHD of moderate to severe global severity.
Publication
Journal: Nature Reviews Immunology
July/16/2008
Abstract
Regulatory T (Treg) cells are essential for maintaining peripheral tolerance, preventing autoimmune diseases and limiting chronic inflammatory diseases. However, they also limit beneficial responses by suppressing sterilizing immunity and limiting antitumour immunity. Given that Treg cells can have both beneficial and deleterious effects, there is considerable interest in determining their mechanisms of action. In this Review, we describe the basic mechanisms used by Treg cells to mediate suppression and discuss whether one or many of these mechanisms are likely to be crucial for Treg-cell function. In addition, we propose the hypothesis that effector T cells may not be 'innocent' parties in this suppressive process and might in fact potentiate Treg-cell function.
Publication
Journal: Science
May/20/2009
Abstract
Reprogramming differentiated human cells to induced pluripotent stem (iPS) cells has applications in basic biology, drug development, and transplantation. Human iPS cell derivation previously required vectors that integrate into the genome, which can create mutations and limit the utility of the cells in both research and clinical applications. We describe the derivation of human iPS cells with the use of nonintegrating episomal vectors. After removal of the episome, iPS cells completely free of vector and transgene sequences are derived that are similar to human embryonic stem (ES) cells in proliferative and developmental potential. These results demonstrate that reprogramming human somatic cells does not require genomic integration or the continued presence of exogenous reprogramming factors and removes one obstacle to the clinical application of human iPS cells.
Publication
Journal: Medicine and Science in Sports and Exercise
May/12/2009
Abstract
Statistical guidelines and expert statements are now available to assist in the analysis and reporting of studies in some biomedical disciplines. We present here a more progressive resource for sample-based studies, meta-analyses, and case studies in sports medicine and exercise science. We offer forthright advice on the following controversial or novel issues: using precision of estimation for inferences about population effects in preference to null-hypothesis testing, which is inadequate for assessing clinical or practical importance; justifying sample size via acceptable precision or confidence for clinical decisions rather than via adequate power for statistical significance; showing SD rather than SEM, to better communicate the magnitude of differences in means and nonuniformity of error; avoiding purely nonparametric analyses, which cannot provide inferences about magnitude and are unnecessary; using regression statistics in validity studies, in preference to the impractical and biased limits of agreement; making greater use of qualitative methods to enrich sample-based quantitative projects; and seeking ethics approval for public access to the depersonalized raw data of a study, to address the need for more scrutiny of research and better meta-analyses. Advice on less contentious issues includes the following: using covariates in linear models to adjust for confounders, to account for individual differences, and to identify potential mechanisms of an effect; using log transformation to deal with nonuniformity of effects and error; identifying and deleting outliers; presenting descriptive, effect, and inferential statistics in appropriate formats; and contending with bias arising from problems with sampling, assignment, blinding, measurement error, and researchers' prejudices. This article should advance the field by stimulating debate, promoting innovative approaches, and serving as a useful checklist for authors, reviewers, and editors.
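One of the less contentious points, the distinction between SD and SEM, is easy to demonstrate numerically; the measurements in the sketch below are simulated.

```python
# SD describes the spread of individual values; SEM (= SD / sqrt(n)) describes
# only the precision of the mean, so plotting SEM understates between-subject
# variability. Data here are simulated.
import numpy as np

x = np.random.default_rng(1).normal(50, 10, size=25)
sd = x.std(ddof=1)
print(f"mean = {x.mean():.1f}, SD = {sd:.1f}, SEM = {sd / np.sqrt(len(x)):.1f}")
```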
Publication
Journal: Acta Crystallographica Section D: Biological Crystallography
October/16/2013
Abstract
Following integration of the observed diffraction spots, the process of 'data reduction' initially aims to determine the point-group symmetry of the data and the likely space group. This can be performed with the program POINTLESS. The scaling program then puts all the measurements on a common scale, averages measurements of symmetry-related reflections (using the symmetry determined previously) and produces many statistics that provide the first important measures of data quality. A new scaling program, AIMLESS, implements scaling models similar to those in SCALA but adds some additional analyses. From the analyses, a number of decisions can be made about the quality of the data and whether some measurements should be discarded. The effective 'resolution' of a data set is a difficult and possibly contentious question (particularly with referees of papers) and this is discussed in the light of tests comparing the data-processing statistics with trials of refinement against observed and simulated data, and automated model-building and comparison of maps calculated with different resolution limits. These trials show that adding weak high-resolution data beyond the commonly used limits may make some improvement and does no harm.
Publication
Journal: Nature
February/15/2007
Abstract
Although cancer arises from a combination of mutations in oncogenes and tumour suppressor genes, the extent to which tumour suppressor gene loss is required for maintaining established tumours is poorly understood. p53 is an important tumour suppressor that acts to restrict proliferation in response to DNA damage or deregulation of mitogenic oncogenes, by leading to the induction of various cell cycle checkpoints, apoptosis or cellular senescence. Consequently, p53 mutations increase cell proliferation and survival, and in some settings promote genomic instability and resistance to certain chemotherapies. To determine the consequences of reactivating the p53 pathway in tumours, we used RNA interference (RNAi) to conditionally regulate endogenous p53 expression in a mosaic mouse model of liver carcinoma. We show that even brief reactivation of endogenous p53 in p53-deficient tumours can produce complete tumour regressions. The primary response to p53 was not apoptosis, but instead involved the induction of a cellular senescence program that was associated with differentiation and the upregulation of inflammatory cytokines. This program, although producing only cell cycle arrest in vitro, also triggered an innate immune response that targeted the tumour cells in vivo, thereby contributing to tumour clearance. Our study indicates that p53 loss can be required for the maintenance of aggressive carcinomas, and illustrates how the cellular senescence program can act together with the innate immune system to potently limit tumour growth.
Publication
Journal: Journal of Chemical Information and Modeling
February/24/2005
Abstract
A critical barrier to entry into structure-based virtual screening is the lack of a suitable, easy to access database of purchasable compounds. We have therefore prepared a library of 727,842 molecules, each with 3D structure, using catalogs of compounds from vendors (the size of this library continues to grow). The molecules have been assigned biologically relevant protonation states and are annotated with properties such as molecular weight, calculated LogP, and number of rotatable bonds. Each molecule in the library contains vendor and purchasing information and is ready for docking using a number of popular docking programs. Within certain limits, the molecules are prepared in multiple protonation states and multiple tautomeric forms. In one format, multiple conformations are available for the molecules. This database is available for free download (http://zinc.docking.org) in several common file formats including SMILES, mol2, 3D SDF, and DOCK flexibase format. A Web-based query tool incorporating a molecular drawing interface enables the database to be searched and browsed and subsets to be created. Users can process their own molecules by uploading them to a server. Our hope is that this database will bring virtual screening libraries to a wide community of structural biologists and medicinal chemists.
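As an illustration of how such a library might be consumed downstream, the sketch below recomputes the annotated properties for a few toy SMILES strings with RDKit; RDKit is not part of the database itself, and the molecules shown are illustrative stand-ins, not entries from the library.

```python
# A minimal sketch: recomputing molecular weight, LogP, and rotatable-bond
# counts (the properties the database annotates) with RDKit. The SMILES
# strings are toy examples.
from rdkit import Chem
from rdkit.Chem import Descriptors

for smi in ["CCO", "c1ccccc1C(=O)O", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]:
    mol = Chem.MolFromSmiles(smi)
    print(smi, round(Descriptors.MolWt(mol), 1),
          round(Descriptors.MolLogP(mol), 2),
          Descriptors.NumRotatableBonds(mol))
```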
Publication
Journal: Biophysical Journal
February/26/2007
Abstract
Biological structures span many orders of magnitude in size, but far-field visible light microscopy suffers from limited resolution. A new method for fluorescence imaging has been developed that can obtain spatial distributions of large numbers of fluorescent molecules on length scales shorter than the classical diffraction limit. Fluorescence photoactivation localization microscopy (FPALM) analyzes thousands of single fluorophores per acquisition, localizing small numbers of them at a time, at low excitation intensity. To control the number of visible fluorophores in the field of view and ensure that optically active molecules are separated by much more than the width of the point spread function, photoactivatable fluorescent molecules are used, in this case the photoactivatable green fluorescent protein (PA-GFP). For these photoactivatable molecules, the activation rate is controlled by the activation illumination intensity; nonfluorescent inactive molecules are activated by a high-frequency (405-nm) laser and are then fluorescent when excited at a lower frequency. The fluorescence is imaged by a CCD camera, and then the molecules are either reversibly inactivated or irreversibly photobleached to remove them from the field of view. The rate of photobleaching is controlled by the intensity of the laser used to excite the fluorescence, in this case an Ar+ ion laser. Because only a small number of molecules are visible at a given time, their positions can be determined precisely; with only approximately 100 detected photons per molecule, the localization precision can be as much as 10-fold better than the resolution, depending on background levels. Heterogeneities on length scales of the order of tens of nanometers are observed by FPALM of PA-GFP on glass. FPALM images are compared with images of the same molecules by widefield fluorescence. FPALM images of PA-GFP on a terraced sapphire crystal surface were compared with atomic force microscopy and show that the full width at half-maximum of features approximately 86 ± 4 nm is significantly better than the expected diffraction-limited optical resolution. The number of fluorescent molecules and their brightness distribution have also been determined using FPALM. This new method suggests a means to address a significant number of biological questions that had previously been limited by microscope resolution.
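The photon-count argument can be made quantitative with the standard localization-precision estimate σ ≈ s/√N, ignoring background and pixelation; the imaging parameters in the sketch below are assumptions.

```python
# A minimal sketch of why localization beats the diffraction limit: the
# centroid of one molecule's PSF is determined to about s / sqrt(N) for N
# detected photons (background and pixelation terms neglected). Wavelength
# and numerical aperture are hypothetical.
import numpy as np

wavelength_nm, NA = 510, 1.2
s = 0.21 * wavelength_nm / NA              # approximate PSF Gaussian sigma
for N in (100, 1000):
    print(f"N={N:>4} photons: precision ~ {s / np.sqrt(N):.1f} nm")
# ~100 photons already gives ~9 nm, roughly 10x below the PSF width, matching
# the ~10-fold improvement quoted in the abstract
```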