Publication
Journal: Visual Neuroscience
January/10/2001
Abstract
Absorbance spectra were recorded by microspectrophotometry from 39 different rod and cone types representing amphibians, reptiles, and fishes, with A1- or A2-based visual pigments and lambdamax ranging from 357 to 620 nm. The purpose was to investigate accuracy limits of putative universal templates for visual pigment absorbance spectra, and if possible to amend the templates to overcome the limitations. It was found that (1) the absorbance spectrum of frog rhodopsin extract very precisely parallels that of rod outer segments from the same individual, with only a slight hypsochromic shift in lambdamax, hence templates based on extracts are valid for absorbance in situ; (2) a template based on the bovine rhodopsin extract data of Partridge and De Grip (1991) describes the absorbance of amphibian rod outer segments excellently, contrary to recent electrophysiological results; (3) the lambdamax/lambda invariance of spectral shape fails for A1 pigments with small lambdamax and for A2 pigments with large lambdamax, but the deviations are systematic and can be readily incorporated into, for example, the Lamb (1995) template. We thus propose modified templates for the main "alpha-band" of A1 and A2 pigments and show that these describe both absorbance and spectral sensitivities of photoreceptors over the whole range of lambdamax. Subtraction of the alpha-band from the full absorbance spectrum leaves a "beta-band" described by a lambdamax-dependent Gaussian. We conclude that the idea of universal templates (one for A1- and one for A2-based visual pigments) remains valid and useful at the present level of accuracy of data on photoreceptor absorbance and sensitivity. The sum of our expressions for the alpha- and beta-band gives a good description of visual pigment spectra with lambdamax > 350 nm.
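The A1 alpha-band template of this kind is usually quoted as a sum-of-exponentials expression in x = lambdamax/lambda. The sketch below uses the constants as commonly cited for the published A1 template; the values are reproduced from memory and should be checked against the paper before use:

```python
import math

def a1_alpha_band(wavelength_nm, lmax):
    """Alpha-band absorbance template for an A1 visual pigment,
    normalized so the value at lambda = lmax is approximately 1."""
    x = lmax / wavelength_nm
    # lmax-dependent shape parameter of the template
    a = 0.8795 + 0.0459 * math.exp(-((lmax - 300.0) ** 2) / 11940.0)
    A, B, C, D = 69.7, 28.0, -14.9, 0.674
    b, c = 0.922, 1.104
    return 1.0 / (math.exp(A * (a - x)) + math.exp(B * (b - x))
                  + math.exp(C * (c - x)) + D)
```

At lambda = lambdamax the template evaluates to roughly 1 and it falls off steeply on the long-wavelength limb, as expected for a peak-normalized absorbance curve.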
Publication
Journal: Yeast
September/26/2004
Abstract
Green fluorescent protein (GFP) has become an increasingly popular protein tag for determining protein localization and abundance. With the availability of GFP variants with altered fluorescence spectra, as well as GFP homologues from other organisms, multi-colour fluorescence with protein tags is now possible, as is measuring protein interactions using fluorescence resonance energy transfer (FRET). We have created a set of yeast tagging vectors containing codon-optimized variants of GFP, CFP (cyan), YFP (yellow), and Sapphire (a UV-excitable GFP). These codon-optimized tags are twice as detectable as unoptimized tags. We have also created a tagging vector containing the monomeric DsRed construct tdimer2, which is up to 15-fold more detectable than tags currently in use. These tags significantly improve the detection limits for live-cell fluorescence imaging in yeast, and provide sufficient distinguishable fluorophores for four-colour imaging.
Publication
Journal: Journal of Pharmacokinetics and Pharmacodynamics
April/29/2002
Abstract
Pharmacokinetic data consist of drug concentration measurements, as well as reports of some measured concentrations being below the quantification limit of the assay (BQL). A pharmacokinetic model may be fit to these data, and for this purpose the BQL observations must be either discarded or handled in a special way. In this paper, seven methods for dealing with BQL observations are evaluated. Both single-subject and population data are simulated from a one-compartment model. A moderate amount of data is simulated for each individual. The actual CV of concentration measurements at the quantification limit is assumed to be no greater than 20%, in accord with the FDA Guidance. The results of this paper should be interpreted in this context. The methods include handling BQL observations as fixed-point censored observations, i.e., by using the likelihoods that these observations are in fact BQL. This method is shown to have some overall statistical advantage. However, the gain in using this method over that of simply discarding the BQL observations is not always substantial, especially when the frequency of BQL observations is small. Some simple methods entailing (i) replacing one or more BQL observations with the value 0, or (ii) replacing them with the value QL/2, where QL is the quantification limit, are also included. The first of these two approaches should not be used. With population data, use of the second approach can result in some noticeably improved estimation of the typical value of a parameter, but then there is also marked degradation in the estimation of the population variance of the parameter.
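The censored-observation method described above can be sketched as follows: a BQL report contributes the probability that the measurement falls below the quantification limit, while a quantified observation contributes the usual density. This minimal sketch assumes additive Gaussian residual error; the function and variable names are illustrative, not from the paper:

```python
import math

def normal_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def log_likelihood(observations, predictions, sigma, ql):
    """Log-likelihood treating BQL values (reported as None) as
    left-censored at the quantification limit ql."""
    ll = 0.0
    for obs, pred in zip(observations, predictions):
        if obs is None:
            # BQL: probability mass below the quantification limit
            ll += math.log(normal_cdf((ql - pred) / sigma))
        else:
            # quantified observation: Gaussian log-density
            ll += (-0.5 * math.log(2.0 * math.pi * sigma ** 2)
                   - (obs - pred) ** 2 / (2.0 * sigma ** 2))
    return ll
```

A BQL point whose model prediction lies far below the limit contributes almost nothing (log of a probability near 1), whereas a BQL point predicted well above the limit is heavily penalized, which is exactly how censoring informs the fit.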
Publication
Journal: American Journal of Clinical Nutrition
May/19/1999
Abstract
For adults, the 5-microg (200 IU) vitamin D recommended dietary allowance may prevent osteomalacia in the absence of sunlight, but more is needed to help prevent osteoporosis and secondary hyperparathyroidism. Other benefits of vitamin D supplementation are implicated epidemiologically: prevention of some cancers, osteoarthritis progression, multiple sclerosis, and hypertension. Total-body sun exposure easily provides the equivalent of 250 microg (10000 IU) vitamin D/d, suggesting that this is a physiologic limit. Sailors in US submarines are deprived of environmentally acquired vitamin D equivalent to 20-50 microg (800-2000 IU)/d. The assembled data from many vitamin D supplementation studies reveal a curve for vitamin D dose versus serum 25-hydroxyvitamin D [25(OH)D] response that is surprisingly flat up to 250 microg (10000 IU) vitamin D/d. To ensure that serum 25(OH)D concentrations exceed 100 nmol/L, a total vitamin D supply of 100 microg (4000 IU)/d is required. Except in those with conditions causing hypersensitivity, there is no evidence of adverse effects with serum 25(OH)D concentrations <140 nmol/L, which require a total vitamin D supply of 250 microg (10000 IU)/d to attain. Published cases of vitamin D toxicity with hypercalcemia, for which the 25(OH)D concentration and vitamin D dose are known, all involve intake of >= 1000 microg (40000 IU)/d. Because vitamin D is potentially toxic, intake of >25 microg (1000 IU)/d has been avoided even though the weight of evidence shows that the currently accepted, no observed adverse effect limit of 50 microg (2000 IU)/d is too low by at least 5-fold.
Publication
Journal: New England Journal of Medicine
July/28/2010
Abstract
BACKGROUND
We investigated whether intensive glycemic control, combination therapy for dyslipidemia, and intensive blood-pressure control would limit the progression of diabetic retinopathy in persons with type 2 diabetes. Previous data suggest that these systemic factors may be important in the development and progression of diabetic retinopathy.
METHODS
In a randomized trial, we enrolled 10,251 participants with type 2 diabetes who were at high risk for cardiovascular disease to receive either intensive or standard treatment for glycemia (target glycated hemoglobin level, <6.0% or 7.0 to 7.9%, respectively) and also for dyslipidemia (160 mg daily of fenofibrate plus simvastatin or placebo plus simvastatin) or for systolic blood-pressure control (target, <120 or <140 mm Hg). A subgroup of 2856 participants was evaluated for the effects of these interventions at 4 years on the progression of diabetic retinopathy by 3 or more steps on the Early Treatment Diabetic Retinopathy Study Severity Scale (as assessed from seven-field stereoscopic fundus photographs, with 17 possible steps and a higher number of steps indicating greater severity) or the development of diabetic retinopathy necessitating laser photocoagulation or vitrectomy.
RESULTS
At 4 years, the rates of progression of diabetic retinopathy were 7.3% with intensive glycemia treatment, versus 10.4% with standard therapy (adjusted odds ratio, 0.67; 95% confidence interval [CI], 0.51 to 0.87; P=0.003); 6.5% with fenofibrate for intensive dyslipidemia therapy, versus 10.2% with placebo (adjusted odds ratio, 0.60; 95% CI, 0.42 to 0.87; P=0.006); and 10.4% with intensive blood-pressure therapy, versus 8.8% with standard therapy (adjusted odds ratio, 1.23; 95% CI, 0.84 to 1.79; P=0.29).
CONCLUSIONS
Intensive glycemic control and intensive combination treatment of dyslipidemia, but not intensive blood-pressure control, reduced the rate of progression of diabetic retinopathy. (Funded by the National Heart, Lung, and Blood Institute and others; ClinicalTrials.gov numbers, NCT00000620 for the ACCORD study and NCT00542178 for the ACCORD Eye study.)
Publication
Journal: Cell
November/8/2016
Abstract
Cerebral organoids, three-dimensional cultures that model organogenesis, provide a new platform to investigate human brain development. High cost, variability, and tissue heterogeneity limit their broad applications. Here, we developed a miniaturized spinning bioreactor (SpinΩ) to generate forebrain-specific organoids from human iPSCs. These organoids recapitulate key features of human cortical development, including progenitor zone organization, neurogenesis, gene expression, and, notably, a distinct human-specific outer radial glia cell layer. We also developed protocols for midbrain and hypothalamic organoids. Finally, we employed the forebrain organoid platform to model Zika virus (ZIKV) exposure. Quantitative analyses revealed preferential, productive infection of neural progenitors with either African or Asian ZIKV strains. ZIKV infection leads to increased cell death and reduced proliferation, resulting in decreased neuronal cell-layer volume resembling microcephaly. Together, our brain-region-specific organoids and SpinΩ provide an accessible and versatile platform for modeling human brain development and disease and for compound testing, including potential ZIKV antiviral drugs.
Publication
Journal: Neuron
March/23/2015
Abstract
The blood-brain barrier (BBB) limits the entry of blood-derived products, pathogens, and cells into the brain, a restriction that is essential for normal neuronal functioning and information processing. Post-mortem tissue analysis indicates BBB damage in Alzheimer's disease (AD). The timing of BBB breakdown, however, remains elusive. Using an advanced dynamic contrast-enhanced MRI protocol with high spatial and temporal resolutions to quantify regional BBB permeability in the living human brain, we show an age-dependent BBB breakdown in the hippocampus, a region critical for learning and memory that is affected early in AD. The BBB breakdown in the hippocampus and its CA1 and dentate gyrus subdivisions worsened with mild cognitive impairment and correlated with injury to BBB-associated pericytes, as shown by cerebrospinal fluid analysis. Our data suggest that BBB breakdown is an early event in the aging human brain that begins in the hippocampus and may contribute to cognitive impairment.
Publication
Journal: Journal of Experimental Psychology: General
March/10/1985
Abstract
Theories of visual attention deal with the limit on our ability to see (and later report) several things at once. These theories fall into three broad classes. Object-based theories propose a limit on the number of separate objects that can be perceived simultaneously. Discrimination-based theories propose a limit on the number of separate discriminations that can be made. Space-based theories propose a limit on the spatial area from which information can be taken up. To distinguish these views, the present experiments used small (less than 1 degree), brief, foveal displays, each consisting of two overlapping objects (a box with a line struck through it). It was found that two judgments that concern the same object can be made simultaneously without loss of accuracy, whereas two judgments that concern different objects cannot. Neither the similarity nor the difficulty of required discriminations, nor the spatial distribution of information, could account for the results. The experiments support a view in which parallel, preattentive processes serve to segment the field into separate objects, followed by a process of focal attention that deals with only one object at a time. This view is also able to account for results taken to support both discrimination-based and space-based theories.
Publication
Journal: Circulation Research
March/24/2012
Abstract
Myocardial necrosis triggers an inflammatory reaction that clears the wound of dead cells and matrix debris, while activating reparative pathways necessary for scar formation. A growing body of evidence suggests that accentuation, prolongation, or expansion of the postinfarction inflammatory response results in worse remodeling and dysfunction following myocardial infarction. This review discusses the cellular effectors and endogenous molecular signals implicated in suppression and containment of the inflammatory response in the infarcted heart. Clearance of apoptotic neutrophils, recruitment of inhibitory monocyte subsets and regulatory T cells, macrophage differentiation, and pericyte/endothelial interactions may play an active role in restraining postinfarction inflammation. Multiple molecular signals may be involved in suppressing the inflammatory cascade. Negative regulation of toll-like receptor signaling, downmodulation of cytokine responses, and termination of chemokine signals may be mediated through the concerted action of multiple suppressive pathways that prevent extension of injury and protect from adverse remodeling. Expression of soluble endogenous antagonists, decoy receptors, and posttranslational processing of bioactive molecules may limit cytokine and chemokine actions. Interleukin-10, members of the transforming growth factor-β family, and proresolving lipid mediators (such as lipoxins, resolvins, and protectins) may suppress proinflammatory signaling. In human patients with myocardial infarction, defective suppression and impaired resolution of inflammation may be important mechanisms in the pathogenesis of remodeling and in progression to heart failure. Understanding of inhibitory and proresolving signals in the infarcted heart and identification of patients with uncontrolled postinfarction inflammation and defective cardiac repair is needed to design novel therapeutic strategies.
Publication
Journal: Experimental Hematology
July/9/2002
Abstract
The recent flood of reports using real-time Q-PCR testifies to the transformation of this technology from an experimental tool into the scientific mainstream. Applications of real-time Q-PCR include measuring mRNA expression levels, DNA copy number, transgene copy number and expression, allelic discrimination, and viral titers. The range of applications of real-time Q-PCR is immense and has been fueled in part by the proliferation of lower-cost instrumentation and reagents. Successful application of real-time Q-PCR is not trivial, however, and this review will help guide the reader through the variables that can limit its usefulness. Careful consideration of assay design, template preparation, and analytical methods is essential for accurate gene quantification.
Publication
Journal: Nature Reviews Immunology
June/8/2017
Abstract
Immunogenicity depends on two key factors: antigenicity and adjuvanticity. The presence of exogenous or mutated antigens explains why infected cells and malignant cells can initiate an adaptive immune response provided that the cells also emit adjuvant signals as a consequence of cellular stress and death. Several infectious pathogens have devised strategies to control cell death and limit the emission of danger signals from dying cells, thereby avoiding immune recognition. Similarly, cancer cells often escape immunosurveillance owing to defects in the molecular machinery that underlies the release of endogenous adjuvants. Here, we review current knowledge on the mechanisms that underlie the activation of immune responses against dying cells and their pathophysiological relevance.
Publication
Journal: Globalization and Health
July/7/2020
Abstract
Background: The COVID-19 pandemic has had a significant impact on public mental health. Monitoring population mental health during a crisis such as a pandemic is therefore an immediate priority. The aim of this study is to analyze existing research and findings on the prevalence of stress, anxiety, and depression in the general population during the COVID-19 pandemic.
Method: In this systematic review and meta-analysis, articles that have focused on stress and anxiety prevalence among the general population during the COVID-19 pandemic were searched in the Science Direct, Embase, Scopus, PubMed, Web of Science (ISI) and Google Scholar databases, without a lower time limit and until May 2020. In order to perform a meta-analysis of the collected studies, the random effects model was used, and the heterogeneity of studies was investigated using the I2 index. Moreover, data analysis was conducted using the Comprehensive Meta-Analysis (CMA) software.
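The I2 index used in the methods above is conventionally derived from Cochran's Q statistic as I2 = max(0, (Q - df)/Q) x 100%. A minimal sketch (function name illustrative; real meta-analysis software adds confidence intervals and transformations):

```python
def i_squared(estimates, variances):
    """Cochran's Q and the I^2 heterogeneity index (in percent)
    for study estimates with known within-study variances."""
    weights = [1.0 / v for v in variances]
    # inverse-variance pooled (fixed-effect) estimate
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, estimates))
    df = len(estimates) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
```

Identical study estimates give I2 = 0%, while estimates that differ far beyond their sampling error push I2 toward 100%, flagging heterogeneity that motivates the random effects model.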
Results: The prevalence of stress, in 5 studies with a total sample size of 9074, was 29.6% (95% confidence interval: 24.3-35.4); the prevalence of anxiety, in 17 studies with a sample size of 63,439, was 31.9% (95% confidence interval: 27.5-36.7); and the prevalence of depression, in 14 studies with a sample size of 44,531, was 33.7% (95% confidence interval: 27.5-40.6).
Conclusion: COVID-19 not only causes physical health concerns but also results in a number of psychological disorders. The spread of the new coronavirus can impact the mental health of people in different communities. Thus, it is essential to preserve the mental health of individuals and to develop psychological interventions that can improve the mental health of vulnerable groups during the COVID-19 pandemic.
Keywords: Anxiety; COVID-19; Coronavirus; Depression; General population; Meta-analysis; Prevalence; Stress; Systematic review.
Publication
Journal: Neuron
August/10/2009
Abstract
Examining the behavioral consequences of selective CNS neuronal activation is a powerful tool for elucidating mammalian brain function in health and disease. Newly developed genetic, pharmacological, and optical tools allow activation of neurons with exquisite spatiotemporal resolution; however, the inaccessibility to light of widely distributed neuronal populations and the invasiveness required for activation by light or infused ligands limit the utility of these methods. To overcome these barriers, we created transgenic mice expressing an evolved G protein-coupled receptor (hM3Dq) selectively activated by the pharmacologically inert, orally bioavailable drug clozapine-N-oxide (CNO). Here, we expressed hM3Dq in forebrain principal neurons. Local field potential and single-neuron recordings revealed that peripheral administration of CNO activated hippocampal neurons selectively in hM3Dq-expressing mice. Behavioral correlates of neuronal activation included increased locomotion, stereotypy, and limbic seizures. These results demonstrate a powerful chemical-genetic tool for remotely controlling the activity of discrete populations of neurons in vivo.
Publication
Journal: Biochemistry
November/17/1998
Abstract
One of the ubiquitous features of membrane proteins is the preference of tryptophan and tyrosine residues for membrane surfaces that presumably arises from enhanced stability due to distinct interfacial interactions. The physical basis for this preference is widely believed to arise from amphipathic interactions related to imino group hydrogen bonding and/or dipole interactions. We have examined these and other possibilities for tryptophan's interfacial preference by using 1H magic angle spinning (MAS) chemical shift measurements, two-dimensional (2D) nuclear Overhauser effect spectroscopy (2D-NOESY) 1H MAS NMR, and solid state 2H NMR to study the interactions of four tryptophan analogues with phosphatidylcholine membranes. We find that the analogues reside in the vicinity of the glycerol group where they all cause similar modest changes in acyl chain organization and that hydrocarbon penetration was not increased by reduction of hydrogen bonding or electric dipole interaction ability. These observations rule out simple amphipathic or dipolar interactions as the physical basis for the interfacial preference. More likely, the preference is dominated by tryptophan's flat rigid shape that limits access to the hydrocarbon core and its pi electronic structure and associated quadrupolar moment (aromaticity) that favor residing in the electrostatically complex interface environment.
Publication
Journal: Endocrine Reviews
June/9/2004
Abstract
During the past decade, possible advancement in timing of puberty has been reported in the United States. In addition, early pubertal development and an increased incidence of sexual precocity have been noticed in children, primarily girls, migrating for foreign adoption in several Western European countries. These observations are raising the issues of current differences and secular trends in timing of puberty in relation to ethnic, geographical, and socioeconomic background. None of these factors provide an unequivocal explanation for the earlier onset of puberty seen in the United States. In the formerly deprived migrating children, refeeding and catch-up growth may prime maturation. However, precocious puberty is seen also in some nondeprived migrating children. Attention has been paid to the changing milieu after migration, and recently, the possible role of endocrine-disrupting chemicals from the environment has been considered. These observations urge further study of the onset of puberty as a possible sensitive and early marker of the interactions between environmental conditions and genetic susceptibility that can influence physiological and pathological processes.
Publication
Journal: Nature Reviews Genetics
March/12/2014
Abstract
Sequencing technologies have placed a wide range of genomic analyses within the capabilities of many laboratories. However, sequencing costs often set limits to the amount of sequences that can be generated and, consequently, the biological outcomes that can be achieved from an experimental design. In this Review, we discuss the issue of sequencing depth in the design of next-generation sequencing experiments. We review current guidelines and precedents on the issue of coverage, as well as their underlying considerations, for four major study designs, which include de novo genome sequencing, genome resequencing, transcriptome sequencing and genomic location analyses (for example, chromatin immunoprecipitation followed by sequencing (ChIP-seq) and chromosome conformation capture (3C)).
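One standard starting point for coverage calculations of the kind this review surveys (the classic Lander-Waterman model, not something specific to the review) is that mean per-base coverage is c = L*N/G for N reads of length L on a genome of size G, and that under uniform random read placement the expected fraction of bases covered at least once is 1 - e^(-c):

```python
import math

def expected_coverage(read_length, num_reads, genome_size):
    """Mean per-base coverage c = L * N / G."""
    return read_length * num_reads / genome_size

def fraction_covered(c):
    """Expected fraction of bases sequenced at least once, assuming
    uniformly random read starts (Poisson approximation)."""
    return 1.0 - math.exp(-c)
```

For example, 900 million 100-bp reads on a 3-Gb genome give 30x mean coverage, and even 1x mean coverage leaves roughly a third of bases unsampled, which is why depth targets depend so strongly on the study design.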
Publication
Journal: International Review of Cell and Molecular Biology
April/9/2013
Abstract
Disorders characterized by ischemia/reperfusion (I/R), such as myocardial infarction, stroke, and peripheral vascular disease, continue to be among the most frequent causes of debilitating disease and death. Tissue injury and/or death occur as a result of the initial ischemic insult, which is determined primarily by the magnitude and duration of the interruption in the blood supply, and then subsequent damage induced by reperfusion. During prolonged ischemia, ATP levels and intracellular pH decrease as a result of anaerobic metabolism and lactate accumulation. As a consequence, ATPase-dependent ion transport mechanisms become dysfunctional, contributing to increased intracellular and mitochondrial calcium levels (calcium overload), cell swelling and rupture, and cell death by necrotic, necroptotic, apoptotic, and autophagic mechanisms. Although oxygen levels are restored upon reperfusion, a surge in the generation of reactive oxygen species occurs and proinflammatory neutrophils infiltrate ischemic tissues to exacerbate ischemic injury. The pathologic events induced by I/R orchestrate the opening of the mitochondrial permeability transition pore, which appears to represent a common end-effector of the pathologic events initiated by I/R. The aim of this treatise is to provide a comprehensive review of the mechanisms underlying the development of I/R injury, from which it should be apparent that a combination of molecular and cellular approaches targeting multiple pathologic processes to limit the extent of I/R injury must be adopted to enhance resistance to cell death and increase regenerative capacity in order to effect long-lasting repair of ischemic tissues.
Publication
Journal: Journal of Clinical Microbiology
March/2/2010
Abstract
Current nucleic acid amplification methods to detect Mycobacterium tuberculosis are complex, labor-intensive, and technically challenging. We developed and performed the first analysis of the Cepheid GeneXpert System's MTB/RIF assay, an integrated hands-free sputum-processing and real-time PCR system with rapid on-demand, near-patient technology, to simultaneously detect M. tuberculosis and rifampin resistance. Analytic tests of M. tuberculosis DNA demonstrated a limit of detection (LOD) of 4.5 genomes per reaction. Studies using sputum spiked with known numbers of M. tuberculosis CFU predicted a clinical LOD of 131 CFU/ml. Killing studies showed that the assay's buffer decreased M. tuberculosis viability by at least 8 logs, substantially reducing biohazards. Tests of 23 different commonly occurring rifampin resistance mutations demonstrated that all 23 (100%) would be identified as rifampin resistant. An analysis of 20 nontuberculosis mycobacteria species confirmed high assay specificity. A small clinical validation study of 107 clinical sputum samples from suspected tuberculosis cases in Vietnam detected 29/29 (100%) smear-positive culture-positive cases and 33/39 (84.6%) or 38/53 (71.7%) smear-negative culture-positive cases, as determined by growth on solid medium or on both solid and liquid media, respectively. M. tuberculosis was not detected in 25/25 (100%) of the culture-negative samples. A study of 64 smear-positive culture-positive sputa from retreatment tuberculosis cases in Uganda detected 63/64 (98.4%) culture-positive cases and 9/9 (100%) cases of rifampin resistance. Rifampin resistance was excluded in 54/55 (98.2%) susceptible cases. Specificity rose to 100% after correcting for a conventional susceptibility test error. In conclusion, this highly sensitive and simple-to-use system can detect M. tuberculosis directly from sputum in less than 2 h.
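The reported sensitivities and specificities follow directly from the stated counts; a small helper reproduces the percentages quoted above (variable names are illustrative):

```python
def percent(numerator, denominator):
    """Proportion expressed as a percentage, rounded to one decimal."""
    return round(100.0 * numerator / denominator, 1)

# Counts reported for the Vietnam and Uganda validation studies
smear_pos_sensitivity = percent(29, 29)        # smear-positive, solid medium
smear_neg_solid = percent(33, 39)              # smear-negative, solid medium
smear_neg_both_media = percent(38, 53)         # smear-negative, both media
uganda_sensitivity = percent(63, 64)           # retreatment cases
rif_susceptible_specificity = percent(54, 55)  # rifampin-susceptible cases
```

Each value matches the percentage quoted in the abstract (100%, 84.6%, 71.7%, 98.4%, and 98.2%, respectively).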
Publication
Journal: Nature
September/13/2000
Abstract
Sample damage by X-rays and other radiation limits the resolution of structural studies on non-repetitive and non-reproducible structures such as individual biomolecules or cells. Cooling can slow sample deterioration, but cannot eliminate damage-induced sample movement during the time needed for conventional measurements. Analyses of the dynamics of damage formation suggest that the conventional damage barrier (about 200 X-ray photons per Å2 with X-rays of 12 keV energy or 1 Å wavelength) may be extended at very high dose rates and very short exposure times. Here we have used computer simulations to investigate the structural information that can be recovered from the scattering of intense femtosecond X-ray pulses by single protein molecules and small assemblies. Estimations of radiation damage as a function of photon energy, pulse length, integrated pulse intensity and sample size show that experiments using very high X-ray dose rates and ultrashort exposures may provide useful structural information before radiation damage destroys the sample. We predict that such ultrashort, high-intensity X-ray pulses from free-electron lasers that are currently under development, in combination with container-free sample handling methods based on spraying techniques, will provide a new approach to structural determinations with X-rays.
Publication
Journal: Nature Reviews Cancer
February/10/2009
Abstract
Oncogene-induced cellular senescence constitutes a strong anti-proliferative response, which can be set in motion following either oncogene activation or loss of tumour suppressor signalling. It serves to limit the expansion of early neoplastic cells and as such is a potent cancer-protective response to oncogenic events. Recently emerging evidence points to a crucial role in oncogene-induced cellular senescence for the 'senescence-messaging secretome' or SMS, setting the stage for cross-talk between senescent cells and their environment. How are such signals integrated into a coordinated response and what are the implications of this unexpected finding?
Publication
Journal: Journal of Cell Biology
August/26/2010
Abstract
For centuries, cell biology has been based on light microscopy and at the same time been limited by its optical resolution. However, several new technologies have been developed recently that bypass this limit. These new super-resolution technologies are either based on tailored illumination, nonlinear fluorophore responses, or the precise localization of single molecules. Overall, these new approaches have created unprecedented new possibilities to investigate the structure and function of cells.
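The optical resolution limit referred to above is conventionally the Abbe diffraction limit, d = lambda / (2 * NA), which super-resolution techniques bypass. A one-line sketch (example wavelength and numerical aperture are illustrative):

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Abbe lateral diffraction limit d = lambda / (2 * NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)
```

For green light (about 520 nm) and a high-NA oil-immersion objective (NA 1.4), the limit works out to roughly 186 nm, well above the size of many subcellular structures, which is the gap the new technologies close.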
Publication
Journal: Cytometry
March/30/1993
Abstract
A combination of fluorescent rRNA-targeted oligonucleotide probes ("phylogenetic stains") and flow cytometry was used for a high resolution automated analysis of mixed microbial populations. Fixed cells of bacteria and yeasts were hybridized in suspension with fluorescein- or tetramethylrhodamine-labeled oligonucleotide probes complementary to group-specific regions of the 16S ribosomal RNA (rRNA) molecules. Quantifying probe-conferred cell fluorescence by flow cytometry, we could discriminate between target and nontarget cell populations. We critically examined changes of the hybridization conditions, kinetics of the hybridization, and posthybridization treatments. Intermediate probe concentrations, addition of detergent to the hybridization buffer, and a posthybridization washing step were found to increase the signal to noise ratio. We could demonstrate a linear correlation between growth rate and probe-conferred fluorescence of Escherichia coli and Pseudomonas cepacia cells. Oligonucleotides labeled with multiple fluorochromes showed elevated levels of nonspecific binding and therefore could not be used to lower the detection limits, which still restrict studies with fluorescing rRNA-targeted oligonucleotide probes to well-growing microbial cells. Two probes of different specificities--one labeled with fluorescein, the other with tetramethylrhodamine--could be applied simultaneously for dual color analysis.
Publication
Journal: Journal of Biomedical Informatics
August/11/2002
Abstract
Narrative reports in medical records contain a wealth of information that may augment structured data for managing patient information and predicting trends in diseases. Pertinent negatives are evident in text but are not usually indexed in structured databases. The objective of the study reported here was to test a simple algorithm for determining whether a finding or disease mentioned within narrative medical reports is present or absent. We developed a simple regular expression algorithm called NegEx that implements several phrases indicating negation, filters out sentences containing phrases that falsely appear to be negation phrases, and limits the scope of the negation phrases. We compared NegEx against a baseline algorithm that has a limited set of negation phrases and a simpler notion of scope. In a test of 1235 findings and diseases in 1000 sentences taken from discharge summaries indexed by physicians, NegEx had a specificity of 94.5% (versus 85.3% for the baseline), a positive predictive value of 84.5% (versus 68.4% for the baseline) while maintaining a reasonable sensitivity of 77.8% (versus 88.3% for the baseline). We conclude that with little implementation effort a simple regular expression algorithm for determining whether a finding or disease is absent can identify a large portion of the pertinent negatives from discharge summaries.
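The core NegEx idea, regular-expression negation triggers combined with a bounded scope and a filter for phrases that only look like negations, can be sketched as follows. The trigger lists and scope rule here are toy illustrations, not the published implementation:

```python
import re

# Illustrative trigger lists; the published NegEx lists are far longer.
NEGATION_TRIGGERS = r"\b(no|denies|without|absence of)\b"
PSEUDO_NEGATIONS = r"\b(no increase|not certain if)\b"

def is_negated(sentence, finding, scope=5):
    """Return True if `finding` appears within `scope` words after a
    negation trigger, ignoring pseudo-negation phrases."""
    s = sentence.lower()
    # Filter sentences whose "negation" is actually a pseudo-negation.
    if re.search(PSEUDO_NEGATIONS, s):
        return False
    for match in re.finditer(NEGATION_TRIGGERS, s):
        # Limit the trigger's scope to the next few words.
        window = s[match.end():].split()[:scope]
        if finding.lower() in " ".join(window):
            return True
    return False
```

For example, "The patient denies chest pain." negates "chest pain", while "There was no increase in chest pain." does not, because "no increase" is filtered as a pseudo-negation.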
Publication
Journal: Environmental Health Perspectives
August/2/2000
Abstract
Misclassification of exposure is a well-recognized inherent limitation of epidemiologic studies of disease and the environment. For many agents of interest, exposures take place over time and in multiple locations; accurately estimating the relevant exposures for an individual participant in epidemiologic studies is often daunting, particularly within the limits set by feasibility, participant burden, and cost. Researchers have taken steps to deal with the consequences of measurement error by limiting the degree of error through a study's design, estimating the degree of error using a nested validation study, and by adjusting for measurement error in statistical analyses. In this paper, we address measurement error in observational studies of air pollution and health. Because measurement error may have substantial implications for interpreting epidemiologic studies on air pollution, particularly the time-series analyses, we developed a systematic conceptual formulation of the problem of measurement error in epidemiologic studies of air pollution and then considered the consequences within this formulation. When possible, we used available relevant data to make simple estimates of measurement error effects. This paper provides an overview of measurement errors in linear regression, distinguishing two extremes of a continuum (Berkson versus classical errors) and the univariate from the multivariate predictor case. We then propose one conceptual framework for the evaluation of measurement errors in the log-linear regression used for time-series studies of particulate air pollution and mortality and identify three main components of error. We present new simple analyses of data on exposures of particulate matter < 10 microm in aerodynamic diameter from the Particle Total Exposure Assessment Methodology Study. Finally, we summarize open questions regarding measurement error and suggest the kind of additional data necessary to address them.
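The Berkson/classical distinction drawn above can be illustrated with a small simulation under an assumed linear model with Gaussian errors (not data from the study): classical error, where the measurement adds noise to the true exposure, attenuates the estimated slope, whereas Berkson error, where the true exposure scatters around an assigned value, leaves it approximately unbiased:

```python
import random

def slope(xs, ys):
    """OLS slope of y regressed on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

random.seed(0)
n, beta = 20000, 2.0

# Classical error: observed exposure = true exposure + noise.
x_true = [random.gauss(0, 1) for _ in range(n)]
y = [beta * x + random.gauss(0, 0.5) for x in x_true]
x_obs = [x + random.gauss(0, 1) for x in x_true]
b_classical = slope(x_obs, y)  # attenuated toward beta/2 here

# Berkson error: true exposure = assigned exposure + noise.
x_assigned = [random.gauss(0, 1) for _ in range(n)]
x_true_b = [x + random.gauss(0, 1) for x in x_assigned]
y_b = [beta * x + random.gauss(0, 0.5) for x in x_true_b]
b_berkson = slope(x_assigned, y_b)  # approximately unbiased
```

With equal true-exposure and error variances, the classical-error slope is attenuated by the factor var(x)/(var(x) + var(e)) = 0.5, while the Berkson-error slope stays near the true value, which is why the two error types have such different consequences for time-series effect estimates.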