Publication
Journal: Cell
July/12/2010
Abstract
Recent structural studies of receptor tyrosine kinases (RTKs) have revealed unexpected diversity in the mechanisms of their activation by growth factor ligands. Strategies for inducing dimerization by ligand binding are surprisingly diverse, as are mechanisms that couple this event to activation of the intracellular tyrosine kinase domains. As our understanding of these details becomes increasingly sophisticated, it provides an important context for therapeutically countering the effects of pathogenic RTK mutations in cancer and other diseases. Much remains to be learned, however, about the complex signaling networks downstream from RTKs and how alterations in these networks are translated into cellular responses.
Publication
Journal: Nucleic Acids Research
May/20/1981
Abstract
This paper presents a new computer method for folding an RNA molecule that finds a conformation of minimum free energy using published values of stacking and destabilizing energies. It is based on a dynamic programming algorithm from applied mathematics; it is faster and much more efficient than procedures that have previously appeared in the biological literature, and it can fold larger molecules. Its power is demonstrated in the folding of a 459 nucleotide immunoglobulin gamma 1 heavy chain messenger RNA fragment. We go beyond the basic method to show how to incorporate additional information into the algorithm. This includes data on chemical reactivity and enzyme susceptibility. We illustrate this with the folding of two large fragments from the 16S ribosomal RNA of Escherichia coli.
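As an illustrative sketch of the dynamic programming idea behind such folding methods (not this paper's thermodynamic model), the Nussinov-style recursion below maximizes the number of complementary base pairs over all nested secondary structures; energy-minimization programs replace the pair count with stacking and destabilizing energies.

```python
def max_base_pairs(seq, min_loop=3):
    """Nussinov-style dynamic program: maximum number of nested
    complementary base pairs in an RNA sequence.

    A simplified stand-in for free-energy minimization: the table is
    filled over ever-longer subsequences, and dp[i][j] records the best
    score for seq[i..j].  min_loop enforces a minimum hairpin-loop size.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for length in range(min_loop + 1, n):          # span j - i
        for i in range(n - length):
            j = i + length
            best = dp[i + 1][j]                    # base i left unpaired
            for k in range(i + min_loop + 1, j + 1):
                if (seq[i], seq[k]) in pairs:      # base i pairs with k
                    left = dp[i + 1][k - 1]
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + left + right)
            dp[i][j] = best
    return dp[0][n - 1] if n else 0
```

The table fill costs O(n^3) time and O(n^2) memory, which is the efficiency class that made folding of sequences hundreds of nucleotides long practical.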
Publication
Journal: Journal of Chemical Physics
July/6/2010
Abstract
The method of dispersion correction as an add-on to standard Kohn-Sham density functional theory (DFT-D) has been refined regarding higher accuracy, broader range of applicability, and less empiricism. The main new ingredients are atom-pairwise specific dispersion coefficients and cutoff radii that are both computed from first principles. The coefficients for new eighth-order dispersion terms are computed using established recursion relations. System (geometry) dependent information is used for the first time in a DFT-D type approach by employing the new concept of fractional coordination numbers (CN). They are used to interpolate between dispersion coefficients of atoms in different chemical environments. The method only requires adjustment of two global parameters for each density functional, is asymptotically exact for a gas of weakly interacting neutral atoms, and easily allows the computation of atomic forces. Three-body nonadditivity terms are considered. The method has been assessed on standard benchmark sets for inter- and intramolecular noncovalent interactions with a particular emphasis on a consistent description of light and heavy element systems. The mean absolute deviations for the S22 benchmark set of noncovalent interactions for 11 standard density functionals decrease by 15%-40% compared to the previous (already accurate) DFT-D version. Spectacular improvements are found for a tripeptide-folding model and all tested metallic systems. The rectification of the long-range behavior and the use of more accurate C6 coefficients also lead to a much better description of large (infinite) systems as shown for graphene sheets and the adsorption of benzene on an Ag(111) surface. For graphene it is found that the inclusion of three-body terms substantially (by about 10%) weakens the interlayer binding.
We propose the revised DFT-D method as a general tool for the computation of the dispersion energy in molecules and solids of any kind with DFT and related (low-cost) electronic structure methods for large systems.
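For orientation, the two-body part of a DFT-D-type correction is a damped pairwise sum over atoms. The sketch below uses a Becke-Johnson-style damping function with purely illustrative coefficients and parameters; the first-principles C6/C8 values and the per-functional fitted s8, a1, a2 of the published method are not reproduced here.

```python
import math

def dftd_two_body(coords, c6, c8, s6=1.0, s8=1.0, a1=0.4, a2=5.0):
    """Generic two-body DFT-D-style dispersion energy with
    Becke-Johnson-type rational damping:

        E = -sum_{A<B} [ s6*C6_AB/(R^6 + f^6) + s8*C8_AB/(R^8 + f^8) ]
        f = a1 * sqrt(C8_AB / C6_AB) + a2

    c6/c8 map ordered index pairs (i, j) with i < j to coefficients.
    All numbers here are illustrative, not the published parameters.
    """
    energy = 0.0
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])
            f = a1 * math.sqrt(c8[(i, j)] / c6[(i, j)]) + a2
            energy -= s6 * c6[(i, j)] / (r**6 + f**6)
            energy -= s8 * c8[(i, j)] / (r**8 + f**8)
    return energy
```

The rational damping keeps the correction finite at short range while recovering the exact -C6/R^6 (plus -C8/R^8) tail at long range, which is the asymptotic behavior the abstract emphasizes.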
Publication
Journal: Journal of Computational Biology
November/12/2000
Abstract
For aligning DNA sequences that differ only by sequencing errors, or by equivalent errors from other sources, a greedy algorithm can be much faster than traditional dynamic programming approaches and yet produce an alignment that is guaranteed to be theoretically optimal. We introduce a new greedy alignment algorithm with particularly good performance and show that it computes the same alignment as does a certain dynamic programming algorithm, while executing over 10 times faster on appropriate data. An implementation of this algorithm is currently used in a program that assembles the UniGene database at the National Center for Biotechnology Information.
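As a sketch of the underlying idea (not the paper's X-drop algorithm), the Landau-Vishkin-style routine below computes edit distance by greedy extension along diagonals: work is proportional to the number of differences d rather than to the product of the sequence lengths, which is exactly why such methods excel when sequences differ only by sequencing errors.

```python
def greedy_distance(a, b, max_d=None):
    """Edit distance by greedy extension along diagonals.

    fr[k] holds the furthest row i reached on diagonal k (column
    j = i + k) with the current number of edits; each round allows one
    more substitution, insertion, or deletion, then slides for free
    along runs of matching characters.  Cost is O((len(a)+len(b)) * d),
    so it is fast precisely when the sequences are nearly identical.
    """
    n, m = len(a), len(b)
    if max_d is None:
        max_d = n + m
    i = 0
    while i < n and i < m and a[i] == b[i]:
        i += 1                           # free slide on diagonal 0
    if i >= n and i >= m:
        return 0
    fr = {0: i}
    for d in range(1, max_d + 1):
        new_fr = {}
        for k in range(-d, d + 1):
            i = max(
                fr.get(k, -1) + 1,       # substitution on this diagonal
                fr.get(k - 1, -1),       # insertion (gap in a)
                fr.get(k + 1, -1) + 1,   # deletion (gap in b)
            )
            j = i + k
            while i < n and j < m and a[i] == b[j]:
                i += 1                   # slide along matching run
                j += 1
            new_fr[k] = i
            if i >= n and j >= m:
                return d                 # reached the end of both
        fr = new_fr
    return None                          # distance exceeds max_d
```

The paper's contribution is showing that a greedy procedure of this family provably returns the same alignment as a corresponding dynamic programming formulation, while touching only a narrow band of the matrix in practice.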
Publication
Journal: Science
April/5/1995
Abstract
In multicellular organisms, homeostasis is maintained through a balance between cell proliferation and cell death. Although much is known about the control of cell proliferation, less is known about the control of cell death. Physiologic cell death occurs primarily through an evolutionarily conserved form of cell suicide termed apoptosis. The decision of a cell to undergo apoptosis can be influenced by a wide variety of regulatory stimuli. Recent evidence suggests that alterations in cell survival contribute to the pathogenesis of a number of human diseases, including cancer, viral infections, autoimmune diseases, neurodegenerative disorders, and AIDS (acquired immunodeficiency syndrome). Treatments designed to specifically alter the apoptotic threshold may have the potential to change the natural progression of some of these diseases.
Publication
Journal: Bioinformatics
August/4/2013
Abstract
SUMMARY
CD-HIT is a widely used program for clustering biological sequences to reduce sequence redundancy and improve the performance of other sequence analyses. In response to the rapid increase in the amount of sequencing data produced by the next-generation sequencing technologies, we have developed a new CD-HIT program accelerated with a novel parallelization strategy and some other techniques to allow efficient clustering of such datasets. Our tests demonstrated very good speedup derived from the parallelization for up to ∼24 cores and a quasi-linear speedup for up to ∼8 cores. The enhanced CD-HIT is capable of handling very large datasets in much shorter time than previous versions.
AVAILABILITY
http://cd-hit.org.
CONTACT
liwz@sdsc.edu
SUPPLEMENTARY INFORMATION
Supplementary data are available at Bioinformatics online.
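The core of CD-HIT-style clustering is a greedy incremental scheme, sketched below with a deliberately naive identity measure: sequences are processed longest-first, and each either joins the first cluster whose representative it matches at or above the threshold or founds a new cluster. The real program avoids most pairwise comparisons with short-word counting filters, and the accelerated version parallelizes this loop; neither refinement is shown here.

```python
def greedy_cluster(seqs, threshold=0.9):
    """Greedy incremental clustering in the CD-HIT style.

    Returns a list of (representative, members) pairs.  The identity
    measure is a naive position-by-position comparison of prefixes,
    normalized by the shorter length -- a deliberate simplification of
    the alignment-based identity real clustering tools compute.
    """
    def identity(a, b):
        matches = sum(x == y for x, y in zip(a, b))
        return matches / min(len(a), len(b))

    clusters = []                        # list of (representative, members)
    for seq in sorted(seqs, key=len, reverse=True):
        for rep, members in clusters:
            if identity(seq, rep) >= threshold:
                members.append(seq)      # join an existing cluster
                break
        else:
            clusters.append((seq, [seq]))  # found a new cluster
    return clusters
```

Because each sequence is compared only against cluster representatives (not against every other sequence), the redundancy of the input directly determines how much work is saved.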
Publication
Journal: Genome Research
February/11/1997
Abstract
We have developed a novel "real time" quantitative PCR method. The method measures PCR product accumulation through a dual-labeled fluorogenic probe (i.e., TaqMan Probe). This method provides very accurate and reproducible quantitation of gene copies. Unlike other quantitative PCR methods, real-time PCR does not require post-PCR sample handling, preventing potential PCR product carry-over contamination and resulting in much faster and higher throughput assays. The real-time PCR method has a very large dynamic range of starting target molecule determination (at least five orders of magnitude). Real-time quantitative PCR is extremely accurate and less labor-intensive than current quantitative PCR methods.
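For orientation, quantification from real-time PCR rests on simple exponential arithmetic over threshold cycles (Ct): each cycle multiplies product by the amplification efficiency, so a sample crossing the fluorescence threshold one cycle earlier began with proportionally more template. The helper below is the basic delta-Ct comparison under an assumed efficiency, not this paper's standard-curve procedure; the function name and parameters are illustrative.

```python
def relative_quantity(ct_target, ct_reference, efficiency=2.0):
    """Relative starting quantity of two samples from their threshold
    cycles.  efficiency=2.0 assumes ideal doubling per cycle, so a
    sample that crosses the threshold 1 cycle earlier than the
    reference started with twice the template.
    """
    return efficiency ** (ct_reference - ct_target)
```

A dynamic range of five orders of magnitude, as reported in the abstract, corresponds under ideal doubling to a Ct spread of roughly log2(10^5) ≈ 17 cycles.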
Publication
Journal: Systematic Biology
December/27/2007
Abstract
Alignment quality may have as much impact on phylogenetic reconstruction as the phylogenetic methods used. Not only the alignment algorithm, but also the method used to deal with the most problematic alignment regions, may have a critical effect on the final tree. Although some authors remove such problematic regions, either manually or using automatic methods, in order to improve phylogenetic performance, others prefer to keep such regions to avoid losing any information. Our aim in the present work was to examine whether phylogenetic reconstruction improves after alignment cleaning or not. Using simulated protein alignments with gaps, we tested the relative performance in diverse phylogenetic analyses of the whole alignments versus the alignments with problematic regions removed with our previously developed Gblocks program. We also tested the performance of more or less stringent conditions in the selection of blocks. Alignments constructed with different alignment methods (ClustalW, Mafft, and Probcons) were used to estimate phylogenetic trees by maximum likelihood, neighbor joining, and parsimony. We show that, in most alignment conditions, and for alignments that are not too short, removal of blocks leads to better trees. That is, despite losing some information, there is an increase in the actual phylogenetic signal. Overall, the best trees are obtained by maximum-likelihood reconstruction of alignments cleaned by Gblocks. In general, a relaxed selection of blocks is better for short alignments, whereas a stringent selection is more adequate for longer ones. Finally, we show that cleaned alignments produce better topologies although, paradoxically, with lower bootstrap values. This indicates that divergent and problematic alignment regions may lead, when present, to apparently better supported although, in fact, more biased topologies.
Publication
Journal: Annual Review of Biochemistry
February/26/2002
Abstract
The conjugation of ubiquitin to other cellular proteins regulates a broad range of eukaryotic cell functions. The high efficiency and exquisite selectivity of ubiquitination reactions reflect the properties of enzymes known as ubiquitin-protein ligases or E3s. An E3 recognizes its substrates based on the presence of a specific ubiquitination signal, and catalyzes the formation of an isopeptide bond between a substrate (or ubiquitin) lysine residue and the C terminus of ubiquitin. Although a great deal is known about the molecular basis of E3 specificity, much less is known about molecular mechanisms of catalysis by E3s. Recent findings reveal that all known E3s utilize one of just two catalytic domains (a HECT domain or a RING finger), and crystal structures have provided the first detailed views of an active site of each type. The new findings shed light on many aspects of E3 structure, function, and mechanism, but also emphasize that key features of E3 catalysis remain to be elucidated.
Publication
Journal: Nucleic Acids Research
March/12/1982
Abstract
We describe the use of gel electrophoresis in studies of equilibrium binding, site distribution, and kinetics of protein-DNA interactions. The method, which we call protein distribution analysis, is simple, sensitive and yields thermodynamically rigorous results. It is particularly well suited to studies of simultaneous binding of several proteins to a single nucleic acid. In studies of the lac repressor-operator interaction, we found that binding to the so-called third operator site (O3) is 15-18 fold weaker than operator binding, and that the binding reactions with the first and third operators are uncoupled, implying that there is no communication between the sites. Pseudo-first order dissociation kinetics of the repressor-203 bp operator complex were found to be temperature sensitive, with ΔE of 80 kcal mol^-1 above 29 degrees C and 26 kcal mol^-1 below. The half life of the complex (5 min at 21 degrees C) is shorter than that reported for very high molecular weight operator-containing DNAs, but longer than values reported for much shorter fragments. The binding of lac repressor core to DNA could not be detected by this technique: the maximum binding constant consistent with this finding is 10^5 M^-1.
Publication
Journal: Nature Reviews Genetics
May/12/2008
Abstract
The past year has witnessed substantial advances in understanding the genetic basis of many common phenotypes of biomedical importance. These advances have been the result of systematic, well-powered, genome-wide surveys exploring the relationships between common sequence variation and disease predisposition. This approach has revealed over 50 disease-susceptibility loci and has provided insights into the allelic architecture of multifactorial traits. At the same time, much has been learned about the successful prosecution of association studies on such a scale. This Review highlights the knowledge gained, defines areas of emerging consensus, and describes the challenges that remain as researchers seek to obtain more complete descriptions of the susceptibility architecture of biomedical traits of interest and to translate the information gathered into improvements in clinical management.
Publication
Journal: Nature
February/1/2005
Abstract
At birth the trans-placental nutrient supply is suddenly interrupted, and neonates face severe starvation until supply can be restored through milk nutrients. Here, we show that neonates adapt to this adverse circumstance by inducing autophagy. Autophagy is the primary means for the degradation of cytoplasmic constituents within lysosomes. The level of autophagy in mice remains low during embryogenesis; however, autophagy is immediately upregulated in various tissues after birth and is maintained at high levels for 3-12 h before returning to basal levels within 1-2 days. Mice deficient for Atg5, which is essential for autophagosome formation, appear almost normal at birth but die within 1 day of delivery. The survival time of starved Atg5-deficient neonates (approximately 12 h) is much shorter than that of wild-type mice (approximately 21 h) but can be prolonged by forced milk feeding. Atg5-deficient neonates exhibit reduced amino acid concentrations in plasma and tissues, and display signs of energy depletion. These results suggest that the production of amino acids by autophagic degradation of 'self' proteins, which allows for the maintenance of energy homeostasis, is important for survival during neonatal starvation.
Publication
Journal: Autophagy
March/18/2008
Abstract
Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Publication
Journal: Plant Physiology
January/22/2006
Abstract
Gene transcripts with invariant abundance during development and in the face of environmental stimuli are essential reference points for accurate gene expression analyses, such as RNA gel-blot analysis or quantitative reverse transcription-polymerase chain reaction (PCR). An exceptionally large set of data from Affymetrix ATH1 whole-genome GeneChip studies provided the means to identify a new generation of reference genes with very stable expression levels in the model plant species Arabidopsis (Arabidopsis thaliana). Hundreds of Arabidopsis genes were found that outperform traditional reference genes in terms of expression stability throughout development and under a range of environmental conditions. Most of these were expressed at much lower levels than traditional reference genes, making them very suitable for normalization of gene expression over a wide range of transcript levels. Specific and efficient primers were developed for 22 genes and tested on a diverse set of 20 cDNA samples. Quantitative reverse transcription-PCR confirmed superior expression stability and lower absolute expression levels for many of these genes, including genes encoding a protein phosphatase 2A subunit, a coatomer subunit, and a ubiquitin-conjugating enzyme. The developed PCR primers or hybridization probes for the novel reference genes will enable better normalization and quantification of transcript levels in Arabidopsis in the future.
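Expression stability of candidate reference genes can be screened with something as simple as the coefficient of variation across samples. The sketch below uses that crude score (dedicated stability measures such as geNorm's M-value are more robust, and are not what this paper used); gene names and numbers are hypothetical.

```python
import statistics

def rank_reference_genes(expression):
    """Rank candidate reference genes from most to least stable.

    expression maps gene name -> list of expression values measured
    across samples/conditions.  Stability is scored as the coefficient
    of variation (population std / mean): lower CV = more stable, so
    the best reference-gene candidates come first.
    """
    def cv(values):
        return statistics.pstdev(values) / statistics.mean(values)

    return sorted(expression, key=lambda gene: cv(expression[gene]))
```

A gene that is stable but expressed at a low absolute level, as favored in the abstract, is useful because its dynamic range overlaps that of typical target transcripts.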
Publication
Journal: Nucleic Acids Research
April/27/2011
Abstract
COSMIC (http://www.sanger.ac.uk/cosmic) curates comprehensive information on somatic mutations in human cancer. Release v48 (July 2010) describes over 136,000 coding mutations in almost 542,000 tumour samples; of the 18,490 genes documented, 4803 (26%) have one or more mutations. Full scientific literature curations are available on 83 major cancer genes and 49 fusion gene pairs (19 new cancer genes and 30 new fusion pairs this year) and this number is continually increasing. Key amongst these is TP53, now available through a collaboration with the IARC p53 database. In addition to data from the Cancer Genome Project (CGP) at the Sanger Institute, UK, and The Cancer Genome Atlas project (TCGA), large systematic screens are also now curated. Major website upgrades now make these data much more mineable, with many new selection filters and graphics. A Biomart is now available allowing more automated data mining and integration with other biological databases. Annotation of genomic features has become a significant focus; COSMIC has begun curating full-genome resequencing experiments, developing new web pages, export formats and graphics styles. With all genomic information recently updated to GRCh37, COSMIC integrates many diverse types of mutation information and is making much closer links with Ensembl and other data resources.
Publication
Journal: Science
July/4/1988
Abstract
Diagnostic systems of several kinds are used to distinguish between two classes of events, essentially "signals" and "noise". For them, analysis in terms of the "relative operating characteristic" of signal detection theory provides a precise and valid measure of diagnostic accuracy. It is the only measure available that is uninfluenced by decision biases and prior probabilities, and it places the performances of diverse systems on a common, easily interpreted scale. Representative values of this measure are reported here for systems in medical imaging, materials testing, weather forecasting, information retrieval, polygraph lie detection, and aptitude testing. Though the measure itself is sound, the values obtained from tests of diagnostic systems often require qualification because the test data on which they are based are of unsure quality. A common set of problems in testing is faced in all fields. How well these problems are handled, or can be handled in a given field, determines the degree of confidence that can be placed in a measured value of accuracy. Some fields fare much better than others.
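The "relative operating characteristic" measure discussed above is what is now commonly reported as the area under the ROC curve. By the Mann-Whitney identity it equals the probability that a randomly chosen "signal" case outscores a randomly chosen "noise" case, which the sketch below computes directly (fine for illustration; rank-based implementations scale better).

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen positive ("signal") case
    scores higher than a randomly chosen negative ("noise") case,
    counting ties as half.  0.5 is chance-level diagnosis, 1.0 is
    perfect separation, independent of any decision threshold.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Threshold independence is precisely the property the abstract highlights: decision biases move the operating point along the curve, but not the area under it.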
Publication
Journal: The Lancet
July/22/2012
Abstract
BACKGROUND
Long-term disorders are the main challenge facing health-care systems worldwide, but health systems are largely configured for individual diseases rather than multimorbidity. We examined the distribution of multimorbidity, and of comorbidity of physical and mental health disorders, in relation to age and socioeconomic deprivation.
METHODS
In a cross-sectional study we extracted data on 40 morbidities from a database of 1,751,841 people registered with 314 medical practices in Scotland as of March, 2007. We analysed the data according to the number of morbidities, disorder type (physical or mental), sex, age, and socioeconomic status. We defined multimorbidity as the presence of two or more disorders.
RESULTS
42·2% (95% CI 42·1-42·3) of all patients had one or more morbidities, and 23·2% (23·08-23·21) were multimorbid. Although the prevalence of multimorbidity increased substantially with age and was present in most people aged 65 years and older, the absolute number of people with multimorbidity was higher in those younger than 65 years (210,500 vs 194,996). Onset of multimorbidity occurred 10-15 years earlier in people living in the most deprived areas compared with the most affluent, with socioeconomic deprivation particularly associated with multimorbidity that included mental health disorders (prevalence of both physical and mental health disorder 11·0%, 95% CI 10·9-11·2% in most deprived area vs 5·9%, 5·8%-6·0% in least deprived). The presence of a mental health disorder increased as the number of physical morbidities increased (adjusted odds ratio 6·74, 95% CI 6·59-6·90 for five or more disorders vs 1·95, 1·93-1·98 for one disorder), and was much greater in more deprived than in less deprived people (2·28, 2·21-2·32 vs 1·08, 1·05-1·11).
CONCLUSIONS
Our findings challenge the single-disease framework by which most health care, medical research, and medical education is configured. A complementary strategy is needed, supporting generalist clinicians to provide personalised, comprehensive continuity of care, especially in socioeconomically deprived areas.
FUNDING
Scottish Government Chief Scientist Office.
Publication
Journal: Journal of Immunology
February/1/1976
Abstract
The Cowan I strain of the bacterium Staphylococcus aureus has been used as an adsorbent for antibodies complexed with radiolabeled antigens from cell lysates. This application is advanced as a superior alternative to other methods of immune precipitation for the isolation of antigens. It exploits the high adsorption capacity for IgG molecules by protein A molecules on the cell walls of certain strains of staphylococci, along with the advantageous sedimentation properties of the bacteria. The interaction of immune complexes with the adsorbent was defined initially using a model system of bovine serum albumin with a high excess of rabbit anti-bovine serum albumin antibodies (IgG). The uptake of immune complexes under these conditions was extremely rapid, occurring within seconds, whereas maximum binding of free IgG was much slower. In addition, once bound the complexed antigen could not be displaced from the adsorbent either by large amounts of normal IgG or by extra free antibody. Antigen could be eluted almost completely from the inert adsorbent for analytic or preparative purposes with a variety of solvent systems, such as the detergent SDS in combination with urea and high temperature, and neutral salts with strong lyotropic salting-in properties. The efficacy of the protein A-antibody adsorption technique was tested in direct comparisons with a conventional double antibody precipitation method for the isolation of mouse lymphocyte IgM. The bacterial adsorbent not only had a distinct advantage in speed of antigen isolation, but analyses by polyacrylamide gel electrophoresis in SDS also revealed consistently higher antigen recoveries, lower levels of background radioactivity, and an absence of other cell components which may nonspecifically bind to and complicate analyses using conventional immune precipitates.
Publication
Journal: Journal of Clinical Epidemiology
September/14/2008
Abstract
Much of biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalizability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control, and cross-sectional studies. We convened a 2-day workshop in September 2004, with methodologists, researchers, and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results, and discussion sections of articles. Eighteen items are common to all three study designs and four are specific for cohort, case-control, or cross-sectional studies. A detailed Explanation and Elaboration document is published separately and is freely available on the web sites of PLoS Medicine, Annals of Internal Medicine, and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.
Publication
Journal: The Lancet
April/12/2009
Abstract
BACKGROUND
The main associations of body-mass index (BMI) with overall and cause-specific mortality can best be assessed by long-term prospective follow-up of large numbers of people. The Prospective Studies Collaboration aimed to investigate these associations by sharing data from many studies.
METHODS
Collaborative analyses were undertaken of baseline BMI versus mortality in 57 prospective studies with 894 576 participants, mostly in western Europe and North America (61% [n=541 452] male, mean recruitment age 46 [SD 11] years, median recruitment year 1979 [IQR 1975-85], mean BMI 25 [SD 4] kg/m²). The analyses were adjusted for age, sex, smoking status, and study. To limit reverse causality, the first 5 years of follow-up were excluded, leaving 66 552 deaths of known cause during a mean of 8 (SD 6) further years of follow-up (mean age at death 67 [SD 10] years): 30 416 vascular; 2070 diabetic, renal or hepatic; 22 592 neoplastic; 3770 respiratory; 7704 other.
RESULTS
In both sexes, mortality was lowest at about 22.5-25 kg/m². Above this range, positive associations were recorded for several specific causes and inverse associations for none, the absolute excess risks for higher BMI and smoking were roughly additive, and each 5 kg/m² higher BMI was on average associated with about 30% higher overall mortality (hazard ratio per 5 kg/m² [HR] 1.29 [95% CI 1.27-1.32]): 40% for vascular mortality (HR 1.41 [1.37-1.45]); 60-120% for diabetic, renal, and hepatic mortality (HRs 2.16 [1.89-2.46], 1.59 [1.27-1.99], and 1.82 [1.59-2.09], respectively); 10% for neoplastic mortality (HR 1.10 [1.06-1.15]); and 20% for respiratory and for all other mortality (HRs 1.20 [1.07-1.34] and 1.20 [1.16-1.25], respectively). Below the range 22.5-25 kg/m², BMI was associated inversely with overall mortality, mainly because of strong inverse associations with respiratory disease and lung cancer. These inverse associations were much stronger for smokers than for non-smokers, despite cigarette consumption per smoker varying little with BMI.
CONCLUSIONS
Although other anthropometric measures (eg, waist circumference, waist-to-hip ratio) could well add extra information to BMI, and BMI to them, BMI is in itself a strong predictor of overall mortality both above and below the apparent optimum of about 22.5-25 kg/m². The progressive excess mortality above this range is due mainly to vascular disease and is probably largely causal. At 30-35 kg/m², median survival is reduced by 2-4 years; at 40-45 kg/m², it is reduced by 8-10 years (which is comparable with the effects of smoking). The definite excess mortality below 22.5 kg/m² is due mainly to smoking-related diseases, and is not fully explained.
Publication
Journal: Journal of Structural Biology
March/20/2007
Abstract
EMAN is a scientific image processing package with a particular focus on single particle reconstruction from transmission electron microscopy (TEM) images. It was first released in 1999, and new versions have been released typically 2-3 times each year since that time. EMAN2 has been under development for the last two years, with a completely refactored image processing library, and a wide range of features to make it much more flexible and extensible than EMAN1. The user-level programs are better documented, more straightforward to use, and written in the Python scripting language, so advanced users can modify the programs' behavior without any recompilation. A completely rewritten 3D transformation class simplifies translation between Euler angle standards and symmetry conventions. The core C++ library has over 500 functions for image processing and associated tasks, and it is modular with introspection capabilities, so programmers can add new algorithms with minimal effort and programs can incorporate new capabilities automatically. Finally, a flexible new parallelism system has been designed to address the shortcomings in the rigid system in EMAN1.
Publication
Journal: Circulation
January/25/1979
Abstract
Four hundred M-mode echocardiographic surveys were distributed to determine interobserver variability in M-mode echocardiographic measurements. This was done with a view toward examining the need and determining the criteria for standardization of measurement. Each survey consisted of five M-mode echocardiograms with a calibration marker, measured by the survey participants anonymously. The echoes were judged of adequate quality for measurement of structures. Seventy-six of the 400 (19%) were returned, allowing comparison of interobserver variability as well as examination of the measurement criteria which were used. Mean measurements and percent uncertainty were derived for each structure for each criterion of measurement. For example, for the aorta, 33% of examiners measured the aorta as an outer/inner or leading edge dimension, and 20% measured it as an outer/outer dimension. The percent uncertainty for the measurement (1.97 SD divided by the mean) showed a mean of 13.8% for the 25 packets of five echoes measured using the former criteria and 24.2% using the latter criteria. For ventricular chamber and cavity measurements, almost one-half of the examiners used the peak of the QRS and one-half of the examiners used the onset of the QRS for determining end-diastole. Estimates of the percent of measurement uncertainty for the septum, posterior wall and left ventricular cavity dimension in this study were 10-25%. They were much higher (40-70%) for the right ventricular cavity and right ventricular anterior wall. The survey shows significant interobserver and interlaboratory variation in measurement when examining the same echoes and indicates a need for ongoing education, quality control and standardization of measurement criteria. Recommendations for new criteria for measurement of M-mode echocardiograms are offered.
Publication
Journal: The Lancet
October/13/2015
Abstract
BACKGROUND
Up-to-date evidence about levels and trends in disease and injury incidence, prevalence, and years lived with disability (YLDs) is an essential input into global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013), we estimated these quantities for acute and chronic diseases and injuries for 188 countries between 1990 and 2013.
METHODS
Estimates were calculated for disease and injury incidence, prevalence, and YLDs using GBD 2010 methods with some important refinements. Results for incidence of acute disorders and prevalence of chronic disorders are new additions to the analysis. Key improvements include expansion of the cause and sequelae list, updated systematic reviews, use of detailed injury codes, improvements to the Bayesian meta-regression method (DisMod-MR), and use of severity splits for various causes. An index of data representativeness, showing data availability, was calculated for each cause and impairment during three periods globally and at the country level for 2013. In total, 35 620 distinct sources of data were used and documented to calculate estimates for 301 diseases and injuries and 2337 sequelae. A comorbidity simulation provides estimates of the number of sequelae concurrently experienced by individuals, by country, year, age, and sex. Disability weights were updated with the addition of new population-based survey data from four countries.
RESULTS
Disease and injury were highly prevalent; only a small fraction of individuals had no sequelae. Comorbidity rose substantially with age and in absolute terms from 1990 to 2013. Incident acute sequelae were predominantly infectious diseases and short-term injuries, with over 2 billion cases each of upper respiratory infections and diarrhoeal disease episodes in 2013; a notable exception was tooth pain due to permanent caries, with more than 200 million incident cases in 2013. Conversely, leading chronic sequelae were largely attributable to non-communicable diseases, with prevalence estimates for asymptomatic permanent caries and tension-type headache of 2·4 billion and 1·6 billion, respectively. The distribution of the number of sequelae in populations varied widely across regions, with an expected relation between age and disease prevalence. YLDs for both sexes increased from 537·6 million in 1990 to 764·8 million in 2013 owing to population growth and ageing, whereas the age-standardised rate decreased only slightly, from 114·87 per 1000 people to 110·31 per 1000 people. Low back pain and major depressive disorder were among the top ten causes of YLDs in every country. By major cause group, the main drivers of increasing YLD rates per person were musculoskeletal, mental and substance use, neurological, and chronic respiratory disorders; HIV/AIDS, however, was a notable driver of increasing YLDs in sub-Saharan Africa. The proportion of disability-adjusted life years due to YLDs increased globally from 21·1% in 1990 to 31·2% in 2013.
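The contrast between absolute and age-standardised YLD figures in the results above can be checked with a quick calculation (all input numbers are taken directly from the abstract):

```python
# Figures quoted in the GBD 2013 results paragraph above
ylds_1990, ylds_2013 = 537.6e6, 764.8e6      # total YLDs, both sexes
rate_1990, rate_2013 = 114.87, 110.31        # age-standardised YLDs per 1000 people

# Percent change in the absolute count vs the age-standardised rate
growth = (ylds_2013 - ylds_1990) / ylds_1990 * 100
rate_change = (rate_2013 - rate_1990) / rate_1990 * 100

print(f"total YLDs grew {growth:.1f}%, age-standardised rate fell {abs(rate_change):.1f}%")
# → total YLDs grew 42.3%, age-standardised rate fell 4.0%
```

The roughly 42% rise in total YLDs against a fall of only about 4% in the age-standardised rate illustrates the paper's point that the increase is driven by population growth and ageing rather than by rising per-person disability.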
CONCLUSIONS
Ageing of the world's population is leading to a substantial increase in the numbers of individuals with sequelae of diseases and injuries. Rates of YLDs are declining much more slowly than mortality rates. The non-fatal dimensions of disease and injury will require more and more attention from health systems. The transition to non-fatal outcomes as the dominant source of burden of disease is occurring rapidly outside of sub-Saharan Africa. Our results can guide future health initiatives through examination of epidemiological trends and a better understanding of variation across countries.
FUNDING
Bill & Melinda Gates Foundation.
Publication
Journal: Nature
June/23/2008
Abstract
The repair of wounds is one of the most complex biological processes that occur during human life. After an injury, multiple biological pathways immediately become activated and are synchronized to respond. In human adults, the wound repair process commonly leads to a non-functioning mass of fibrotic tissue known as a scar. By contrast, early in gestation, injured fetal tissues can be completely recreated, without fibrosis, in a process resembling regeneration. Some organisms, however, retain the ability to regenerate tissue throughout adult life. Knowledge gained from studying such organisms might help to unlock latent regenerative pathways in humans, which would change medical practice as much as the introduction of antibiotics did in the twentieth century.