WorldWideScience

Sample records for itraq-based shotgun quantitative

  1. Development of quantitative proteomics using iTRAQ based on the immunological response of Galleria mellonella larvae challenged with Fusarium oxysporum microconidia.

    Directory of Open Access Journals (Sweden)

    Amalia Muñoz-Gómez

    Full Text Available Galleria mellonella has emerged as a potential invertebrate model for scrutinizing innate immunity. Larvae are easy to handle in host-pathogen assays. We undertook proteomics research in order to understand the immune response in a heterologous host challenged with microconidia of Fusarium oxysporum. The aim of this study was to investigate hemolymph proteins that were differentially expressed between control and immunized larvae sets tested with F. oxysporum at two temperatures. The iTRAQ approach allowed us to observe the effects of immune challenges in a lucid and robust manner, identifying more than 50 proteins, 17 of them probably involved in the immune response. Changes in protein expression were statistically significant, particularly at the higher temperature, and were markedly affected by the F. oxysporum dose (10^4 or 10^6 microconidia/mL). Some proteins were up-regulated upon fungal microconidia challenge when the temperature changed from 25 to 37°C. Bioinformatic analysis and meta-analysis of the identified proteins revealed that they were involved in transport, immune response, storage, oxide-reduction and catabolism: 20 from G. mellonella, 20 from other Lepidoptera species and 19 spread across bacteria, protista, fungi and animal species. Among these, 13 proteins and 2 peptides were examined for their immune expression, and the hypothetical 3D structures of 2 well-known proteins unannotated for G. mellonella, i.e., actin and CREBP, were resolved using peptides matched with Bombyx mori and Danaus plexippus, respectively. The main conclusion of this study was that iTRAQ constitutes a consistent method to detect proteins associated with the innate immune system of G. mellonella in response to infection caused by F. oxysporum. 
In addition, iTRAQ was a reliable quantitative proteomic approach to detect and quantify the expression levels of immune system proteins and peptides; in particular, it was found that 10^4
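
    The reporter-ion arithmetic behind iTRAQ relative quantification can be sketched as follows. This is a hypothetical illustration, not the authors' pipeline: the 4-plex channel names, the intensities, and the median-based per-protein summarization are all assumptions for the sake of the example.

```python
# Hypothetical sketch of iTRAQ-style relative quantification (not the
# authors' code): each peptide-spectrum match carries 4-plex reporter-ion
# intensities (channels 114-117); a protein's ratio versus the reference
# channel (114) is summarized as the median of its peptides' ratios.

from statistics import median

def protein_ratios(psms, reference="114"):
    """psms: list of (protein, {channel: intensity}) tuples."""
    by_protein = {}
    for protein, reporters in psms:
        ref = reporters[reference]
        for channel, intensity in reporters.items():
            if channel == reference:
                continue
            by_protein.setdefault((protein, channel), []).append(intensity / ref)
    # median over peptides is robust against outlier spectra
    return {key: median(ratios) for key, ratios in by_protein.items()}

# Toy data: protein names borrowed from the abstract, intensities invented.
psms = [
    ("actin", {"114": 100.0, "115": 210.0}),
    ("actin", {"114": 80.0,  "115": 150.0}),
    ("CREBP", {"114": 50.0,  "115": 45.0}),
]
ratios = protein_ratios(psms)
```

    With these invented numbers, actin comes out roughly 2-fold up-regulated in channel 115 relative to the reference, while CREBP is essentially unchanged.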

  2. Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry

    DEFF Research Database (Denmark)

    Ejsing, Christer S.; Sampaio, Julio L; Surendranath, Vineeth

    2009-01-01

    95% coverage of the yeast lipidome achieved with 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome....... This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches....

  3. A label-free quantitative shotgun proteomics analysis of rice grain development

    Directory of Open Access Journals (Sweden)

    Koh Hee-Jong

    2011-09-01

    Full Text Available Abstract Background Although a great deal of rice proteomic research has been conducted, there are relatively few studies specifically addressing the rice grain proteome. Existing rice grain proteomic studies have focused on the identification of differentially expressed proteins or on monitoring protein expression patterns during grain filling stages. Results Proteins were extracted from rice grains 10, 20, and 30 days after flowering, as well as from fully mature grains. By merging all of the proteins identified in this study, we obtained 4,172 non-redundant proteins with a wide range of molecular weights (from 5.2 kDa to 611 kDa) and pI values (from pH 2.9 to pH 12.6). A Gene Ontology category enrichment analysis for the 4,172 proteins revealed that 52 categories were enriched, including carbohydrate metabolic process, transport, localization, lipid metabolic process, and secondary metabolic process. The relative abundances of the 1,784 reproducibly identified proteins were compared to detect 484 differentially expressed proteins during rice grain development. Clustering analysis and Gene Ontology category enrichment analysis revealed that proteins involved in the metabolic process were enriched through all stages of development, suggesting that proteome changes occurred even in the desiccation phase. Interestingly, enrichment of proteins involved in protein folding was detected in the desiccation phase and in fully mature grain. Conclusion This is the first report of comprehensive identification of rice grain proteins. With a label-free shotgun proteomic approach, we identified a large number of rice grain proteins and compared the expression patterns of reproducibly identified proteins during rice grain development. Clustering analysis, Gene Ontology category enrichment analysis, and the analysis of composite expression profiles revealed dynamic changes in metabolism during rice grain development. Interestingly, we

  4. iTRAQ based investigation of plasma proteins in HIV infected and HIV/HBV coinfected patients - C9 and KLK are related to HIV/HBV coinfection.

    Science.gov (United States)

    Sun, Tao; Liu, Li; Wu, Ao; Zhang, Yujiao; Jia, Xiaofang; Yin, Lin; Lu, Hongzhou; Zhang, Lijun

    2017-10-01

    Human immunodeficiency virus (HIV) and hepatitis B virus (HBV) share similar routes of transmission, and rapid progression of hepatic and immunodeficiency diseases has been observed in coinfected individuals. Our main objective was to investigate the molecular mechanism of HIV/HBV coinfections. We selected HIV infected and HIV/HBV coinfected patients with and without Highly Active Antiretroviral Therapy (HAART). Low abundance proteins enriched using a multiple affinity removal system (MARS) were labeled with isobaric tags for relative and absolute quantitation (iTRAQ) kits and analyzed using liquid chromatography-mass spectrometry (LC-MS). The differential proteins were analyzed using the Gene Ontology (GO) database. A total of 41 differential proteins were found in HIV/HBV coinfected patients as compared to HIV mono-infected patients with or without HAART treatment, including 7 common HBV-regulated proteins. The proteins involved in complement and coagulation pathways were significantly enriched, including plasma kallikrein (KLK) and complement component C9 (C9). C9 and KLK were verified to be down-regulated in HIV/HBV coinfected patients through ELISA analysis. The present iTRAQ-based proteomic analyses identified 7 proteins that are related to HIV/HBV coinfection. HBV might influence hepatic and immune functions by deregulating complement and coagulation pathways. C9 and KLK could potentially be used as targets for the treatment of HIV/HBV coinfections. Copyright © 2017. Published by Elsevier Ltd.

  5. Shotgun protein sequencing.

    Energy Technology Data Exchange (ETDEWEB)

    Faulon, Jean-Loup Michel; Heffelfinger, Grant S.

    2009-06-01

    A novel experimental and computational technique based on multiple enzymatic digestion of a protein or protein mixture that reconstructs protein sequences from sequences of overlapping peptides is described in this SAND report. This approach, analogous to shotgun sequencing of DNA, is to be used to sequence alternative spliced proteins, to identify post-translational modifications, and to sequence genetically engineered proteins.

  6. Shotgun Proteomics and Biomarker Discovery

    Directory of Open Access Journals (Sweden)

    W. Hayes McDonald

    2002-01-01

    Full Text Available Coupling large-scale sequencing projects with the amino acid sequence information that can be gleaned from tandem mass spectrometry (MS/MS) has made it much easier to analyze complex mixtures of proteins. The limits of this “shotgun” approach, in which the protein mixture is proteolytically digested before separation, can be further expanded by separating the resulting mixture of peptides prior to MS/MS analysis. Both single-dimensional high-pressure liquid chromatography (LC) and multidimensional LC (LC/LC) can be directly interfaced with the mass spectrometer to allow for automated collection of tremendous quantities of data. While there is no single technique that addresses all proteomic challenges, the shotgun approaches, especially LC/LC-MS/MS-based techniques such as MudPIT (multidimensional protein identification technology), show advantages over gel-based techniques in speed, sensitivity, scope of analysis, and dynamic range. Advances in the ability to quantitate differences between samples and to detect an array of post-translational modifications allow for the discovery of classes of protein biomarkers that were previously unassailable.

  7. Automation of dimethylation after guanidination labeling chemistry and its compatibility with common buffers and surfactants for mass spectrometry-based shotgun quantitative proteome analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang, E-mail: Liang.Li@ualberta.ca

    2013-07-25

    Highlights:
    • Dimethylation after guanidination (2MEGA) uses inexpensive reagents for isotopic labeling of peptides.
    • 2MEGA can be optimized and automated for labeling peptides with high efficiency.
    • 2MEGA is compatible with several commonly used cell lysis and protein solubilization reagents.
    • The automated 2MEGA labeling method can be used to handle a variety of protein samples for relative proteome quantification.

    Abstract: Isotope labeling liquid chromatography–mass spectrometry (LC–MS) is a major analytical platform for quantitative proteome analysis. Incorporation of the isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase labeling reproducibility and reduce human intervention. We also evaluated the compatibility of this protocol with biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. Better than 94% of the desired labeling was obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation of the peptide amines. This work illustrates that the automated 2MEGA labeling process can handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis.

  8. Suicide with Shotgun: A Case Report

    Directory of Open Access Journals (Sweden)

    Ali Yildirim

    2011-03-01

    Full Text Available Suicide is a major public health problem in our country and all over the world. Suicide methods vary between communities; the most common are hanging, poisoning with chemicals, and firearms (pistol, shotgun). Owing to the easy availability of shotguns, suicides committed with shotguns have increased significantly in recent years. In our study, a suicide with a shotgun is evaluated in terms of shooting range and its features, origin, site of suicide, crime scene, sex and age. [J Contemp Med 2011; 1(1): 29-34]

  9. "Shotgunning" as an illicit drug smoking practice.

    Science.gov (United States)

    Perlman, D C; Perkins, M P; Paone, D; Kochems, L; Salomon, N; Friedmann, P; Des Jarlais, D C

    1997-01-01

    There has been a rise in illicit drug smoking in the United States. "Shotgunning" drugs (or "doing a shotgun") refers to the practice of inhaling smoke and then exhaling it into another individual's mouth, a practice with the potential for the efficient transmission of respiratory pathogens. Three hundred fifty-four drug users (239 from a syringe exchange and 115 from a drug detoxification program) were interviewed about shotgunning and screened for tuberculosis (TB). Fifty-nine (17%; 95% CI 12.9%-20.9%) reported shotgunning while smoking crack cocaine (68%), marijuana (41%), or heroin (2%). In multivariate analysis, age, drinking alcohol to intoxication (OR 2.2, 95% CI 1.1-4.3), having engaged in high-risk sex (OR 2.6, 95% CI 1.04-6.7), and crack use (OR 6.0, 95% CI 3.0-12) were independently associated with shotgunning. Shotgunning is a frequent drug smoking practice with the potential to transmit respiratory pathogens, underscoring the need to educate drug users about the risks of specific drug use practices, and the ongoing need for TB control among active drug users.

  10. Shotgun lipidomic analysis of chemically sulfated sterols compromises analytical sensitivity

    DEFF Research Database (Denmark)

    Casanovas, Albert; Hannibal-Bach, Hans Kristian; Jensen, Ole Nørregaard

    2014-01-01

    Shotgun lipidomics affords comprehensive and quantitative analysis of lipid species in cells and tissues at high throughput [1-5]. The methodology is based on direct infusion of lipid extracts by electrospray ionization (ESI) combined with tandem mass spectrometry (MS/MS) and/or high resolution F...... low ionization efficiency in ESI [7]. For this reason, chemical derivatization procedures including acetylation [8] or sulfation [9] are commonly implemented to facilitate ionization, detection and quantification of sterols for global lipidome analysis [1-3, 10]....

  11. High-throughput shotgun lipidomics by quadrupole time-of-flight mass spectrometry

    DEFF Research Database (Denmark)

    Ståhlman, Marcus; Ejsing, Christer S.; Tarasov, Kirill

    2009-01-01

    Technological advances in mass spectrometry and meticulous method development have produced several shotgun lipidomic approaches capable of characterizing lipid species by direct analysis of total lipid extracts. Shotgun lipidomics by hybrid quadrupole time-of-flight mass spectrometry allows...... the absolute quantification of hundreds of molecular glycerophospholipid species, glycerolipid species, sphingolipid species and sterol lipids. Future applications in clinical cohort studies demand detailed lipid molecule information and the application of high-throughput lipidomics platforms. In this review...... we describe a novel high-throughput shotgun lipidomic platform based on 96-well robot-assisted lipid extraction, automated sample infusion by microfluidic-based nanoelectrospray ionization, and quantitative multiple precursor ion scanning analysis on a quadrupole time-of-flight mass spectrometer...

  12. Shotgun approaches to gait analysis : insights & limitations

    NARCIS (Netherlands)

    Kaptein, Ronald G.; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J.; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait

  13. Shotgun metagenomic data streams: surfing without fear

    Energy Technology Data Exchange (ETDEWEB)

    Berendzen, Joel R [Los Alamos National Laboratory

    2010-12-06

    Timely information about bio-threat prevalence, consequence, propagation, attribution, and mitigation is needed to support decision-making, both routinely and in a crisis. One DNA sequencer can stream 25 Gbp of information per day, but sampling strategies and analysis techniques are needed to turn raw sequencing power into actionable knowledge. Shotgun metagenomics can enable biosurveillance at the level of a single city, hospital, or airplane. Metagenomics characterizes viruses and bacteria from complex environments such as soil, air filters, or sewage. Unlike targeted-primer-based sequencing, shotgun methods are not blind to sequences that are truly novel, and they can measure absolute prevalence. Shotgun metagenomic sampling can be non-invasive, efficient, and inexpensive while being informative. We have developed analysis techniques for shotgun metagenomic sequencing that rely upon phylogenetic signature patterns. They work by indexing local sequence patterns in a manner similar to web search engines. Our methods are laptop-fast, and their favorable scaling properties ensure they will remain practical as sequencing throughput grows. We show examples of application to soil metagenomic samples.
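
    The "web search engine"-style indexing of local sequence patterns described above can be illustrated with a toy inverted k-mer index. This is a sketch under stated assumptions, not the Los Alamos implementation: the k-mer length, the reference fragments, and the simple vote-counting classifier are all invented for the example.

```python
# Illustrative inverted k-mer index for shotgun metagenomic reads: a toy
# stand-in for the phylogenetic-signature indexing described above.

from collections import defaultdict

K = 8  # signature length; real systems tune this per clade

def kmers(seq, k=K):
    """All length-k substrings of seq (as a set)."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def build_index(references):
    """references: {taxon_name: genome_fragment} -> {kmer: {taxa}}."""
    index = defaultdict(set)
    for taxon, seq in references.items():
        for km in kmers(seq):
            index[km].add(taxon)
    return index

def classify(read, index):
    """Vote for the taxon sharing the most k-mers with the read."""
    votes = defaultdict(int)
    for km in kmers(read):
        for taxon in index.get(km, ()):
            votes[taxon] += 1
    return max(votes, key=votes.get) if votes else None

# Invented reference fragments and a read drawn from taxonA.
refs = {
    "taxonA": "ACGTACGTGGCCAATTCCGGAACGTTAGC",
    "taxonB": "TTGACCTGAGGTCCAATGGCACTGATCGA",
}
idx = build_index(refs)
hit = classify("ACGTACGTGGCCAATT", idx)
```

    Like a search engine's posting lists, the index maps each local pattern to the references that contain it, so classification is a fast lookup rather than a full alignment.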

  14. MULTI-DIMENSIONAL MASS SPECTROMETRY-BASED SHOTGUN LIPIDOMICS AND NOVEL STRATEGIES FOR LIPIDOMIC ANALYSES

    Science.gov (United States)

    Han, Xianlin; Yang, Kui; Gross, Richard W.

    2011-01-01

    Since our last comprehensive review on multi-dimensional mass spectrometry-based shotgun lipidomics (Mass Spectrom. Rev. 24 (2005), 367), many new developments in the field of lipidomics have occurred. These developments include new strategies and refinements for shotgun lipidomic approaches that use direct infusion, including novel fragmentation strategies, identification of multiple new informative dimensions for mass spectrometric interrogation, and the development of new bioinformatic approaches for enhanced identification and quantitation of the individual molecular constituents that comprise each cell’s lipidome. Concurrently, advances in liquid chromatography-based platforms and novel strategies for quantitative matrix-assisted laser desorption/ionization mass spectrometry for lipidomic analyses have been developed. Through the synergistic use of this repertoire of new mass spectrometric approaches, the power and scope of lipidomics has been greatly expanded to accelerate progress toward the comprehensive understanding of the pleiotropic roles of lipids in biological systems. PMID:21755525

  15. Entrance, exit, and reentrance of one shot with a shotgun

    DEFF Research Database (Denmark)

    Gulmann, C; Hougen, H P

    1999-01-01

    The case being reported is one of a homicidal shotgun fatality with an unusual wound pattern. A 34-year-old man was shot at close range with a 12-gauge shotgun armed with No. 5 birdshot ammunition. The shot entered the left axillary region, exited through the left infraclavicular region, and ther...

  16. GO Explorer: A gene-ontology tool to aid in the interpretation of shotgun proteomics data

    Directory of Open Access Journals (Sweden)

    Domont Gilberto B

    2009-02-01

    Full Text Available Abstract Background Spectral counting is a shotgun proteomics approach comprising the identification and relative quantitation of thousands of proteins in complex mixtures. However, this strategy generates bewildering amounts of data whose biological interpretation is a challenge. Results Here we present a new algorithm, termed GO Explorer (GOEx), that leverages the gene ontology (GO) to aid in the interpretation of proteomic data. GOEx stands out because it combines data from protein fold changes with GO over-representation statistics to help draw conclusions. Moreover, it is tightly integrated within the PatternLab for Proteomics project and, thus, lies within a complete computational environment that provides parsers and pattern recognition tools designed for spectral counting. GOEx offers three independent methods to query data: an interactive directed acyclic graph, a specialist mode where key words can be searched, and an automatic search. Its usefulness is demonstrated by applying it to help interpret the effects of perillyl alcohol, a natural chemotherapeutic agent, on glioblastoma multiform cell lines (A172). We used a new multi-surfactant shotgun proteomic strategy and identified more than 2600 proteins; GOEx pinpointed key sets of differentially expressed proteins related to cell cycle, alcohol catabolism, the Ras pathway, apoptosis, and stress response, to name a few. Conclusion GOEx facilitates organism-specific studies by leveraging GO and providing a rich graphical user interface. It is a simple-to-use tool, specialized for biologists who wish to analyze spectral counting data from shotgun proteomics. GOEx is available at http://pcarvalho.com/patternlab.

  17. GO Explorer: A gene-ontology tool to aid in the interpretation of shotgun proteomics data.

    Science.gov (United States)

    Carvalho, Paulo C; Fischer, Juliana Sg; Chen, Emily I; Domont, Gilberto B; Carvalho, Maria Gc; Degrave, Wim M; Yates, John R; Barbosa, Valmir C

    2009-02-24

    Spectral counting is a shotgun proteomics approach comprising the identification and relative quantitation of thousands of proteins in complex mixtures. However, this strategy generates bewildering amounts of data whose biological interpretation is a challenge. Here we present a new algorithm, termed GO Explorer (GOEx), that leverages the gene ontology (GO) to aid in the interpretation of proteomic data. GOEx stands out because it combines data from protein fold changes with GO over-representation statistics to help draw conclusions. Moreover, it is tightly integrated within the PatternLab for Proteomics project and, thus, lies within a complete computational environment that provides parsers and pattern recognition tools designed for spectral counting. GOEx offers three independent methods to query data: an interactive directed acyclic graph, a specialist mode where key words can be searched, and an automatic search. Its usefulness is demonstrated by applying it to help interpret the effects of perillyl alcohol, a natural chemotherapeutic agent, on glioblastoma multiform cell lines (A172). We used a new multi-surfactant shotgun proteomic strategy and identified more than 2600 proteins; GOEx pinpointed key sets of differentially expressed proteins related to cell cycle, alcohol catabolism, the Ras pathway, apoptosis, and stress response, to name a few. GOEx facilitates organism-specific studies by leveraging GO and providing a rich graphical user interface. It is a simple to use tool, specialized for biologists who wish to analyze spectral counting data from shotgun proteomics. GOEx is available at http://pcarvalho.com/patternlab.
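
    The spectral-counting fold-change input that GOEx consumes can be sketched with a minimal example. This is a hedged illustration of the general technique, not GOEx itself: the protein names, counts, normalization scheme, and 2-fold cutoff are assumptions.

```python
# Minimal spectral-counting comparison in the spirit of the workflow above:
# counts are normalized to each run's total before computing fold changes,
# so differences in overall spectra acquired do not masquerade as biology.

def normalized(counts):
    total = sum(counts.values())
    return {protein: c / total for protein, c in counts.items()}

def fold_changes(control, treated):
    a, b = normalized(control), normalized(treated)
    return {p: b[p] / a[p] for p in a if p in b and a[p] > 0}

# Invented spectral counts for three proteins in two conditions.
control = {"P1": 50, "P2": 25, "P3": 25}
treated = {"P1": 20, "P2": 60, "P3": 20}
fc = fold_changes(control, treated)
up = {p for p, r in fc.items() if r >= 2.0}  # simple 2-fold cutoff
```

    A tool like GOEx would then cross such a differentially expressed set against GO over-representation statistics to suggest the affected processes.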

  18. Shotgun pyrosequencing metagenomic analyses of dusts from swine confinement and grain facilities.

    Science.gov (United States)

    Boissy, Robert J; Romberger, Debra J; Roughead, William A; Weissenburger-Moser, Lisa; Poole, Jill A; LeVan, Tricia D

    2014-01-01

    Inhalation of agricultural dusts causes inflammatory reactions and symptoms such as headache, fever, and malaise, which can progress to chronic airway inflammation and associated diseases, e.g. asthma, chronic bronchitis, chronic obstructive pulmonary disease, and hypersensitivity pneumonitis. Although in many agricultural environments feed particles are the major constituent of these dusts, the inflammatory responses that they provoke are likely attributable to particle-associated bacteria, archaebacteria, fungi, and viruses. In this study, we performed shotgun pyrosequencing metagenomic analyses of DNA from dusts from swine confinement facilities or grain elevators, with comparisons to dusts from pet-free households. DNA sequence alignment showed that 19% of the shotgun pyrosequencing metagenomic DNA sequence reads from swine facility dust were of swine origin, and 62% of those from household dust were of human origin. In contrast, only 2% of such reads from grain elevator dust were of mammalian origin. These metagenomic shotgun reads of mammalian origin were excluded from our analyses of agricultural dust microbiota. The ten most prevalent bacterial taxa identified in swine facility, grain elevator, or household dust were comprised of 75%, 16%, and 42% gram-positive organisms, respectively. Four of the top five swine facility dust genera were assignable (Clostridium, Lactobacillus, Ruminococcus, and Eubacterium, ranging from 4% to 19% relative abundance). The relative abundances of these four genera were lower in dust from grain elevators or pet-free households. These analyses also highlighted the predominance in swine facility dust of Firmicutes (70%) at the phylum level, Clostridia (44%) at the Class level, and Clostridiales at the Order level (41%). 
In summary, shotgun pyrosequencing metagenomic analyses of agricultural dusts show that they differ qualitatively and quantitatively at the level of microbial taxa present, and that the bioinformatic analyses

  19. Shotgun pyrosequencing metagenomic analyses of dusts from swine confinement and grain facilities.

    Directory of Open Access Journals (Sweden)

    Robert J Boissy

    Full Text Available Inhalation of agricultural dusts causes inflammatory reactions and symptoms such as headache, fever, and malaise, which can progress to chronic airway inflammation and associated diseases, e.g. asthma, chronic bronchitis, chronic obstructive pulmonary disease, and hypersensitivity pneumonitis. Although in many agricultural environments feed particles are the major constituent of these dusts, the inflammatory responses that they provoke are likely attributable to particle-associated bacteria, archaebacteria, fungi, and viruses. In this study, we performed shotgun pyrosequencing metagenomic analyses of DNA from dusts from swine confinement facilities or grain elevators, with comparisons to dusts from pet-free households. DNA sequence alignment showed that 19% of the shotgun pyrosequencing metagenomic DNA sequence reads from swine facility dust were of swine origin, and 62% of those from household dust were of human origin. In contrast, only 2% of such reads from grain elevator dust were of mammalian origin. These metagenomic shotgun reads of mammalian origin were excluded from our analyses of agricultural dust microbiota. The ten most prevalent bacterial taxa identified in swine facility, grain elevator, or household dust were comprised of 75%, 16%, and 42% gram-positive organisms, respectively. Four of the top five swine facility dust genera were assignable (Clostridium, Lactobacillus, Ruminococcus, and Eubacterium, ranging from 4% to 19% relative abundance). The relative abundances of these four genera were lower in dust from grain elevators or pet-free households. These analyses also highlighted the predominance in swine facility dust of Firmicutes (70%) at the phylum level, Clostridia (44%) at the Class level, and Clostridiales at the Order level (41%). In summary, shotgun pyrosequencing metagenomic analyses of agricultural dusts show that they differ qualitatively and quantitatively at the level of microbial taxa present, and that the

  20. OTU analysis using metagenomic shotgun sequencing data.

    Directory of Open Access Journals (Sweden)

    Xiaolin Hao

    Full Text Available Because of technological limitations, primer and amplification biases in targeted sequencing of 16S rRNA genes have veiled the true microbial diversity underlying environmental samples. However, metagenomic shotgun sequencing provides 16S rRNA gene fragment data that are naturally immune to the biases introduced during priming, and thus has the potential to uncover the true structure of a microbial community by giving more accurate predictions of operational taxonomic units (OTUs). Nonetheless, the lack of statistically rigorous comparison between 16S rRNA gene fragments and other data types makes it difficult to interpret previously reported results based on 16S rRNA gene fragments. Therefore, in the present work, we established a standard analysis pipeline that helps confirm whether differences in the data are true or merely due to technical bias. The pipeline was built by using simulated data to find optimal mapping and OTU prediction methods. Comparison between simulated datasets revealed that, with our proposed pipeline, a 16S rRNA gene fragment longer than 150 bp provides the same OTU-prediction accuracy as a full-length 16S rRNA sequence. This could serve as a good starting point for experimental design and makes comparisons between 16S rRNA gene fragment-based and targeted 16S rRNA sequencing-based surveys possible.
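
    The OTU prediction step at the heart of this pipeline can be illustrated with a toy greedy clustering pass at the conventional 97% identity cutoff. This sketch assumes equal-length fragments and positional identity; real pipelines use alignment-based identity and more sophisticated clustering.

```python
# Toy greedy OTU clustering at 97% identity, the conventional species-level
# cutoff for 16S rRNA analyses. Each read joins the first existing OTU whose
# representative it matches at or above the threshold, else it founds a new one.

def identity(a, b):
    """Fraction of matching positions (equal-length fragments assumed)."""
    assert len(a) == len(b)
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cluster_otus(reads, threshold=0.97):
    centroids = []    # one representative sequence per OTU
    assignments = []  # OTU index for each read, in input order
    for read in reads:
        for i, rep in enumerate(centroids):
            if identity(read, rep) >= threshold:
                assignments.append(i)
                break
        else:
            centroids.append(read)
            assignments.append(len(centroids) - 1)
    return centroids, assignments

# Invented 100 bp fragments: a base sequence, a 99%-identical variant
# (same OTU), and an unrelated sequence (new OTU).
base = "ACGT" * 25
variant = "T" + base[1:]
distant = "GTCA" * 25
centroids, assignments = cluster_otus([base, variant, distant])
```

    The abstract's finding that fragments longer than 150 bp match full-length accuracy makes intuitive sense here: the longer the fragment, the more stable the pairwise identity estimate that drives OTU membership.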

  1. Novel advances in shotgun lipidomics for biology and medicine.

    Science.gov (United States)

    Wang, Miao; Wang, Chunyan; Han, Rowland H; Han, Xianlin

    2016-01-01

    The field of lipidomics, as coined in 2003, has made profound advances and expanded rapidly. The mass spectrometry-based strategies of this analytical methodology-oriented research discipline largely fall into three categories: direct infusion-based shotgun lipidomics, liquid chromatography-mass spectrometry-based platforms, and matrix-assisted laser desorption/ionization mass spectrometry-based approaches (particularly for imaging lipid distribution in tissues or cells). This review focuses on shotgun lipidomics. After briefly introducing its fundamentals, the article covers its recent advances. These include novel methods of lipid extraction, novel shotgun lipidomics strategies for identification and quantification of previously hardly accessible lipid classes and molecular species (including isomers), and novel tools for processing and interpretation of lipidomics data. Representative applications of advanced shotgun lipidomics for biological and biomedical research are also presented. We believe that with these novel advances, shotgun lipidomics should become more comprehensive and high-throughput, thereby greatly accelerating the lipidomics field in elucidating aberrant lipid metabolism, signaling, trafficking, and homeostasis under pathological conditions and the biochemical mechanisms that underpin them. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Choosing the best plant for the job: a cost-effective assay to prescreen ancient plant remains destined for shotgun sequencing.

    Directory of Open Access Journals (Sweden)

    Nathan Wales

    Full Text Available DNA extracted from ancient plant remains almost always contains a mixture of endogenous (that is, derived from the plant) and exogenous (derived from other sources) DNA. The exogenous 'contaminant' DNA, chiefly derived from microorganisms, presents significant problems for shotgun sequencing. In some samples, more than 90% of the recovered sequences are exogenous, providing limited data relevant to the sample. However, other samples have far less contamination and subsequently yield much more useful data via shotgun sequencing. Given the investment required for high-throughput sequencing, whenever multiple samples are available, it is most economical to sequence the least contaminated sample. We present an assay based on quantitative real-time PCR which estimates the relative amounts of fungal and bacterial DNA in a sample in comparison to the endogenous plant DNA. Given a collection of contextually similar ancient plant samples, this low-cost assay aids in selecting the best sample for shotgun sequencing.
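
    The ΔCt logic behind a qPCR prescreen like this can be sketched as follows. The assay details (target names, Ct values, summing fungal and bacterial signals) are assumptions for illustration, not taken from the paper; only the standard qPCR relationship is relied on: lower Ct means more template, and each cycle is a twofold difference.

```python
# Hedged sketch of qPCR-based contamination ranking: the relative quantity of
# contaminant versus plant DNA follows from the cycle-threshold difference,
# quantity(contaminant) / quantity(plant) = 2 ** (Ct_plant - Ct_contaminant).

def contamination_ratio(ct_plant, ct_fungal, ct_bacterial):
    fungal = 2 ** (ct_plant - ct_fungal)
    bacterial = 2 ** (ct_plant - ct_bacterial)
    return fungal + bacterial  # exogenous copies per endogenous copy

# Invented Ct values for two hypothetical ancient seed samples.
samples = {
    "seed_A": contamination_ratio(ct_plant=28.0, ct_fungal=24.0, ct_bacterial=25.0),
    "seed_B": contamination_ratio(ct_plant=26.0, ct_fungal=29.0, ct_bacterial=30.0),
}
best = min(samples, key=samples.get)  # least contaminated: sequence this one
```

    Here seed_B's microbial targets amplify later than its plant target, so it carries far less exogenous DNA per endogenous copy and would be the economical choice for shotgun sequencing.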

  3. Whole genome shotgun sequencing of Indian strains of Streptococcus agalactiae

    Directory of Open Access Journals (Sweden)

    Balaji Veeraraghavan

    2017-12-01

    Full Text Available Group B Streptococcus is known as a leading cause of neonatal infections in developing countries. The present study describes the whole genome shotgun sequences of four Group B Streptococcus (GBS) isolates. Molecular data on clonality are lacking for GBS in India. The present genome report adds important information to the scarce genome data of GBS and will support comparative genome studies of GBS isolates at the global level. This Whole Genome Shotgun project has been deposited at DDBJ/ENA/GenBank under the accession numbers NHPL00000000 – NHPO00000000.

  4. Analysis of pig serum proteins based on shotgun liquid ...

    African Journals Online (AJOL)

    Recent advances in proteomics technologies have opened up significant opportunities for future applications. We used shotgun liquid chromatography, coupled with tandem mass spectrometry (LC-MS/MS) to determine the proteome profile of healthy pig serum. Samples of venous blood were collected and subjected to ...

  5. Defining Diagnostic Biomarkers Using Shotgun Proteomics and MALDI-TOF Mass Spectrometry.

    Science.gov (United States)

    Armengaud, Jean

    2017-01-01

    Whole-cell MALDI-TOF has become a robust and widely used tool to quickly identify pathogens. In addition to being routinely used in hospitals, it is also useful for low-cost dereplication in large-scale screening of new environmental isolates for environmental biotechnology or taxonomic applications. Here, I describe how specific biomarkers can be defined using shotgun proteomics and whole-cell MALDI-TOF mass spectrometry. Based on MALDI-TOF spectra recorded on a given set of pathogens with internal calibrants, m/z values of interest are extracted. The proteins which contribute to these peaks are deduced from label-free shotgun proteomics measurements carried out on the same sample. Quantitative information based on the spectral count approach allows ranking of the most probable candidates. Proteogenomic approaches help to define whether these proteins give the same m/z values across the whole taxon under consideration or result in heterogeneous lists. These specific biomarkers nicely complement conventional profiling approaches and may help to better define groups of organisms, for example at the subspecies level.
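The peak-to-protein assignment step described here can be illustrated with a toy matching routine. This is a hedged sketch, not the author's pipeline: the protein names, masses, spectral counts, and mass tolerance below are all invented for the example.

```python
# Sketch: assign whole-cell MALDI-TOF peak m/z values to candidate proteins
# (identified by shotgun proteomics) within a mass tolerance, then rank the
# candidates for each peak by spectral count. All data are hypothetical.

def match_biomarkers(peaks, proteins, tol_da=2.0):
    """peaks: list of observed m/z values.
    proteins: list of (name, mass_da, spectral_count).
    Returns {peak_mz: [(name, count), ...] sorted by descending count}."""
    assignments = {}
    for mz in peaks:
        hits = [(name, count) for name, mass, count in proteins
                if abs(mass - mz) <= tol_da]
        assignments[mz] = sorted(hits, key=lambda h: -h[1])
    return assignments

peaks = [9632.0]
proteins = [("HU_alpha", 9633.1, 41), ("CspA", 9630.9, 7), ("YajQ", 18600.2, 12)]
print(match_biomarkers(peaks, proteins))
# {9632.0: [('HU_alpha', 41), ('CspA', 7)]}
```

Ranking by spectral count mirrors the abstract's point that the most abundant matching protein is the most probable contributor to a given peak.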

  6. Bioinformatics for whole-genome shotgun sequencing of microbial communities.

    Directory of Open Access Journals (Sweden)

    Kevin Chen

    2005-07-01

    Full Text Available The application of whole-genome shotgun sequencing to microbial communities represents a major development in metagenomics, the study of uncultured microbes via the tools of modern genomic analysis. In the past year, whole-genome shotgun sequencing projects of prokaryotic communities from an acid mine biofilm, the Sargasso Sea, Minnesota farm soil, three deep-sea whale falls, and deep-sea sediments have been reported, adding to previously published work on viral communities from marine and fecal samples. The interpretation of this new kind of data poses a wide variety of exciting and difficult bioinformatics problems. The aim of this review is to introduce the bioinformatics community to this emerging field by surveying existing techniques and promising new approaches for several of the most interesting of these computational problems.

  7. Microbial Community Profiling of Human Saliva Using Shotgun Metagenomic Sequencing

    OpenAIRE

    Hasan, Nur A.; Young, Brian A.; Minard-Smith, Angela T.; Saeed, Kelly; Li, Huai; Heizer, Esley M.; McMillan, Nancy J.; Isom, Richard; Abdullah, Abdul Shakur; Bornman, Daniel M.; Faith, Seth A.; Choi, Seon Young; Dickens, Michael L.; Cebula, Thomas A.; Colwell, Rita R.

    2014-01-01

    Human saliva is clinically informative of both oral and general health. Since next generation shotgun sequencing (NGS) is now widely used to identify and quantify bacteria, we investigated the bacterial flora of saliva microbiomes of two healthy volunteers and five datasets from the Human Microbiome Project, along with a control dataset containing short NGS reads from bacterial species representative of the bacterial flora of human saliva. GENIUS, a system designed to identify and quantify ba...

  8. WGSQuikr: fast whole-genome shotgun metagenomic classification.

    Directory of Open Access Journals (Sweden)

    David Koslicki

    Full Text Available With the decrease in cost and increase in output of whole-genome shotgun technologies, many metagenomic studies are utilizing this approach in lieu of the more traditional 16S rRNA amplicon technique. Due to the large number of relatively short reads output from whole-genome shotgun technologies, there is a need for fast and accurate short-read OTU classifiers. While there are relatively fast and accurate algorithms available, such as MetaPhlAn, MetaPhyler, PhyloPythiaS, and PhymmBL, these algorithms still classify samples in a read-by-read fashion and so execution times can range from hours to days on large datasets. We introduce WGSQuikr, a reconstruction method which can compute a vector of taxonomic assignments and their proportions in the sample with remarkable speed and accuracy. We demonstrate on simulated data that WGSQuikr is typically more accurate and up to an order of magnitude faster than the aforementioned classification algorithms. We also verify the utility of WGSQuikr on real biological data in the form of a mock community. WGSQuikr is a Whole-Genome Shotgun QUadratic, Iterative, K-mer based Reconstruction method which extends the previously introduced 16S rRNA-based algorithm Quikr. A MATLAB implementation of WGSQuikr is available at: http://sourceforge.net/projects/wgsquikr.
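The core reconstruction idea, expressing a sample's aggregate k-mer frequency vector as a mixture of reference-genome k-mer profiles, can be sketched as a small least-squares problem. This is a deliberate simplification of WGSQuikr's quadratic, iterative method, shown on toy data:

```python
# Minimal sketch of k-mer based mixture reconstruction (a simplification of
# the WGSQuikr idea): the sample's k-mer frequency vector b is modeled as
# A @ x, where the columns of A are reference k-mer profiles and x holds the
# taxon proportions to recover.
import numpy as np

def estimate_proportions(A: np.ndarray, b: np.ndarray) -> np.ndarray:
    """A: (num_kmers, num_taxa) column profiles; b: sample frequencies.
    Returns non-negative proportions summing to 1 (unconstrained least
    squares, then clipped and renormalized)."""
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    x = np.clip(x, 0.0, None)
    return x / x.sum()

# Two toy "genomes" described by 3 k-mer frequencies each.
A = np.array([[0.7, 0.1],
              [0.2, 0.3],
              [0.1, 0.6]])
b = A @ np.array([0.25, 0.75])      # sample is a 25/75 mixture
print(estimate_proportions(A, b))   # ~[0.25 0.75]
```

Because the whole sample is summarized by one frequency vector, the cost is independent of read count, which is the source of the speedup over read-by-read classifiers.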

  9. PAnalyzer: A software tool for protein inference in shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Prieto Gorka

    2012-11-01

    Full Text Available Abstract Background Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data-independent acquisition (DIA) approaches have emerged as an alternative to the traditional data-dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used for one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool to address these limitations for MSE data. Results In this paper we present PAnalyzer, a software tool focused on the protein inference process in shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for their MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are treated as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server. Conclusions We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and technical replicates
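The grouping-by-peptide-evidence problem that PAnalyzer addresses can be illustrated with a minimal sketch (not PAnalyzer's actual code or its full category scheme): proteins explained by exactly the same peptide set are indistinguishable and reported as one group, while a protein whose peptides are a proper subset of another's is subsumable.

```python
# Sketch of protein grouping from shared peptide evidence. Protein and
# peptide names are hypothetical.

def group_proteins(evidence):
    """evidence: {protein: set of identified peptides}.
    Returns (groups of indistinguishable proteins, subsumable proteins)."""
    by_peptides = {}
    for prot, peps in evidence.items():
        by_peptides.setdefault(frozenset(peps), []).append(prot)
    # Proteins sharing an identical peptide set form one group.
    groups = [sorted(g) for g in by_peptides.values()]
    # A protein is subsumable if its peptides are a proper subset of
    # another protein's peptide set.
    subsumable = {p for p, peps in evidence.items()
                  if any(peps < q for q in by_peptides)}
    return groups, subsumable

evidence = {"P1": {"a", "b"}, "P2": {"a", "b"}, "P3": {"a"}, "P4": {"c"}}
print(group_proteins(evidence))
# ([['P1', 'P2'], ['P3'], ['P4']], {'P3'})
```

Here P1 and P2 are indistinguishable, P3 contributes no independent evidence, and P4 is unambiguously identified.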

  10. Identification of meat products by shotgun spectral matching

    DEFF Research Database (Denmark)

    Ohana, D.; Dalebout, H.; Marissen, R. J.

    2016-01-01

    A new method, based on shotgun spectral matching of peptide tandem mass spectra, was successfully applied to the identification of different food species. The method was demonstrated to work on raw as well as processed samples from 16 mammalian and 10 bird species by counting spectral matches...... to spectral libraries in a reference database with one spectral library per species. A phylogenetic tree could also be constructed directly from the spectra. Nearly all samples could be correctly identified at the species level, and 100% at the genus level. The method does not use any genomic information...

  11. Comparative study of label and label-free techniques using shotgun proteomics for relative protein quantification.

    Science.gov (United States)

    Sjödin, Marcus O D; Wetterhall, Magnus; Kultima, Kim; Artemenko, Konstantin

    2013-06-01

    The analytical performance of three different strategies for relative protein quantification using shotgun proteomics, iTRAQ (isobaric tag for relative and absolute quantification), dimethyl labeling (DML) and label-free (LF), has been evaluated. The methods were explored using samples containing (i) bovine proteins in known ratios and (ii) bovine proteins in known ratios spiked into Escherichia coli. The latter case mimics the actual conditions in a typical biological sample, with a few differentially expressed proteins and a bulk of proteins with unchanged ratios. Additionally, the evaluation was performed on both QStar and LTQ-FTICR mass spectrometers. LF LTQ-FTICR was found to have the highest proteome coverage, while the highest accuracy based on the artificially regulated proteins was found for DML LTQ-FTICR (54%). A varying linearity (k: 0.55-1.16, r²: 0.61-0.96) was shown for all methods within selected dynamic ranges. All methods were found to consistently underestimate bovine protein ratios when matrix proteins were added. However, LF LTQ-FTICR was more tolerant toward the compression effect. A single peptide was demonstrated to be sufficient for reliable quantification using iTRAQ. A ranking system utilizing several parameters important for quantitative proteomics demonstrated that the overall performance of the five different methods was: DML LTQ-FTICR > iTRAQ QStar > LF LTQ-FTICR > DML QStar > LF QStar. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. PatternLab for proteomics: a tool for differential shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Yates John R

    2008-07-01

    Full Text Available Abstract Background A goal of proteomics is to distinguish between states of a biological system by identifying protein expression differences. Liu et al. demonstrated a method to perform semi-relative protein quantitation in shotgun proteomics data by correlating the number of tandem mass spectra obtained for each protein, or "spectral count", with its abundance in a mixture; however, two issues have remained open: how to normalize spectral counting data and how to efficiently pinpoint differences between profiles. Moreover, Chen et al. recently showed how to increase the number of identified proteins in shotgun proteomics by analyzing samples with different MS-compatible detergents while performing proteolytic digestion. The latter introduced new challenges from the data analysis perspective, since replicate readings are not acquired. Results To address the open issues above, we present a program termed PatternLab for proteomics. This program implements existing strategies and adds two new methods to pinpoint differences in protein profiles. The first method, ACFold, addresses experiments with fewer than three replicates from each state or having assays acquired by different protocols as described by Chen et al. ACFold uses a combined criterion based on expression fold changes, the AC test, and the false-discovery rate, and can supply a "bird's-eye view" of differentially expressed proteins. The other method addresses experimental designs having multiple readings from each state and is referred to as nSVM (natural support vector machine) because of its roots in evolutionary computing and in statistical learning theory. Our observations suggest that nSVM's niche comprises projects that select a minimum set of proteins for classification purposes; for example, the development of an early detection kit for a given pathology. We demonstrate the effectiveness of each method on experimental data and confront them with existing strategies
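One widely used answer to the normalization question raised above is the normalized spectral abundance factor (NSAF), which divides each protein's spectral count by its length before normalizing across the sample. The sketch below illustrates the calculation with toy numbers; it is not PatternLab's implementation.

```python
# Sketch of NSAF normalization for spectral counting data:
# NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j)
# where SpC is the spectral count and L the protein length in residues.

def nsaf(counts, lengths):
    """counts: {protein: spectral count}; lengths: {protein: length in aa}.
    Returns length-normalized abundance factors summing to 1."""
    saf = {p: counts[p] / lengths[p] for p in counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Toy example: protein A gets twice the raw counts of B, but is four times
# longer, so per-length it is actually half as abundant.
counts = {"A": 20, "B": 10}
lengths = {"A": 400, "B": 100}
print(nsaf(counts, lengths))  # A -> 1/3, B -> 2/3
```

The length correction matters because longer proteins yield more tryptic peptides and therefore inflate raw spectral counts.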

  13. Using Growing Self-Organising Maps to Improve the Binning Process in Environmental Whole-Genome Shotgun Sequencing

    Science.gov (United States)

    Chan, Chon-Kit Kenneth; Hsu, Arthur L.; Tang, Sen-Lin; Halgamuge, Saman K.

    2008-01-01

    Metagenomic projects using whole-genome shotgun (WGS) sequencing produce many unassembled DNA sequences and small contigs. The step of clustering these sequences, based on biological and molecular features, is called binning. A reported strategy for binning that combines oligonucleotide frequency and self-organising maps (SOM) shows high potential. We improve this strategy by identifying suitable training features, implementing a better clustering algorithm, and defining quantitative measures for assessing results. We investigated the suitability of each of di-, tri-, tetra-, and pentanucleotide frequencies. The results show that dinucleotide frequency is not a sufficiently strong signature for binning 10 kb long DNA sequences, compared to the other three. Furthermore, we observed that an increased order of oligonucleotide frequency may deteriorate the assignment result in some cases, which indicates the possible existence of an optimal species-specific oligonucleotide frequency. We replaced SOM with the growing self-organising map (GSOM), obtaining comparable results while gaining a 7%–15% speed improvement. PMID:18288261
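The feature-extraction step behind such binning can be computed straightforwardly; below is a minimal sketch for tetranucleotide frequency vectors, omitting refinements such as reverse-complement (canonical k-mer) merging that binning tools often apply.

```python
# Sketch: compute a 256-dimensional tetranucleotide frequency vector for a
# DNA fragment, the kind of signature fed to a SOM/GSOM for binning.
from itertools import product

def tetranucleotide_freqs(seq: str) -> dict:
    kmers = ["".join(p) for p in product("ACGT", repeat=4)]  # 256 features
    counts = dict.fromkeys(kmers, 0)
    seq = seq.upper()
    n = 0
    for i in range(len(seq) - 3):
        kmer = seq[i:i + 4]
        if kmer in counts:  # skip windows containing ambiguous bases (N etc.)
            counts[kmer] += 1
            n += 1
    return {k: c / n for k, c in counts.items()} if n else counts

freqs = tetranucleotide_freqs("ACGTACGTACGT")
print(freqs["ACGT"])  # 3 of 9 sliding windows -> 0.333...
```

Each fragment becomes a fixed-length vector regardless of its sequence length, which is what makes these signatures usable as SOM training inputs.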

  14. Microbial community profiling of human saliva using shotgun metagenomic sequencing.

    Directory of Open Access Journals (Sweden)

    Nur A Hasan

    Full Text Available Human saliva is clinically informative of both oral and general health. Since next-generation shotgun sequencing (NGS) is now widely used to identify and quantify bacteria, we investigated the bacterial flora of saliva microbiomes of two healthy volunteers and five datasets from the Human Microbiome Project, along with a control dataset containing short NGS reads from bacterial species representative of the bacterial flora of human saliva. GENIUS, a system designed to identify and quantify bacterial species using unassembled short NGS reads, was used to identify the bacterial species comprising the microbiomes of the saliva samples and datasets. Results, achieved within minutes and at greater than 90% accuracy, showed that more than 175 bacterial species comprised the bacterial flora of human saliva, including bacteria known to be commensal human flora but also Haemophilus influenzae, Neisseria meningitidis, Streptococcus pneumoniae, and Gammaproteobacteria. Basic Local Alignment Search Tool (BLASTn) analysis, run in parallel, reported ca. five times more species than those actually comprising the in silico sample. Both GENIUS and BLAST analyses of saliva samples identified the major genera comprising the bacterial flora of saliva, but GENIUS provided a more precise description of species composition, identifying to strain level in most cases, and delivered results at least 10,000 times faster. Therefore, GENIUS offers a facile and accurate system for identification and quantification of bacterial species and/or strains in metagenomic samples.

  15. Genomic V exons from whole genome shotgun data in reptiles.

    Science.gov (United States)

    Olivieri, D N; von Haeften, B; Sánchez-Espinel, C; Faro, J; Gambón-Deza, F

    2014-08-01

    Reptiles and mammals diverged over 300 million years ago, creating two parallel evolutionary lineages amongst terrestrial vertebrates. In reptiles, two main evolutionary lines emerged: one gave rise to Squamata, while the other gave rise to Testudines, Crocodylia, and Aves. In this study, we determined the genomic variable (V) exons from whole genome shotgun sequencing (WGS) data in reptiles corresponding to the three main immunoglobulin (IG) loci and the four main T cell receptor (TR) loci. We show that Squamata lack the TRG and TRD genes, and snakes lack the IGKV genes. In representative species of Testudines and Crocodylia, the seven major IG and TR loci are maintained. As in mammals, genes of the IG loci can be grouped into well-defined IMGT clans through a multi-species phylogenetic analysis. We show that the reptilian IGHV and IGLV genes are distributed amongst the established mammalian clans, while their IGKV genes are found within a single clan, nearly exclusive from the mammalian sequences. The reptilian and mammalian TRAV genes cluster into six common evolutionary clades (since IMGT clans have not been defined for TR). In contrast, the reptilian TRBV genes cluster into three clades, which have few mammalian members. In this locus, the V exon sequences from mammals appear to have undergone different evolutionary diversification processes that occurred outside these shared reptilian clans. These sequences can be obtained in a freely available public repository (http://vgenerepertoire.org).

  16. Shotgun metaproteomics of the human distal gut microbiota

    Energy Technology Data Exchange (ETDEWEB)

    VerBerkmoes, N.C.; Russell, A.L.; Shah, M.; Godzik, A.; Rosenquist, M.; Halfvarsson, J.; Lefsrud, M.G.; Apajalahti, J.; Tysk, C.; Hettich, R.L.; Jansson, Janet K.

    2008-10-15

    The human gut contains a dense, complex and diverse microbial community, comprising the gut microbiome. Metagenomics has recently revealed the composition of genes in the gut microbiome, but provides no direct information about which genes are expressed or functioning. Therefore, our goal was to develop a novel approach to directly identify microbial proteins in fecal samples to gain information about the genes expressed and about key microbial functions in the human gut. We used a non-targeted, shotgun mass spectrometry-based whole community proteomics, or metaproteomics, approach for the first deep proteome measurements of thousands of proteins in human fecal samples, thus demonstrating this approach on the most complex sample type to date. The resulting metaproteomes had a skewed distribution relative to the metagenome, with more proteins for translation, energy production and carbohydrate metabolism when compared to what was earlier predicted from metagenomics. Human proteins, including antimicrobial peptides, were also identified, providing a non-targeted glimpse of the host response to the microbiota. Several unknown proteins represented previously undescribed microbial pathways or host immune responses, revealing a novel complex interplay between the human host and its associated microbes.

  17. Comparative shotgun proteomic analysis of wild and domesticated Opuntia spp. species shows a metabolic adaptation through domestication.

    Science.gov (United States)

    Pichereaux, Carole; Hernández-Domínguez, Eric E; Santos-Diaz, Maria Del Socorro; Reyes-Agüero, Antonio; Astello-García, Marizel; Guéraud, Françoise; Negre-Salvayre, Anne; Schiltz, Odile; Rossignol, Michel; Barba de la Rosa, Ana Paulina

    2016-06-30

    The Opuntia genus is widely distributed in America, but the highest richness of wild species is found in Mexico, which is also home to Opuntia ficus-indica, the most domesticated species and an important crop in the agricultural economies of arid and semiarid areas worldwide. During the domestication process, Opuntia morphological characteristics were favored, such as fewer and smaller spines in cladodes and fewer seeds in fruits, but changes at the molecular level remain almost unknown. To gain more insight into the molecular changes of Opuntia through domestication, a shotgun proteomic analysis with homology-based database searches was carried out. More than 1000 protein species were identified, and using a label-free quantitation method, the Opuntia proteomes were compared in order to identify differentially accumulated proteins among wild and domesticated species. Most of the changes were observed in glucose, secondary, and 1C metabolism, which correlates with the observed protein, fiber and phenolic compound accumulation in Opuntia cladodes. Regulatory proteins, ribosomal proteins, and proteins related to the stress response also showed differential accumulation. These results provide valuable new data that will help in understanding the molecular changes of Opuntia species through domestication. Opuntia species are well adapted to dry and warm conditions in arid and semiarid regions worldwide, and they are highly productive plants showing considerable promise as an alternative food source. However, there is a gap regarding the Opuntia molecular mechanisms that enable them to grow in extreme environmental conditions and how the domestication process has changed them. In the present study, a shotgun analysis was carried out to characterize the proteomes of five Opuntia species selected by their degree of domestication. Our results will help toward a better understanding of proteomic features underlying the selection and specialization under

  18. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is widely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem in which sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant gain in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance of retrieving more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches.
Conclusion Our approach not only enhances the computational performance, and
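The basic target-decoy error estimate that MUDE and MUMAL build on can be sketched in a few lines. The decoy-to-target ratio below is one common FDR estimator, not the exact formula used by either tool, and the scores are toy values.

```python
# Sketch of the standard target-decoy FDR estimate: at a given score
# threshold, FDR ~ (decoy PSMs passing) / (target PSMs passing), since
# decoy hits model the rate of random (incorrect) matches.

def fdr_at_threshold(psms, threshold):
    """psms: list of (score, is_decoy). Returns (estimated FDR,
    number of accepted target PSMs) at the given score cutoff."""
    targets = sum(1 for s, d in psms if s >= threshold and not d)
    decoys = sum(1 for s, d in psms if s >= threshold and d)
    return (decoys / targets if targets else 0.0), targets

psms = [(3.1, False), (2.9, False), (2.5, True), (2.2, False), (1.8, True)]
print(fdr_at_threshold(psms, 2.0))  # (0.333..., 3)
```

Methods like MUDE and MUMAL then search for decision boundaries (linear and nonlinear, respectively) that maximize the number of accepted targets while holding this estimate at a fixed level.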

  19. Shotgun proteomics of plant plasma membrane and microdomain proteins using nano-LC-MS/MS.

    Science.gov (United States)

    Takahashi, Daisuke; Li, Bin; Nakayama, Takato; Kawamura, Yukio; Uemura, Matsuo

    2014-01-01

    Shotgun proteomics allows the comprehensive analysis of proteins extracted from plant cells, subcellular organelles, and membranes. Previously, two-dimensional gel electrophoresis-based proteomics was used for mass spectrometric analysis of plasma membrane proteins. In order to obtain comprehensive proteome profiles of the plasma membrane, including highly hydrophobic proteins with a number of transmembrane domains, a mass spectrometry-based shotgun proteomics method using nano-LC-MS/MS for proteins from the plasma membrane and plasma membrane microdomain fractions is described. The results obtained are easily applicable to label-free protein semiquantification.

  20. Elucidation of taste- and odor-producing bacteria and toxigenic cyanobacteria in a Midwestern drinking water supply reservoir by shotgun metagenomics analysis

    Science.gov (United States)

    Otten, Timothy; Graham, Jennifer L.; Harris, Theodore D.; Dreher, Theo

    2016-01-01

    While commonplace in clinical settings, DNA-based assays for identification or enumeration of drinking water pathogens and other biological contaminants remain widely unadopted by the monitoring community. In this study, shotgun metagenomics was used to identify taste-and-odor producers and toxin-producing cyanobacteria over a 2-year period in a drinking water reservoir. The sequencing data implicated several cyanobacteria, including Anabaena spp., Microcystis spp., and an unresolved member of the order Oscillatoriales as the likely principal producers of geosmin, microcystin, and 2-methylisoborneol (MIB), respectively. To further demonstrate this, quantitative PCR (qPCR) assays targeting geosmin-producing Anabaena and microcystin-producing Microcystis were utilized, and these data were fitted using generalized linear models and compared with routine monitoring data, including microscopic cell counts, sonde-based physicochemical analyses, and assays of all inorganic and organic nitrogen and phosphorus forms and fractions. The qPCR assays explained the greatest variation in observed geosmin (adjusted R² = 0.71) and microcystin (adjusted R² = 0.84) concentrations over the study period, highlighting their potential for routine monitoring applications. The origin of the monoterpene cyclase required for MIB biosynthesis was putatively linked to a periphytic cyanobacterial mat attached to the concrete drinking water inflow structure. We conclude that shotgun metagenomics can be used to identify microbial agents involved in water quality deterioration and to guide PCR assay selection or design for routine monitoring purposes. Finally, we offer estimates of microbial diversity and metagenomic coverage of our data sets for reference to others wishing to apply shotgun metagenomics to other lacustrine systems.
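The model-fitting step can be illustrated with ordinary least squares on log-transformed values standing in for the study's generalized linear models. All data below are synthetic, invented only to show how an adjusted R² of this kind is computed.

```python
# Sketch (synthetic data): regress a taste-and-odor compound on qPCR
# gene-copy counts and report adjusted R^2. A log-log OLS fit stands in
# for the actual GLM specification used in the study.
import numpy as np

def adjusted_r2(x, y):
    """Adjusted R^2 of a log10-log10 simple linear regression of y on x."""
    x, y = np.log10(x), np.log10(y)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    n, p = len(y), 1  # n observations, p predictors
    r2 = 1.0 - ss_res / ss_tot
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])        # geosmin-synthase copies/mL
geosmin = np.array([2.0, 5.5, 18.0, 60.0, 150.0])   # ng/L, synthetic
print(adjusted_r2(copies, geosmin))
```

A high adjusted R², as reported for the geosmin and microcystin assays, indicates the copy-number assay alone explains most of the observed variation in the compound's concentration.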

  1. Enhanced detection method for corneal protein identification using shotgun proteomics

    Directory of Open Access Journals (Sweden)

    Schlager John J

    2009-06-01

    Full Text Available Abstract Background The cornea is a specialized transparent connective tissue responsible for the majority of light refraction and image focus for the retina. There are three main layers of the cornea: the epithelium, which is exposed and acts as a protective barrier for the eye; the central stroma, consisting of parallel collagen fibrils that refract light; and the endothelium, which is responsible for hydration of the cornea from the aqueous humor. Normal cornea is an immunologically privileged tissue devoid of blood vessels, but injury can produce a loss of these conditions, allowing invasion of other processes that degrade its homeostatic properties and decrease the amount of light refracted onto the retina. Determining a measure and drift of the phenotypic cornea state from normal to an injured or diseased state requires knowledge of the existing protein signature within the tissue. In the study of corneal proteins, proteomics procedures have typically involved pulverization of the entire cornea prior to analysis. Separating the epithelium and endothelium from the core stroma and performing separate shotgun proteomics using liquid chromatography/mass spectrometry results in the identification of many more proteins than previously employed methods using the complete pulverized cornea. Results Rabbit corneas were purchased, the epithelium and endothelium regions were removed, and the proteins were processed and separately analyzed using liquid chromatography/mass spectrometry. Proteins identified from the separate layers were compared against results from complete corneal samples. Protein digests were separated using a six-hour liquid chromatographic gradient, and ion-trap mass spectrometry was used for detection of eluted peptide fractions.
The SEQUEST database search results were filtered to allow only proteins with match probabilities of 10⁻³ or better and peptides with a probability of 10⁻² or less, with at least two unique peptides isolated within

  2. Elucidation of Taste- and Odor-Producing Bacteria and Toxigenic Cyanobacteria in a Midwestern Drinking Water Supply Reservoir by Shotgun Metagenomic Analysis.

    Science.gov (United States)

    Otten, Timothy G; Graham, Jennifer L; Harris, Theodore D; Dreher, Theo W

    2016-09-01

    While commonplace in clinical settings, DNA-based assays for identification or enumeration of drinking water pathogens and other biological contaminants remain widely unadopted by the monitoring community. In this study, shotgun metagenomics was used to identify taste-and-odor producers and toxin-producing cyanobacteria over a 2-year period in a drinking water reservoir. The sequencing data implicated several cyanobacteria, including Anabaena spp., Microcystis spp., and an unresolved member of the order Oscillatoriales as the likely principal producers of geosmin, microcystin, and 2-methylisoborneol (MIB), respectively. To further demonstrate this, quantitative PCR (qPCR) assays targeting geosmin-producing Anabaena and microcystin-producing Microcystis were utilized, and these data were fitted using generalized linear models and compared with routine monitoring data, including microscopic cell counts, sonde-based physicochemical analyses, and assays of all inorganic and organic nitrogen and phosphorus forms and fractions. The qPCR assays explained the greatest variation in observed geosmin (adjusted R² = 0.71) and microcystin (adjusted R² = 0.84) concentrations over the study period, highlighting their potential for routine monitoring applications. The origin of the monoterpene cyclase required for MIB biosynthesis was putatively linked to a periphytic cyanobacterial mat attached to the concrete drinking water inflow structure. We conclude that shotgun metagenomics can be used to identify microbial agents involved in water quality deterioration and to guide PCR assay selection or design for routine monitoring purposes. Finally, we offer estimates of microbial diversity and metagenomic coverage of our data sets for reference to others wishing to apply shotgun metagenomics to other lacustrine systems. Cyanobacterial toxins and microbial taste-and-odor compounds are a growing concern for drinking water utilities reliant upon surface water resources. Specific

  3. Protein identification and quantification from riverbank grape, Vitis riparia: Comparing SDS-PAGE and FASP-GPF techniques for shotgun proteomic analysis.

    Science.gov (United States)

    George, Iniga S; Fennell, Anne Y; Haynes, Paul A

    2015-09-01

    Protein sample preparation optimisation is critical for establishing reproducible high throughput proteomic analysis. In this study, two different fractionation sample preparation techniques (in-gel digestion and in-solution digestion) for shotgun proteomics were used to quantitatively compare proteins identified in Vitis riparia leaf samples. The total number of proteins and peptides identified were compared between filter aided sample preparation (FASP) coupled with gas phase fractionation (GPF) and SDS-PAGE methods. There was a 24% increase in the total number of reproducibly identified proteins when FASP-GPF was used. FASP-GPF is more reproducible, less expensive and a better method than SDS-PAGE for shotgun proteomics of grapevine samples as it significantly increases protein identification across biological replicates. Total peptide and protein information from the two fractionation techniques is available in PRIDE with the identifier PXD001399 (http://proteomecentral.proteomexchange.org/dataset/PXD001399). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Revisiting Notechis scutatus venom: on shotgun proteomics and neutralization by the "bivalent" Sea Snake Antivenom.

    Science.gov (United States)

    Tan, Choo Hock; Tan, Kae Yi; Tan, Nget Hong

    2016-07-20

    Recent advances in proteomics enable deep profiling of the compositional details of snake venoms, improving our understanding of envenomation pathophysiology and immunological neutralization. In this study, the venom of the Australian tiger snake (Notechis scutatus) was trypsin-digested in solution and subjected to nano-ESI-LC-MS/MS. Applying a relative quantitative proteomic approach, the findings revealed a proteome comprising 42 toxin subtypes clustered into 12 protein families. Phospholipases A2 constitute the most abundant toxins (74.5% of total venom proteins), followed by Kunitz serine protease inhibitors (6.9%), snake venom serine proteases (5.9%), alpha-neurotoxins (5.6%) and several toxins of lower abundance. The proteome correlates with N. scutatus envenoming effects, including pre-synaptic and post-synaptic neurotoxicity and consumptive coagulopathy. The venom is highly lethal in mice (intravenous median lethal dose = 0.09 μg/g). BioCSL Sea Snake Antivenom, raised against the venoms of the beaked sea snake (Hydrophis schistosus) and N. scutatus (added for enhanced immunogenicity), neutralized the lethal effect of N. scutatus venom (potency = 2.95 mg/ml) much more effectively than the targeted H. schistosus venom (potency = 0.48 mg/ml). The combined venom immunogen may have improved neutralization of phospholipases A2, which are abundant in both venoms, but not of short neurotoxins, which predominate only in H. schistosus venom. The shotgun proteomic approach adopted in this study revealed the compositional details of the venom of the common tiger snake from Australia, Notechis scutatus. The proteomic findings provided additional information on the relative abundances of toxins and detected proteins of minor expression not reported previously.
The potent lethal effect of the venom was neutralized by bioCSL Sea Snake Antivenom, an anticipated finding given that the Sea Snake Antivenom is in fact bivalent in nature, being raised against a mix of venoms of the

  5. The genome of flax (Linum usitatissimum) assembled de novo from short shotgun sequence reads

    DEFF Research Database (Denmark)

    Wang, Zhiwen; Hobson, Neil; Galindo, Leonardo

    2012-01-01

    Flax (Linum usitatissimum) is an ancient crop that is widely cultivated as a source of fiber, oil and medicinally relevant compounds. To accelerate crop improvement, we performed whole-genome shotgun sequencing of the nuclear genome of flax. Seven paired-end libraries ranging in size from 300 bp...... these results show that de novo assembly, based solely on whole-genome shotgun short-sequence reads, is an efficient means of obtaining nearly complete genome sequence information for some plant species....

  6. A high-throughput shotgun mutagenesis approach to mapping B-cell antibody epitopes.

    Science.gov (United States)

    Davidson, Edgar; Doranz, Benjamin J

    2014-09-01

    Characterizing the binding sites of monoclonal antibodies (mAbs) on protein targets, their 'epitopes', can aid in the discovery and development of new therapeutics, diagnostics and vaccines. However, the speed of epitope mapping techniques has not kept pace with the increasingly large numbers of mAbs being isolated. Obtaining detailed epitope maps for functionally relevant antibodies can be challenging, particularly for conformational epitopes on structurally complex proteins. To enable rapid epitope mapping, we developed a high-throughput strategy, shotgun mutagenesis, that enables the identification of both linear and conformational epitopes in a fraction of the time required by conventional approaches. Shotgun mutagenesis epitope mapping is based on large-scale mutagenesis and rapid cellular testing of natively folded proteins. Hundreds of mutant plasmids are individually cloned, arrayed in 384-well microplates, expressed within human cells, and tested for mAb reactivity. Residues are identified as a component of a mAb epitope if their mutation (e.g. to alanine) does not support candidate mAb binding but does support that of other conformational mAbs or allows full protein function. Shotgun mutagenesis is particularly suited for studying structurally complex proteins because targets are expressed in their native form directly within human cells. Shotgun mutagenesis has been used to delineate hundreds of epitopes on a variety of proteins, including G protein-coupled receptor and viral envelope proteins. The epitopes mapped on dengue virus prM/E represent one of the largest collections of epitope information for any viral protein, and results are being used to design better vaccines and drugs. © 2014 John Wiley & Sons Ltd.

  7. The genome of flax (Linum usitatissimum) assembled de novo from short shotgun sequence reads.

    Science.gov (United States)

    Wang, Zhiwen; Hobson, Neil; Galindo, Leonardo; Zhu, Shilin; Shi, Daihu; McDill, Joshua; Yang, Linfeng; Hawkins, Simon; Neutelings, Godfrey; Datla, Raju; Lambert, Georgina; Galbraith, David W; Grassa, Christopher J; Geraldes, Armando; Cronk, Quentin C; Cullis, Christopher; Dash, Prasanta K; Kumar, Polumetla A; Cloutier, Sylvie; Sharpe, Andrew G; Wong, Gane K-S; Wang, Jun; Deyholos, Michael K

    2012-11-01

    Flax (Linum usitatissimum) is an ancient crop that is widely cultivated as a source of fiber, oil and medicinally relevant compounds. To accelerate crop improvement, we performed whole-genome shotgun sequencing of the nuclear genome of flax. Seven paired-end libraries ranging in size from 300 bp to 10 kb were sequenced using an Illumina genome analyzer. A de novo assembly, comprised exclusively of deep-coverage (approximately 94× raw, approximately 69× filtered) short-sequence reads (44-100 bp), produced a set of scaffolds with N(50) = 694 kb, including contigs with N(50) = 20.1 kb. The contig assembly contained 302 Mb of non-redundant sequence representing an estimated 81% genome coverage. Up to 96% of published flax ESTs aligned to the whole-genome shotgun scaffolds. However, comparisons with independently sequenced BACs and fosmids showed some mis-assembly of regions at the genome scale. A total of 43,384 protein-coding genes were predicted in the whole-genome shotgun assembly, and up to 93% of published flax ESTs, and 86% of A. thaliana genes, aligned to these predicted genes, indicating excellent coverage and accuracy at the gene level. Analysis of the synonymous substitution rates (K(s)) observed within duplicate gene pairs was consistent with a recent (5-9 MYA) whole-genome duplication in flax. Within the predicted proteome, we observed enrichment of many conserved domains (Pfam-A) that may contribute to the unique properties of this crop, including agglutinin proteins. Together these results show that de novo assembly, based solely on whole-genome shotgun short-sequence reads, is an efficient means of obtaining nearly complete genome sequence information for some plant species. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.
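    The N(50) statistics cited for the scaffold and contig assemblies are defined as the length L such that sequences of length ≥ L together contain at least half of the assembled bases. A minimal illustration (not the assembler's actual code):

```python
def n50(lengths):
    """Return the N50 of an assembly: the largest length L such that
    contigs/scaffolds of length >= L together cover at least half of
    the total assembled bases."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if 2 * running >= total:
            return length
    return 0

# Toy assembly of five contigs; half of the 300 total bases is reached
# while accumulating the 80 bp contig.
print(n50([100, 80, 60, 40, 20]))
```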

  8. A Novel Prosthetic Joint Infection Pathogen, Mycoplasma salivarium, Identified by Metagenomic Shotgun Sequencing.

    Science.gov (United States)

    Thoendel, Matthew; Jeraldo, Patricio; Greenwood-Quaintance, Kerryl E; Chia, Nicholas; Abdel, Matthew P; Steckelberg, James M; Osmon, Douglas R; Patel, Robin

    2017-07-15

    Defining the microbial etiology of culture-negative prosthetic joint infection (PJI) can be challenging. Metagenomic shotgun sequencing is a new tool to identify organisms undetected by conventional methods. We present a case where metagenomics was used to identify Mycoplasma salivarium as a novel PJI pathogen in a patient with hypogammaglobulinemia. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  9. A combined meta-barcoding and shotgun metagenomic analysis of spontaneous wine fermentation.

    Science.gov (United States)

    Sternes, Peter R; Lee, Danna; Kutyna, Dariusz R; Borneman, Anthony R

    2017-07-01

    Wine is a complex beverage, comprising hundreds of metabolites produced through the action of yeasts and bacteria in fermenting grape must. Commercially, there is now a growing trend away from using wine yeast (Saccharomyces) starter cultures, toward the historic practice of uninoculated or "wild" fermentation, where the yeasts and bacteria associated with the grapes and/or winery perform the fermentation. It is the varied metabolic contributions of these numerous non-Saccharomyces species that are thought to impart complexity and desirable taste and aroma attributes to wild ferments in comparison to their inoculated counterparts. To map the microflora of spontaneous fermentation, metagenomic techniques were employed to characterize and monitor the progression of fungal species in 5 different wild fermentations. Both amplicon-based ribosomal DNA internal transcribed spacer (ITS) phylotyping and shotgun metagenomics were used to assess community structure across different stages of fermentation. While providing a sensitive and highly accurate means of characterizing the wine microbiome, the shotgun metagenomic data also uncovered a significant overabundance bias in the ITS phylotyping abundance estimations for the common non-Saccharomyces wine yeast genus Metschnikowia. By identifying biases such as that observed for Metschnikowia, abundance measurements from future ITS phylotyping datasets can be corrected to provide more accurate species representation. Ultimately, as more shotgun metagenomic and single-strain de novo assemblies for key wine species become available, the accuracy of both ITS-amplicon and shotgun studies will greatly increase, providing a powerful methodology for deciphering the influence of the microbial community on the wine flavor and aroma. © The Authors 2017. Published by Oxford University Press.
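    The correction the authors suggest for ITS phylotyping abundances can be sketched as a per-genus rescaling by an empirically measured bias factor, followed by renormalization. The function below is an illustrative sketch only; the bias value for Metschnikowia is hypothetical, not taken from the paper:

```python
def correct_abundances(amplicon_fracs, bias_factors):
    """Rescale per-genus amplicon (ITS) abundance estimates by empirical
    bias factors (bias = amplicon estimate / shotgun estimate for taxa
    where both measurements exist), then renormalize to fractions."""
    corrected = {g: a / bias_factors.get(g, 1.0) for g, a in amplicon_fracs.items()}
    total = sum(corrected.values())
    return {g: v / total for g, v in corrected.items()}

# Hypothetical numbers: suppose ITS overestimates Metschnikowia threefold.
its = {"Metschnikowia": 0.60, "Saccharomyces": 0.40}
print(correct_abundances(its, {"Metschnikowia": 3.0}))
```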

  10. Midpregnancy Marriage and Divorce: Why the Death of Shotgun Marriage Has Been Greatly Exaggerated.

    Science.gov (United States)

    Gibson-Davis, Christina M; Ananat, Elizabeth O; Gassman-Pines, Anna

    2016-12-01

    Conventional wisdom holds that births following the colloquially termed "shotgun marriage"-that is, births to parents who married between conception and the birth-are nearing obsolescence. To investigate trends in shotgun marriage, we matched North Carolina administrative data on nearly 800,000 first births among white and black mothers to marriage and divorce records. We found that among married births, midpregnancy-married births (our preferred term for shotgun-married births) have been relatively stable at about 10 % over the past quarter-century while increasing substantially for vulnerable population subgroups. In 2012, among black and white less-educated and younger women, midpregnancy-married births accounted for approximately 20 % to 25 % of married first births. The increasing representation of midpregnancy-married births among married births raises concerns about well-being among at-risk families because midpregnancy marriages may be quite fragile. Our analysis revealed, however, that midpregnancy marriages were more likely to dissolve only among more advantaged groups. Of those groups considered to be most at risk of divorce-namely, black women with lower levels of education and who were younger-midpregnancy marriages had the same or lower likelihood of divorce as preconception marriages. Our results suggest an overlooked resiliency in a type of marriage that has only increased in salience.

  11. RePS: a sequence assembler that masks exact repeats identified from the shotgun data

    DEFF Research Database (Denmark)

    Wang, Jun; Wong, Gane Ka-Shu; Ni, Peixiang

    2002-01-01

    We describe a sequence assembler, RePS (repeat-masked Phrap with scaffolding), that explicitly identifies exact 20mer repeats from the shotgun data and removes them prior to the assembly. The established software is used to compute meaningful error probabilities for each base. Clone-end-pairing information is used to construct scaffolds that order and orient the contigs. We show with real data for human and rice that reasonable assemblies are possible even at coverages of only 4x to 6x, despite having up to 42.2% in exact repeats. Publication date: May 2002.
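    RePS's core step of flagging exact 20-mers that recur across the read set can be sketched as a k-mer tally. This toy version ignores reverse complements, quality values, and the coverage-dependent thresholds a real assembler would need:

```python
from collections import Counter

def exact_kmer_repeats(reads, k=20, min_copies=2):
    """Tally exact k-mers across all reads and return those occurring at
    least min_copies times -- candidates to mask before assembly."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return {kmer for kmer, c in counts.items() if c >= min_copies}

# Toy alphabet and k=5 for readability; RePS uses exact 20-mers on DNA.
print(exact_kmer_repeats(["ABCDEFG", "XABCDEY"], k=5))
```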

  12. Tandem Mass Spectrum Sequencing: An Alternative to Database Search Engines in Shotgun Proteomics.

    Science.gov (United States)

    Muth, Thilo; Rapp, Erdmann; Berven, Frode S; Barsnes, Harald; Vaudel, Marc

    2016-01-01

    Protein identification via database searches has become the gold standard in mass spectrometry-based shotgun proteomics. However, as the quality of tandem mass spectra improves, direct mass spectrum sequencing gains interest as a database-independent alternative. In this chapter, the general principle of this so-called de novo sequencing is introduced, along with the pitfalls and challenges of the technique. The main tools available are presented, with a focus on user-friendly open-source software that can be directly applied in everyday proteomic workflows.

  13. HOMICIDE BY CERVICAL SPINAL CORD GUNSHOT INJURY WITH SHOTGUN FIRE PELLETS: CASE REPORT

    Directory of Open Access Journals (Sweden)

    Dana Turliuc, Serban Turliuc, Iustin Mihailov, Andrei Cucu, Gabriel Dumitrescu, Claudia Costea

    2015-10-01

    Full Text Available This report presents a rare forensic case of a cervical spinal cord gunshot injury inflicted on a woman by her husband, a professional hunter, with shotgun pellets during a family fight. The gunshot completely destroyed the cervical spinal cord, without injury to the neck vessels and organs, and the patient survived for seven days. We discuss notions of judicial ballistics, the assessment of patients with spinal cord gunshot injuries, and therapeutic strategies. Although cervical spine gunshot injuries are lethal for the majority of patients, surviving patients require the coordination of a multidisciplinary surgical team to ensure the optimal functional prognosis.

  14. Population genetic analysis of shotgun assemblies of genomic sequences from multiple individuals

    DEFF Research Database (Denmark)

    Hellmann, Ines; Mang, Yuan; Gu, Zhiping

    2008-01-01

    We introduce a simple, broadly applicable method for obtaining estimates of nucleotide diversity from genomic shotgun sequencing data. The method takes into account the special nature of these data: random sampling of genomic segments from one or more individuals and a relatively high error rate for individual reads. Applying this method to data from the Celera human genome sequencing and SNP discovery project, we obtain estimates of nucleotide diversity in windows spanning the human genome and show that the diversity-to-divergence ratio is reduced in regions of low recombination. Furthermore, we show...
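    The per-site building block of such an estimate is the fraction of mismatching pairs among the bases of reads overlapping a site; averaging this over many sites yields an estimate of nucleotide diversity (π). The sketch below omits the read-error correction that is central to the published method:

```python
from itertools import combinations

def site_diversity(column_bases):
    """Fraction of mismatching pairs among the bases of reads covering one
    site; averaging over many sites estimates nucleotide diversity (pi).
    Simplified: no correction for per-read sequencing error."""
    pairs = list(combinations(column_bases, 2))
    if not pairs:
        return 0.0
    return sum(a != b for a, b in pairs) / len(pairs)

# Three reads cover a site with bases A, A, T -> 2 of the 3 pairs mismatch.
print(site_diversity(["A", "A", "T"]))
```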

  15. Organization and evolution of primate centromeric DNA from whole-genome shotgun sequence data.

    Directory of Open Access Journals (Sweden)

    Can Alkan

    2007-09-01

    Full Text Available The major DNA constituent of primate centromeres is alpha satellite DNA. As much as 2%-5% of sequence generated as part of primate genome sequencing projects consists of this material, which is fragmented or not assembled as part of published genome sequences due to its highly repetitive nature. Here, we develop computational methods to rapidly recover and categorize alpha-satellite sequences from previously uncharacterized whole-genome shotgun sequence data. We present an algorithm to computationally predict potential higher-order array structure based on paired-end sequence data and then validate its organization and distribution by experimental analyses. Using whole-genome shotgun data from the human, chimpanzee, and macaque genomes, we examine the phylogenetic relationship of these sequences and provide further support for a model for their evolution and mutation over the last 25 million years. Our results confirm fundamental differences in the dispersal and evolution of centromeric satellites in the Old World monkey and ape lineages of evolution.

  16. Shotgun glycomics of pig lung identifies natural endogenous receptors for influenza viruses.

    Science.gov (United States)

    Byrd-Leotis, Lauren; Liu, Renpeng; Bradley, Konrad C; Lasanajak, Yi; Cummings, Sandra F; Song, Xuezheng; Heimburg-Molinaro, Jamie; Galloway, Summer E; Culhane, Marie R; Smith, David F; Steinhauer, David A; Cummings, Richard D

    2014-06-03

    Influenza viruses bind to host cell surface glycans containing terminal sialic acids, but as studies on influenza binding become more sophisticated, it is becoming evident that although sialic acid may be necessary, it is not sufficient for productive binding. To better define endogenous glycans that serve as viral receptors, we have explored glycan recognition in the pig lung, because influenza is broadly disseminated in swine, and swine have been postulated as an intermediary host for the emergence of pandemic strains. For these studies, we used the technology of "shotgun glycomics" to identify natural receptor glycans. The total released N- and O-glycans from pig lung glycoproteins and glycolipid-derived glycans were fluorescently tagged and separated by multidimensional HPLC, and individual glycans were covalently printed to generate pig lung shotgun glycan microarrays. All viruses tested interacted with one or more sialylated N-glycans but not O-glycans or glycolipid-derived glycans, and each virus demonstrated novel and unexpected differences in endogenous N-glycan recognition. The results illustrate the repertoire of specific, endogenous N-glycans of pig lung glycoproteins for virus recognition and offer a new direction for studying endogenous glycan functions in viral pathogenesis.

  17. Automated and Accurate Estimation of Gene Family Abundance from Shotgun Metagenomes.

    Directory of Open Access Journals (Sweden)

    Stephen Nayfach

    2015-11-01

    Full Text Available Shotgun metagenomic DNA sequencing is a widely applicable tool for characterizing the functions that are encoded by microbial communities. Several bioinformatic tools can be used to functionally annotate metagenomes, allowing researchers to draw inferences about the functional potential of the community and to identify putative functional biomarkers. However, little is known about how decisions made during annotation affect the reliability of the results. Here, we use statistical simulations to rigorously assess how to optimize annotation accuracy and speed, given parameters of the input data like read length and library size. We identify best practices in metagenome annotation and use them to guide the development of the Shotgun Metagenome Annotation Pipeline (ShotMAP). ShotMAP is an analytically flexible, end-to-end annotation pipeline that can be implemented either on a local computer or a cloud compute cluster. We use ShotMAP to assess how different annotation databases impact the interpretation of how marine metagenome and metatranscriptome functional capacity changes across seasons. We also apply ShotMAP to data obtained from a clinical microbiome investigation of inflammatory bowel disease. This analysis finds that gut microbiota collected from Crohn's disease patients are functionally distinct from gut microbiota collected from either ulcerative colitis patients or healthy controls, with differential abundance of metabolic pathways related to host-microbiome interactions that may serve as putative biomarkers of disease.

  18. The temporal analysis of yeast exponential phase using shotgun proteomics as a fermentation monitoring technique.

    Science.gov (United States)

    Huang, Eric L; Orsat, Valérie; Shah, Manesh B; Hettich, Robert L; VerBerkmoes, Nathan C; Lefsrud, Mark G

    2012-09-18

    Systems biology and bioprocess technology can be better understood using shotgun proteomics as a monitoring system during fermentation. We demonstrated a shotgun proteomic method for monitoring the temporal yeast proteome in the early, middle and late exponential phases. Our study identified a total of 1389 proteins combining all 2D-LC-MS/MS runs. The temporal Saccharomyces cerevisiae proteome was enriched with proteolysis, radical detoxification, translation, one-carbon metabolism, glycolysis and TCA cycle proteins. Heat shock proteins and proteins associated with the oxidative stress response were found throughout the exponential phase. The most abundant proteins observed were translation elongation factors, ribosomal proteins, chaperones and glycolytic enzymes. The high abundance of the H-protein of the glycine decarboxylase complex (Gcv3p) indicated the availability of glycine in the environment. We observed differentially expressed proteins; the proteins induced at mid-exponential phase were involved in ribosome biogenesis, mitochondrial DNA binding/replication and transcriptional activation. Induction of tryptophan synthase (Trp5p) indicated the abundance of tryptophan during the fermentation. As fermentation progressed toward late exponential phase, a decrease in cell proliferation was implied by the repression of ribosomal proteins, transcription coactivators, methionine aminopeptidase and translation-associated proteins. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Organization and evolution of primate centromeric DNA from whole-genome shotgun sequence data.

    Science.gov (United States)

    Alkan, Can; Ventura, Mario; Archidiacono, Nicoletta; Rocchi, Mariano; Sahinalp, S Cenk; Eichler, Evan E

    2007-09-01

    The major DNA constituent of primate centromeres is alpha satellite DNA. As much as 2%-5% of sequence generated as part of primate genome sequencing projects consists of this material, which is fragmented or not assembled as part of published genome sequences due to its highly repetitive nature. Here, we develop computational methods to rapidly recover and categorize alpha-satellite sequences from previously uncharacterized whole-genome shotgun sequence data. We present an algorithm to computationally predict potential higher-order array structure based on paired-end sequence data and then validate its organization and distribution by experimental analyses. Using whole-genome shotgun data from the human, chimpanzee, and macaque genomes, we examine the phylogenetic relationship of these sequences and provide further support for a model for their evolution and mutation over the last 25 million years. Our results confirm fundamental differences in the dispersal and evolution of centromeric satellites in the Old World monkey and ape lineages of evolution.

  20. Large pore dermal microdialysis and liquid chromatography-tandem mass spectroscopy shotgun proteomic analysis: a feasibility study

    DEFF Research Database (Denmark)

    Petersen, Lars J.; Sorensen, Mette A.; Codrea, Marius C.

    2013-01-01

    Background/Aims: The purpose of the present pilot study was to investigate the feasibility of combining large pore dermal microdialysis with shotgun proteomic analysis in human skin. Methods: Dialysate was recovered from human skin by 2000 kDa microdialysis membranes from one subject at three different

  1. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  2. High-content screening of yeast mutant libraries by shotgun lipidomics

    DEFF Research Database (Denmark)

    Tarasov, Kirill; Stefanko, Adam; Casanovas, Albert

    2014-01-01

    To identify proteins with a functional role in lipid metabolism and homeostasis we designed a high-throughput platform for high-content lipidomic screening of yeast mutant libraries. To this end, we combined culturing and lipid extraction in 96-well format, automated direct infusion nanoelectrospray ionization, high-resolution Orbitrap mass spectrometry, and a dedicated data processing framework to support lipid phenotyping across hundreds of Saccharomyces cerevisiae mutants. Our novel approach revealed that the absence of genes with unknown function YBR141C and YJR015W, and of the transcription factor KAR4, precipitated distinct lipid metabolic phenotypes. These results demonstrate that the high-throughput shotgun lipidomics platform is a valid and complementary proxy for high-content screening of yeast mutant libraries.

  3. Analysis of secretome of breast cancer cell line with an optimized semi-shotgun method

    International Nuclear Information System (INIS)

    Tang Xiaorong; Yao Ling; Chen Keying; Hu Xiaofang; Xu Lisa; Fan Chunhai

    2009-01-01

    Secretome, the totality of secreted proteins, is viewed as a promising pool of candidate cancer biomarkers. Simple and reliable methods for identifying secreted proteins are highly desired. We used an optimized semi-shotgun liquid chromatography followed by tandem mass spectrometry (LC-MS/MS) method to analyze the secretome of breast cancer cell line MDA-MB-231. A total of 464 proteins were identified. About 63% of the proteins were classified as secreted proteins, including many promising breast cancer biomarkers, which were thought to be correlated with tumorigenesis, tumor development and metastasis. These results suggest that the optimized method may be a powerful strategy for cell line secretome profiling, and can be used to find potential cancer biomarkers with great clinical significance. (authors)

  4. Shotgun metagenomics of 250 adult twins reveals genetic and environmental impacts on the gut microbiome

    DEFF Research Database (Denmark)

    Xie, Hailiang; Guo, Ruijin; Zhong, Huanzi

    2016-01-01

    The gut microbiota has been typically viewed as an environmental factor for human health. Twins are well suited for investigating the concordance of their gut microbiomes and decomposing genetic and environmental influences. However, existing twin studies utilizing metagenomic shotgun sequencing...... have included only a few samples. Here, we sequenced fecal samples from 250 adult twins in the TwinsUK registry and constructed a comprehensive gut microbial reference gene catalog. We demonstrate heritability of many microbial taxa and functional modules in the gut microbiome, including those...... associated with diseases. Moreover, we identified 8 million SNPs in the gut microbiome and observe a high similarity in microbiome SNPs between twins that slowly decreases after decades of living apart. The results shed new light on the genetic and environmental influences on the composition and function...

  5. Structural characterization of ether lipids from the archaeon Sulfolobus islandicus by high-resolution shotgun lipidomics

    DEFF Research Database (Denmark)

    Jensen, Sara Munk; Brandl, Martin; Treusch, Alexander H

    2015-01-01

    The molecular structures, biosynthetic pathways and physiological functions of membrane lipids produced by organisms in the domain Archaea are poorly characterized as compared with that of counterparts in Bacteria and Eukaryota. Here we report on the use of high-resolution shotgun lipidomics......-resolution Fourier transform mass spectrometry using an ion trap-orbitrap mass spectrometer. This analysis identified five clusters of molecular ions that matched ether lipids in the database with sub-ppm mass accuracy. To structurally characterize and validate the identities of the potential lipid species, we...... performed structural analysis using multistage activation on the ion trap-orbitrap instrument as well as tandem mass analysis using a quadrupole time-of-flight machine. Our analysis identified four ether lipid species previously reported in Archaea, and one ether lipid species that had not been described...

  6. Fine-scale variation in meiotic recombination in Mimulus inferred from population shotgun sequencing

    Energy Technology Data Exchange (ETDEWEB)

    Hellsten, Uffe [USDOE Joint Genome Inst., Walnut Creek, CA (United States); Wright, Kevin M. [Harvard Univ., Cambridge, MA (United States); Jenkins, Jerry [USDOE Joint Genome Inst., Walnut Creek, CA (United States); HudsonAlpha Inst. of Biotechnology, Huntsville, AL (United States); Shu, Shengqiang [USDOE Joint Genome Inst., Walnut Creek, CA (United States); Yuan, Yao-Wu [Univ. of Connecticut, Storrs, CT (United States); Wessler, Susan R. [Univ. of California, Riverside, CA (United States); Schmutz, Jeremy [USDOE Joint Genome Inst., Walnut Creek, CA (United States); HudsonAlpha Inst. of Biotechnology, Huntsville, AL (United States); Willis, John H. [Duke Univ., Durham, NC (United States); Rokhsar, Daniel S. [USDOE Joint Genome Inst., Walnut Creek, CA (United States); Univ. of California, Berkeley, CA (United States)

    2013-11-13

    Meiotic recombination rates can vary widely across genomes, with hotspots of intense activity interspersed among cold regions. In yeast, hotspots tend to occur in promoter regions of genes, whereas in humans and mice hotspots are largely defined by binding sites of the PRDM9 protein. To investigate the detailed recombination pattern in a flowering plant, we use shotgun resequencing of a wild population of the monkeyflower Mimulus guttatus to precisely locate over 400,000 boundaries of historic crossovers or gene conversion tracts. Their distribution defines some 13,000 hotspots of varying strengths, interspersed with cold regions of undetectably low recombination. Average recombination rates peak near starts of genes and fall off sharply, exhibiting polarity. Within genes, recombination tracts are more likely to terminate in exons than in introns. The general pattern is similar to that observed in yeast, as well as in PRDM9-knockout mice, suggesting that recombination initiation described here in Mimulus may reflect ancient and conserved eukaryotic mechanisms.

  7. Plastic litter from shotgun ammunition on Danish coastlines - Amounts and provenance.

    Science.gov (United States)

    Kanstrup, Niels; Balsby, Thorsten J S

    2018-06-01

    Plastic litter in the marine environment is a major global issue. Discarded plastic shotgun ammunition shells and discharged wads are an unwelcome addition and feature among the top ten litter items found on reference beaches in Denmark. To understand this problem, its scale and origins, collections were made by volunteers along Danish coastal shorelines. In all, 3669 plastic ammunition items were collected at 68 sites along 44.6 km of shoreline. The collected items were scored for characteristic variables such as gauge and length, shot type, legibility of text, erosion, and the presence of metallic components. Scores for these characteristics were related to site, area, and season, and possible influences were discussed. The prevalence of collected plastic shotgun litter ranges from zero to 41 items per 100 m, with an average of 3.7 items per 100 m. Most ammunition litter on Danish coasts originates from hunting on Danish coastal waterbodies, but a small amount may come from further afield. North Sea coasts are the most distinctive, suggesting the possible contribution of long-distance drift as well as the likelihood that such litter can persist in marine habitats for decades. The pathway from initial discard to eventual wash-up and collection depends on the physical properties of plastic components, marine tides and currents, coastal topography and shoreline vegetation. Judging from the disintegration of the cartridges and the wear and decomposition of components, we conclude that a substantial supply of polluting plastic ammunition material has accumulated and will continue to accumulate. These plastic items pose a hazard to marine ecosystems and will wash up on coasts for many years to come. We recommend that responsible managers, hunters and ammunition manufacturers take action now to reduce the problem and thereby protect ecosystems, wildlife and the sustainability of hunting. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. A statistical approach designed for finding mathematically defined repeats in shotgun data and determining the length distribution of clone-inserts

    DEFF Research Database (Denmark)

    Zhong, Lan; Zhang, Kunlin; Huang, Xiangang

    2003-01-01

    that repeats of different copy number have different probabilities of appearance in shotgun data, so based on this principle, we constructed a statistical model and inferred criteria for mathematically defined repeats (MDRs) at different shotgun coverages. According to these criteria, we developed software...... MDRmasker to identify and mask MDRs in shotgun data. With repeats masked prior to assembly, the speed of assembly was increased with lower error probability. In addition, clone-insert size affect the accuracy of repeat assembly and scaffold construction, we also designed length distribution of clone...
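
The principle described above — that a k-mer from an n-copy repeat appears roughly n times as often in shotgun data as a single-copy k-mer — can be sketched with a simple Poisson occurrence model. The function names and the significance cutoff below are illustrative assumptions, not the paper's actual criteria:

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam): one minus the CDF up to k - 1."""
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i)
                     for i in range(k))

def mdr_threshold(coverage, alpha=1e-6, max_k=200):
    """Smallest occurrence count at which a k-mer is unlikely (< alpha)
    to come from a single-copy region at the given shotgun coverage;
    k-mers seen at least this often are flagged as candidate repeats."""
    for k in range(1, max_k):
        if poisson_sf(k, coverage) < alpha:
            return k
    return max_k
```

As expected under this toy model, the flagging threshold grows with coverage, so the criteria must be recomputed for each sequencing depth.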

  9. The lipidomes of vesicular stomatitis virus, semliki forest virus, and the host plasma membrane analyzed by quantitative shotgun mass spectrometry

    DEFF Research Database (Denmark)

    Kalvodova, Lucie; Sampaio, Julio L; Cordo, Sandra

    2009-01-01

    kidney cells can be infected by two different viruses, namely, vesicular stomatitis virus and Semliki Forest virus, from the Rhabdoviridae and Togaviridae families, respectively. We purified the host plasma membrane and the two different viruses after exit from the host cells and analyzed the lipid...

  10. metaBIT, an integrative and automated metagenomic pipeline for analysing microbial profiles from high-throughput sequencing shotgun data

    DEFF Research Database (Denmark)

    Louvel, Guillaume; Der Sarkissian, Clio; Hanghøj, Kristian Ebbesen

    2016-01-01

    -throughput DNA sequencing (HTS). Here, we develop metaBIT, an open-source computational pipeline automatizing routine microbial profiling of shotgun HTS data. Customizable by the user at different stringency levels, it performs robust taxonomy-based assignment and relative abundance calculation of microbial taxa......, as well as cross-sample statistical analyses of microbial diversity distributions. We demonstrate the versatility of metaBIT within a range of published HTS data sets sampled from the environment (soil and seawater) and the human body (skin and gut), but also from archaeological specimens. We present......-friendly profiling of the microbial DNA present in HTS shotgun data sets. The applications of metaBIT are vast, from monitoring of laboratory errors and contaminations, to the reconstruction of past and present microbiota, and the detection of candidate species, including pathogens....
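
The per-sample relative abundance calculation that such a profiling pipeline performs can be illustrated with a minimal sketch. The data model — one taxon label (or None for unassigned) per read — is an assumption for illustration, not metaBIT's actual internals:

```python
from collections import Counter

def relative_abundance(assignments):
    """Fold per-read taxonomic assignments into relative abundances,
    dropping reads that received no assignment (None)."""
    counts = Counter(t for t in assignments if t is not None)
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}
```

For example, `relative_abundance(["E. coli", "E. coli", None, "B. subtilis"])` attributes two thirds of the assigned reads to E. coli and one third to B. subtilis.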

  11. Nonoperative Management of Multiple Penetrating Cardiac and Colon Wounds from a Shotgun: A Case Report and Literature Review

    OpenAIRE

    Jaramillo, Paula M.; Montoya, Jaime A.; Mejia, David A.; Pereira Warr, Salin

    2018-01-01

    Introduction. Cardiac trauma is often considered fatal, and surgery for colon wounds is normally considered mandatory because of the associated sepsis risk; however, conservative management of many traumatic lesions of different injured organs has progressed over the years. Presentation of the Case. A 65-year-old male patient presented with multiple shotgun wounds on the left upper limb, thorax, and abdomen. On evaluation, he was hemodynamically stable with normal sinus rhythm and normal blood pressure, no dyspnea,...

  12. Use of Metagenomic Shotgun Sequencing Technology To Detect Foodborne Pathogens within the Microbiome of the Beef Production Chain

    OpenAIRE

    Yang, Xiang; Noyes, Noelle R.; Doster, Enrique; Martin, Jennifer N.; Linke, Lyndsey M.; Magnuson, Roberta J.; Yang, Hua; Geornaras, Ifigenia; Woerner, Dale R.; Jones, Kenneth L.; Ruiz, Jaime; Boucher, Christina; Morley, Paul S.; Belk, Keith E.

    2016-01-01

    Foodborne illnesses associated with pathogenic bacteria are a global public health and economic challenge. The diversity of microorganisms (pathogenic and nonpathogenic) that exists within the food and meat industries complicates efforts to understand pathogen ecology. Further, little is known about the interaction of pathogens within the microbiome throughout the meat production chain. Here, a metagenomic approach and shotgun sequencing technology were used as tools to detect pathogenic bact...

  13. Identification of antimicrobial resistance genes in multidrug-resistant clinical Bacteroides fragilis isolates by whole genome shotgun sequencing

    DEFF Research Database (Denmark)

    Sydenham, Thomas Vognbjerg; Sóki, József; Hasman, Henrik

    2015-01-01

    Bacteroides fragilis constitutes the most frequent anaerobic bacterium causing bacteremia in humans. The genetic background for antimicrobial resistance in B. fragilis is diverse with some genes requiring insertion sequence (IS) elements inserted upstream for increased expression. To evaluate whole...... genome shotgun sequencing as a method for predicting antimicrobial resistance properties, one meropenem resistant and five multidrug-resistant blood culture isolates were sequenced and antimicrobial resistance genes and IS elements identified using ResFinder 2.1 (http...

  14. Improvement of a sample preparation method assisted by sodium deoxycholate for mass-spectrometry-based shotgun membrane proteomics.

    Science.gov (United States)

    Lin, Yong; Lin, Haiyan; Liu, Zhonghua; Wang, Kunbo; Yan, Yujun

    2014-11-01

    In current shotgun-proteomics-based biological discovery, the identification of membrane proteins is a challenge. This is especially true for integral membrane proteins due to their highly hydrophobic nature and low abundance. Thus, much effort has been directed at sample preparation strategies such as use of detergents, chaotropes, and organic solvents. We previously described a sample preparation method for shotgun membrane proteomics, the sodium deoxycholate assisted method, which cleverly circumvents many of the challenges associated with traditional sample preparation methods. However, the method is associated with significant sample loss due to the slightly weaker extraction/solubilization ability of sodium deoxycholate when it is used at relatively low concentrations such as 1%. Hence, we present an enhanced sodium deoxycholate sample preparation strategy that first uses a high concentration of sodium deoxycholate (5%) to lyse membranes and extract/solubilize hydrophobic membrane proteins, and then dilutes the detergent to 1% for a more efficient digestion. We then applied the improved method to shotgun analysis of proteins from rat liver membrane enriched fraction. Compared with other representative sample preparation strategies including our previous sodium deoxycholate assisted method, the enhanced sodium deoxycholate method exhibited superior sensitivity, coverage, and reliability for the identification of membrane proteins particularly those with high hydrophobicity and/or multiple transmembrane domains. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
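
The dilution step described above — extract at 5% sodium deoxycholate, then dilute to 1% before digestion — is simple C1·V1 = C2·V2 arithmetic; a small helper makes the added buffer volume explicit (function and parameter names are illustrative, not from the paper):

```python
def dilution_volume_to_add(c_start, v_start, c_target):
    """Volume of buffer to add so that a detergent at c_start (%) in
    v_start (µL) ends up at c_target (%): C1*V1 = C2*(V1 + V_add)."""
    if not 0 < c_target < c_start:
        raise ValueError("target concentration must be below the starting one")
    return v_start * (c_start / c_target - 1.0)
```

For instance, diluting 100 µL of 5% sodium deoxycholate to 1% requires adding 400 µL of buffer.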

  15. Large pore dermal microdialysis and liquid chromatography-tandem mass spectroscopy shotgun proteomic analysis: a feasibility study.

    Science.gov (United States)

    Petersen, Lars J; Sørensen, Mette A; Codrea, Marius C; Zacho, Helle D; Bendixen, Emøke

    2013-11-01

    The purpose of the present pilot study was to investigate the feasibility of combining large pore dermal microdialysis with shotgun proteomic analysis in human skin. Dialysate was recovered from human skin by 2000 kDa microdialysis membranes from one subject at three different phases of the study; trauma due to implantation of the dialysis device, a post implantation steady-state period, and after induction of vasodilatation and plasma extravasation. For shotgun proteomics, the proteins were extracted and digested with trypsin. Peptides were separated by capillary and nanoflow HPLC systems, followed by tandem mass spectrometry (MS/MS) on a Quadrupole-TOF hybrid instrument. The MS/MS spectra were merged and mapped to a human target protein database to achieve peptide identification and protein inference. Results showed variation in protein amounts and profiles for each of the different sampling phases. The total protein concentration was 1.7, 0.6, and 1.3 mg/mL during the three phases, respectively. A total of 158 different proteins were identified. Immunoglobulins and the major classes of plasma proteins, including proteases, coagulation factors, apolipoproteins, albumins, and complement factors, make up the major load of proteins in all three test conditions. Shotgun proteomics allowed the identification of more than 150 proteins in microdialysis samples from human skin. This highlights the opportunities of LC-MS/MS to study the complex molecular interactions in the skin. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Comparative evaluation of seven commercial products for human serum enrichment/depletion by shotgun proteomics.

    Science.gov (United States)

    Pisanu, Salvatore; Biosa, Grazia; Carcangiu, Laura; Uzzau, Sergio; Pagnozzi, Daniela

    2018-08-01

    Seven commercial products for human serum depletion/enrichment were tested and compared by shotgun proteomics. The methods were based on four different capturing agents: antibodies (Qproteome Albumin/IgG Depletion kit, ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit, Top 2 Abundant Protein Depletion Spin Columns, and Top 12 Abundant Protein Depletion Spin Columns), specific ligands (Albumin/IgG Removal), a mixture of antibodies and ligands (Albumin and IgG Depletion SpinTrap), and combinatorial peptide ligand libraries (ProteoMiner beads), respectively. All procedures, to a greater or lesser extent, increased the number of identified proteins. ProteoMiner beads provided the highest number of proteins; Albumin and IgG Depletion SpinTrap and ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit proved the most efficient at albumin removal; Top 2 and Top 12 Abundant Protein Depletion Spin Columns decreased the overall immunoglobulin levels more than the other procedures, whereas gamma immunoglobulins specifically were mostly removed by Albumin and IgG Depletion SpinTrap, ProteoPrep Immunoaffinity Albumin and IgG Depletion Kit, and Top 2 Abundant Protein Depletion Spin Columns. Albumin/IgG Removal, a resin bound to a mixture of protein A and Cibacron Blue, behaved less efficiently than the other products. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Iatrogenic artefacts attributable to traditional cupping therapy in a shotgun fatality.

    Science.gov (United States)

    Cavlak, Mehmet; Özkök, Alper; Sarı, Serhat; Dursun, Ahmet; Akar, Taner; Karapirli, Mustafa; Demirel, Birol

    2015-10-01

    Cupping is a traditional treatment method that has been used for thousands of years to diminish pain, restore appetite, improve digestion, remove a tendency to faint or remove 'bad blood' from the body. The suction of the cup is created by fire or mechanical devices. The procedure may result in circular erythema, petechiae, purpura, ecchymosis or burns, and these may be mistaken for trauma-related ecchymosis or livor mortis. A forty-year-old male died of shotgun injuries on the same day as the wounding. Circular ecchymoses were observed on the forehead, within the scalp of the occipital region, on the back of the neck, and on the back. They were defined as ecchymoses in the first examination made by a general practitioner. In the external examination during the legal autopsy, superficial incisions were observed over the circular ecchymoses. The shape, localization, and color of the circular lesions, together with the characteristics of the incisions on them, were concluded to have been caused by dry cupping and wet cupping therapy procedures. These lesions and their formation mechanisms should be well known to forensic medical examiners and the other medical personnel involved in forensic medical examination. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  18. Uncertainty estimation of predictions of peptides' chromatographic retention times in shotgun proteomics.

    Science.gov (United States)

    Maboudi Afkham, Heydar; Qiu, Xuanbin; The, Matthew; Käll, Lukas

    2017-02-15

    Liquid chromatography is frequently used as a means to reduce the complexity of peptide mixtures in shotgun proteomics. For such systems, the time when a peptide is released from a chromatography column and registered in the mass spectrometer is referred to as the peptide's retention time. Using heuristics or machine learning techniques, previous studies have demonstrated that it is possible to predict the retention time of a peptide from its amino acid sequence. In this paper, we apply Gaussian Process Regression to the feature representation of a previously described predictor, Elude. Using this framework, we demonstrate that it is possible to estimate the uncertainty of the prediction made by the model, and we show how this uncertainty relates to the actual error of the prediction. In our experiments, we observe a strong correlation between the estimated uncertainty provided by Gaussian Process Regression and the actual prediction error. This relation provides us with new means for assessment of the predictions. We demonstrate how a subset of the peptides can be selected with lower prediction error compared to the whole set. We also demonstrate how such predicted standard deviations can be used for designing adaptive windowing strategies. lukas.kall@scilifelab.se. Our software and the data used in our experiments are publicly available and can be downloaded from https://github.com/statisticalbiotechnology/GPTime . © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
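
The underlying idea — that a Gaussian Process yields a predictive standard deviation alongside each retention-time prediction — can be sketched with a tiny pure-Python GP. This is a generic RBF-kernel GP on scalar inputs, not the Elude feature representation or the authors' GPTime code:

```python
import math

def rbf(a, b, ls=1.0):
    """Squared-exponential (RBF) kernel on scalars."""
    return math.exp(-(a - b) ** 2 / (2 * ls ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, xq, noise=1e-3, ls=1.0):
    """GP posterior mean and standard deviation at a query point xq."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], ls) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)                       # (K + noise*I)^-1 y
    ks = [rbf(x, xq, ls) for x in xs]
    mean = sum(a * k for a, k in zip(alpha, ks))
    v = solve(K, ks)                           # (K + noise*I)^-1 k*
    # predictive variance: k(xq, xq) - k*^T (K + noise*I)^-1 k*
    var = max(1.0 - sum(k * w for k, w in zip(ks, v)), 0.0)
    return mean, math.sqrt(var)
```

Queries far from the training points come back with a standard deviation near the prior's, while interpolated queries come back with a small one — the property the paper exploits to rank predictions by trustworthiness.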

  19. IdentiPy: an extensible search engine for protein identification in shotgun proteomics.

    Science.gov (United States)

    Levitsky, Lev I; Ivanov, Mark V; Lobas, Anna A; Bubis, Julia A; Tarasova, Irina A; Solovyeva, Elizaveta M; Pridatchenko, Marina L; Gorshkov, Mikhail V

    2018-04-23

    We present an open-source, extensible search engine for shotgun proteomics. Implemented in the Python programming language, IdentiPy shows competitive processing speed and sensitivity compared with the state-of-the-art search engines. It is equipped with a user-friendly web interface, IdentiPy Server, enabling the use of a single server installation accessed from multiple workstations. Using a simplified version of the X!Tandem scoring algorithm and its novel "auto-tune" feature, IdentiPy outperforms the popular alternatives on high-resolution data sets. Auto-tune adjusts the search parameters for the particular data set, resulting in improved search efficiency and simplifying the user experience. IdentiPy with the auto-tune feature shows higher sensitivity compared with the evaluated search engines. IdentiPy Server has built-in post-processing and protein inference procedures and provides graphic visualization of the statistical properties of the data set and the search results. It is open-source and can be freely extended to use third-party scoring functions or processing algorithms, and allows customization of the search workflow for specialized applications.
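
The X!Tandem scoring that IdentiPy simplifies is commonly described as a hyperscore: the summed intensity of matched fragment peaks, scaled by the factorials of the matched b- and y-ion counts. A minimal sketch of that idea (not IdentiPy's actual scoring code, and simplified to a log form):

```python
import math

def hyperscore(matched_intensities_b, matched_intensities_y):
    """X!Tandem-style hyperscore (log10 form): summed matched intensities
    boosted by the factorials of the matched b- and y-ion counts."""
    dot = sum(matched_intensities_b) + sum(matched_intensities_y)
    if dot <= 0:
        return 0.0
    nb = len(matched_intensities_b)
    ny = len(matched_intensities_y)
    return (math.log10(dot)
            + math.log10(math.factorial(nb))
            + math.log10(math.factorial(ny)))
```

The factorial terms are what make long consecutive ion series count for much more than the same total intensity spread over few matches.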

  20. MaRaCluster: A Fragment Rarity Metric for Clustering Fragment Spectra in Shotgun Proteomics.

    Science.gov (United States)

    The, Matthew; Käll, Lukas

    2016-03-04

    Shotgun proteomics experiments generate large amounts of fragment spectra as primary data, normally with high redundancy between and within experiments. Here, we have devised a clustering technique to identify fragment spectra stemming from the same species of peptide. This is a powerful alternative method to traditional search engines for analyzing spectra, specifically useful for larger scale mass spectrometry studies. As an aid in this process, we propose a distance calculation relying on the rarity of experimental fragment peaks, following the intuition that peaks shared by only a few spectra offer more evidence than peaks shared by a large number of spectra. We used this distance calculation and a complete-linkage scheme to cluster data from a recent large-scale mass spectrometry-based study. The clusterings produced by our method have up to 40% more identified peptides for their consensus spectra compared to those produced by the previous state-of-the-art method. We see that our method would advance the construction of spectral libraries as well as serve as a tool for mining large sets of fragment spectra. The source code and Ubuntu binary packages are available at https://github.com/statisticalbiotechnology/maracluster (under an Apache 2.0 license).
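
The rarity intuition above — shared rare peaks are stronger evidence than shared common ones — combined with complete linkage can be sketched as follows. Binned integer m/z values stand in for fragment peaks, and the weighting scheme and cutoff are illustrative choices, not MaRaCluster's actual metric:

```python
import math
from collections import Counter

def rarity_weights(spectra):
    """Weight each binned peak by minus the log of its document frequency:
    peaks present in every spectrum get weight 0, rare peaks weigh most."""
    n = len(spectra)
    df = Counter(p for s in spectra for p in set(s))
    return {p: -math.log(df[p] / n) for p in df}

def distance(a, b, w):
    """1 minus the rarity-weighted overlap of two peak sets."""
    shared = set(a) & set(b)
    denom = min(sum(w[p] for p in set(a)), sum(w[p] for p in set(b)))
    if denom == 0.0:
        return 1.0
    return 1.0 - sum(w[p] for p in shared) / denom

def complete_linkage(spectra, cutoff=0.5):
    """Agglomerative clustering: merge the closest pair of clusters whose
    *maximum* pairwise distance stays under the cutoff, until none remains."""
    w = rarity_weights(spectra)
    clusters = [[i] for i in range(len(spectra))]
    while True:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = max(distance(spectra[a], spectra[b], w)
                        for a in clusters[i] for b in clusters[j])
                if d <= cutoff and (best is None or d < best[0]):
                    best = (d, i, j)
        if best is None:
            return clusters
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
```

With three toy spectra sharing the ubiquitous peaks 100 and 200, only the pair that also shares the rare peak 937 ends up clustered together, exactly because the common peaks contribute no evidence.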

  1. Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.

    Science.gov (United States)

    Trudgian, David C; Mirzaei, Hamid

    2012-12-07

    We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.

  2. Structural Analysis of Unsaturated Glycosphingolipids Using Shotgun Ozone-Induced Dissociation Mass Spectrometry

    Science.gov (United States)

    Barrientos, Rodell C.; Vu, Ngoc; Zhang, Qibin

    2017-08-01

    Glycosphingolipids are essential biomolecules widely distributed across biological kingdoms, yet they remain relatively underexplored owing to both compositional and structural complexity. While the glycan head group has been the subject of most studies, there is a paucity of reports on the lipid moiety, particularly the location of unsaturation. In this paper, ozone-induced dissociation mass spectrometry (OzID-MS) implemented in a traveling wave-based quadrupole time-of-flight (Q-ToF) mass spectrometer was applied to study unsaturated glycosphingolipids using a shotgun approach. The resulting high-resolution mass spectra facilitated the unambiguous identification of diagnostic OzID product ions. Using [M+Na]+ adducts of authentic standards, we observed that the long chain base and fatty acyl unsaturation had distinct reactivity with ozone. The reactivity of unsaturation in the fatty acyl chain was about 8-fold higher than that in the long chain base, which enables their straightforward differentiation. Influence of the head group, fatty acyl hydroxylation, and length of the fatty acyl chain on the oxidative cleavage of double bonds was also observed. Application of this technique to bovine brain galactocerebrosides revealed co-isolated isobaric and regioisomeric species, which otherwise would be incompletely identified using contemporary collision-induced dissociation (CID) alone. These results highlight the potential of OzID-MS in glycosphingolipid research, which not only provides structural information complementary to the existing CID technique but also facilitates de novo structural determination of these complex biomolecules.

  3. Shotgun proteomic analytical approach for studying proteins adsorbed onto liposome surface

    KAUST Repository

    Capriotti, Anna Laura

    2011-07-02

    The knowledge about the interaction between plasma proteins and nanocarriers employed for in vivo delivery is fundamental to understand their biodistribution. Protein adsorption onto nanoparticle surface (protein corona) is strongly affected by vector surface characteristics. In general, the primary interaction is thought to be electrostatic, thus surface charge of carrier is supposed to play a central role in protein adsorption. Because protein corona composition can be critical in modifying the interactive surface that is recognized by cells, characterizing its formation onto lipid particles may serve as a fundamental predictive model for the in vivo efficiency of a lipidic vector. In the present work, protein coronas adsorbed onto three differently charged cationic liposome formulations were compared by a shotgun proteomic approach based on nano-liquid chromatography-high-resolution mass spectrometry. About 130 proteins were identified in each corona, with only small differences between the different cationic liposome formulations. However, this study could be useful for the future controlled design of colloidal drug carriers and possibly in the controlled creation of biocompatible surfaces of other devices that come into contact with proteins into body fluids. © 2011 Springer-Verlag.

  4. Paleogenomics in a temperate environment: shotgun sequencing from an extinct Mediterranean caprine.

    Directory of Open Access Journals (Sweden)

    Oscar Ramírez

    BACKGROUND: Numerous endemic mammals, including dwarf elephants, goats, hippos and deer, evolved in isolation in the Mediterranean islands during the Pliocene and Pleistocene. Most of them subsequently became extinct during the Holocene. Recently developed high-throughput sequencing technologies could provide a unique tool for retrieving genomic data from these extinct species, making it possible to study their evolutionary history and the genetic bases underlying their particular, sometimes unique, adaptations. METHODOLOGY/PRINCIPAL FINDINGS: A DNA extract from an approximately 6,000-year-old bone sample of an extinct caprine (Myotragus balearicus) from the Balearic Islands in the Western Mediterranean was subjected to shotgun sequencing with the GS FLX 454 platform. Only 0.27% of the resulting sequences, identified from alignments with the cow genome and comprising 15,832 nucleotides with an average length of 60 nucleotides, proved to be endogenous. CONCLUSIONS: A phylogenetic tree generated with Myotragus sequences and those from other artiodactyls displays a topology identical to that generated from mitochondrial DNA data. Despite the unfavourable thermal environment, which explains the low yield of endogenous sequences, our study demonstrates that it is possible to obtain genomic data from extinct species from temperate regions.

  5. RNA shotgun metagenomic sequencing of northern California (USA) mosquitoes uncovers viruses, bacteria, and fungi

    Directory of Open Access Journals (Sweden)

    James Angus Chandler

    2015-03-01

    Mosquitoes, most often recognized for the microbial agents of disease they may carry, harbor diverse microbial communities that include viruses, bacteria, and fungi, collectively called the microbiota. The composition of the microbiota can directly and indirectly affect disease transmission through microbial interactions that could be revealed by its characterization in natural populations of mosquitoes. Furthermore, the use of shotgun metagenomic sequencing (SMS) approaches could allow the discovery of unknown members of the microbiota. In this study, we use RNA SMS to characterize the microbiota of seven individual mosquitoes (species include Culex pipiens, Culiseta incidens, and Ochlerotatus sierrensis) collected from a variety of habitats in California, USA. Sequencing was performed on the Illumina HiSeq platform and the resulting sequences were quality-checked and assembled into contigs using the A5 pipeline. Sequences related to single-stranded RNA viruses of the Bunyaviridae and Rhabdoviridae were uncovered, along with an unclassified genus of double-stranded RNA viruses. Phylogenetic analysis finds that in all three cases, the closest relatives of the identified viral sequences are other mosquito-associated viruses, suggesting widespread host-group specificity among disparate viral taxa. Interestingly, we identified a Narnavirus of fungi, also reported elsewhere in mosquitoes, that potentially demonstrates a nested host-parasite association between virus, fungi, and mosquito. Sequences related to 8 bacterial families and 13 fungal families were found across the seven samples. Bacillus and Escherichia/Shigella were identified in all samples and Wolbachia was identified in all Cx. pipiens samples, while no single fungal genus was found in more than two samples.
This study exemplifies the utility of RNA SMS in the characterization of the natural microbiota of mosquitoes and, in particular, the value of identifying all microbes associated with

  6. An inventory of the Aspergillus niger secretome by combining in silico predictions with shotgun proteomics data

    Directory of Open Access Journals (Sweden)

    Martens-Uzunova Elena S

    2010-10-01

    Abstract Background The ecological niche occupied by a fungal species, its pathogenicity and its usefulness as a microbial cell factory to a large degree depend on its secretome. Protein secretion usually requires the presence of an N-terminal signal peptide (SP), and by scanning for this feature using available highly accurate SP-prediction tools, the fraction of potentially secreted proteins can be directly predicted. However, prediction of an SP does not guarantee that the protein is actually secreted, and current in silico prediction methods suffer from gene-model errors introduced during genome annotation. Results A majority-rule based classifier that also evaluates signal peptide predictions from the best homologs of three neighbouring Aspergillus species was developed to create an improved list of potential signal peptide containing proteins encoded by the Aspergillus niger genome. As a complement to these in silico predictions, the secretome associated with growth and upon carbon source depletion was determined using a shotgun proteomics approach. Overall, some 200 proteins with a predicted signal peptide were identified to be secreted proteins. Concordant changes in the secretome state were observed as a response to changes in growth/culture conditions. Additionally, two proteins secreted via a non-classical route operating in A. niger were identified. Conclusions We were able to improve the in silico inventory of A. niger secretory proteins by combining different gene-model predictions from neighbouring Aspergilli and thereby avoiding prediction conflicts associated with inaccurate gene-models. The expected accuracy of signal peptide prediction for proteins that lack homologous sequences in the proteomes of related species is 85%. An experimental validation of the predicted proteome confirmed in silico predictions.
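
The majority-rule idea — combining the signal-peptide call on the A. niger gene model with the calls on its best homologs from neighbouring Aspergilli — can be sketched in a few lines. The tie-breaking rule and the data layout are assumptions for illustration, not necessarily the paper's:

```python
def majority_sp_call(own_call, homolog_calls):
    """Majority vote over the organism's own signal-peptide prediction and
    the predictions for its best homologs in related species; ties fall
    back to the organism's own call (an assumed rule)."""
    votes = [own_call] + list(homolog_calls)
    yes = sum(1 for v in votes if v)
    no = len(votes) - yes
    if yes == no:
        return own_call
    return yes > no
```

The point of the vote is robustness to gene-model errors: a spurious SP prediction caused by a wrongly annotated start codon in one genome is outvoted by the homologs.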

  7. An inventory of the Aspergillus niger secretome by combining in silico predictions with shotgun proteomics data.

    Science.gov (United States)

    Braaksma, Machtelt; Martens-Uzunova, Elena S; Punt, Peter J; Schaap, Peter J

    2010-10-19

    The ecological niche occupied by a fungal species, its pathogenicity and its usefulness as a microbial cell factory to a large degree depend on its secretome. Protein secretion usually requires the presence of an N-terminal signal peptide (SP), and by scanning for this feature using available highly accurate SP-prediction tools, the fraction of potentially secreted proteins can be directly predicted. However, prediction of an SP does not guarantee that the protein is actually secreted, and current in silico prediction methods suffer from gene-model errors introduced during genome annotation. A majority-rule based classifier that also evaluates signal peptide predictions from the best homologs of three neighbouring Aspergillus species was developed to create an improved list of potential signal peptide containing proteins encoded by the Aspergillus niger genome. As a complement to these in silico predictions, the secretome associated with growth and upon carbon source depletion was determined using a shotgun proteomics approach. Overall, some 200 proteins with a predicted signal peptide were identified to be secreted proteins. Concordant changes in the secretome state were observed as a response to changes in growth/culture conditions. Additionally, two proteins secreted via a non-classical route operating in A. niger were identified. We were able to improve the in silico inventory of A. niger secretory proteins by combining different gene-model predictions from neighbouring Aspergilli and thereby avoiding prediction conflicts associated with inaccurate gene-models. The expected accuracy of signal peptide prediction for proteins that lack homologous sequences in the proteomes of related species is 85%. An experimental validation of the predicted proteome confirmed in silico predictions.

  8. Sex differences in shotgun proteome analyses for chronic oral intake of cadmium in mice.

    Directory of Open Access Journals (Sweden)

    Yoshiharu Yamanobe

    Environmental diseases related to cadmium exposure primarily develop owing to industrial wastewater pollution and/or contaminated food. In regions of Japan with high cadmium exposure, cadmium accumulation occurs primarily in the kidneys of exposed individuals. In contrast, in the itai-itai disease outbreak that occurred in the Jinzu River basin in Toyama Prefecture in Japan, cadmium primarily accumulated in the liver. High concentrations of cadmium also caused renal tubular disorder and osteomalacia (multiple bone fractures), probably resulting from the renal tubular dysfunction and additional pathology. In this study, we aimed to establish a mouse model of chronic cadmium intake. We administered cadmium-containing drinking water (32 mg/l) to female and male mice ad libitum for 11 weeks. Metal analysis using inductively coupled plasma mass spectrometry revealed that cadmium accumulated in the kidneys (927 × 10 ± 185 ng/g in females and 661 × 10 ± 101 ng/g in males), liver (397 × 10 ± 199 ng/g in females and 238 × 10 ± 652 ng/g in males), and thyroid gland (293 ± 93.7 ng/g in females and 129 ± 72.7 ng/g in males). Female mice showed higher cadmium accumulation in the kidney, liver, and thyroid gland than males did (p = 0.00345, p = 0.00213, and p = 0.0331, respectively). Shotgun proteome analyses after chronic oral administration of cadmium revealed that protein levels of glutathione S-transferase Mu2, Mu4, and Mu7 decreased in the liver, and those of A1 and A2 decreased in the kidneys, in both female and male mice.

  9. Microbiological profile of chicken carcasses: A comparative analysis using shotgun metagenomic sequencing

    Directory of Open Access Journals (Sweden)

    Alessandra De Cesare

    2018-04-01

    In the last few years, metagenomic and 16S rRNA sequencing have completely changed microbiological investigations of food products. In this preliminary study, the microbiological profiles of chicken carcasses collected from animals fed different diets were examined using shotgun metagenomic sequencing. A total of 15 carcasses were collected at the slaughterhouse, at the end of the refrigeration tunnel, from chickens reared for 35 days and fed a control diet (n=5), a diet supplemented with 1500 FTU/kg of commercial phytase (n=5), or a diet supplemented with 1500 FTU/kg of commercial phytase and 3 g/kg of inositol (n=5). Ten grams of neck and breast skin were obtained from each carcass and submitted to total DNA extraction using the DNeasy Blood & Tissue Kit (Qiagen). Sequencing libraries were prepared using the Nextera XT DNA Library Preparation Kit (Illumina) and sequenced on a HiScanSQ (Illumina) at 100 bp in paired ends. Between 5 and 9 million sequences were obtained for each sample. Sequence analysis showed that Proteobacteria and Firmicutes represented more than 98% of the whole bacterial population associated with carcass skin in all groups, but their abundances differed between groups. Moraxellaceae and other degradative bacteria showed a significantly higher abundance in the control than in the treated groups. Furthermore, Clostridium perfringens showed a significantly higher relative abundance in the group fed phytase, and Salmonella enterica in the group fed phytase plus inositol. The results of this preliminary study show that metagenome sequencing is suitable for investigating and monitoring carcass microbiota in order to detect specific pathogenic and/or degradative populations.

  10. A linear programming model for protein inference problem in shotgun proteomics.

    Science.gov (United States)

    Huang, Ting; He, Zengyou

    2012-11-15

    Assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is an important issue in shotgun proteomics. The objective of protein inference is to find a subset of proteins that are truly present in the sample. Although many methods have been proposed for protein inference, several issues such as peptide degeneracy still remain unsolved. In this article, we present a linear programming model for protein inference. In this model, we use a transformation of the joint probability that each peptide/protein pair is present in the sample as the variables. Both the peptide probability and the protein probability can then be expressed as linear combinations of these variables. Based on this simple fact, the protein inference problem is formulated as an optimization problem: minimize the number of proteins with non-zero probabilities under the constraint that the difference between the calculated peptide probability and the peptide probability generated from peptide identification algorithms should be less than some threshold. This model addresses the peptide degeneracy issue by forcing some joint probability variables involving degenerate peptides to be zero in a rigorous manner. The corresponding inference algorithm is named ProteinLP. We test the performance of ProteinLP on six datasets. Experimental results show that our method is competitive with the state-of-the-art protein inference algorithms. The source code of our algorithm is available at: https://sourceforge.net/projects/prolp/. zyhe@dlut.edu.cn. Supplementary data are available at Bioinformatics online.
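The optimization described above can be illustrated on a toy instance. The sketch below is not the authors' ProteinLP implementation: it is a minimal LP posed with SciPy's `linprog`, using made-up peptide probabilities and a linear surrogate (sum of per-protein upper bounds) for the protein count, to show how the probability mass of a degenerate peptide gets forced onto a single protein:

```python
from scipy.optimize import linprog

# Toy instance: proteins A and B; peptide 1 maps only to A,
# peptide 2 is degenerate (shared by A and B).
# Variables: [xA1, xA2, xB2, tA, tB], where x* are joint
# peptide/protein probabilities and t* bound each protein's probability.
y1, y2, eps = 0.9, 0.8, 0.05   # observed peptide probabilities, tolerance

c = [0, 0, 0, 1, 1]            # minimize tA + tB (surrogate for protein count)
A_ub = [
    [ 1, 0, 0, -1,  0],        # xA1 <= tA
    [ 0, 1, 0, -1,  0],        # xA2 <= tA
    [ 0, 0, 1,  0, -1],        # xB2 <= tB
    [ 1, 0, 0,  0,  0],        # xA1 <= y1 + eps
    [-1, 0, 0,  0,  0],        # xA1 >= y1 - eps
    [ 0, 1, 1,  0,  0],        # xA2 + xB2 <= y2 + eps
    [ 0,-1,-1,  0,  0],        # xA2 + xB2 >= y2 - eps
]
b_ub = [0, 0, 0, y1 + eps, -(y1 - eps), y2 + eps, -(y2 - eps)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 5)
xA1, xA2, xB2, tA, tB = res.x
# The degenerate peptide's mass is assigned to protein A, driving
# protein B's probability to zero -- the parsimony behaviour described above.
print(f"protein A prob ~ {tA:.2f}, protein B prob ~ {tB:.2f}")
```

At the optimum, protein B is zeroed out even though the degenerate peptide is compatible with it, which is the LP's way of resolving peptide degeneracy.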

  11. A framework for intelligent data acquisition and real-time database searching for shotgun proteomics.

    Science.gov (United States)

    Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-03-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent, termed MaxQuant Real-Time, is implemented in the MaxQuant computational proteomics environment. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information, and controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields performance similar to the data-dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.
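As an illustration of the kind of decision such an agent automates, here is a minimal, hypothetical sketch of one "topN" precursor-selection step with dynamic exclusion. The function name, tolerance and peak list are invented for illustration and do not reflect MaxQuant Real-Time internals:

```python
def select_topn(peaks, n=10, excluded=(), mz_tol=0.01):
    """Pick the n most intense precursor m/z values from a survey scan,
    skipping peaks within mz_tol of excluded or already-chosen ones
    (a crude form of dynamic exclusion)."""
    chosen = []
    for mz, intensity in sorted(peaks, key=lambda p: p[1], reverse=True):
        if any(abs(mz - ex) < mz_tol for ex in list(excluded) + chosen):
            continue
        chosen.append(mz)
        if len(chosen) == n:
            break
    return chosen

# Hypothetical survey scan: (m/z, intensity) pairs. The two peaks near
# 445.12 are treated as one precursor because they fall within mz_tol.
survey = [(445.12, 9e6), (512.30, 7e6), (445.121, 8e6), (601.44, 2e6)]
print(select_topn(survey, n=2, excluded=[512.30]))  # -> [445.12, 601.44]
```

A real-time agent would additionally bias this ranking with prior knowledge, e.g. boosting precursors whose predicted identity matches a GO term of interest.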

  12. Shotgun proteomics reveals physiological response to ocean acidification in Crassostrea gigas.

    Science.gov (United States)

    Timmins-Schiffman, Emma; Coffey, William D; Hua, Wilber; Nunn, Brook L; Dickinson, Gary H; Roberts, Steven B

    2014-11-03

    Ocean acidification as a result of increased anthropogenic CO2 emissions is occurring in marine and estuarine environments worldwide. The coastal ocean experiences additional daily and seasonal fluctuations in pH that can be lower than projected end-of-century open ocean pH reductions. In order to assess the impact of ocean acidification on marine invertebrates, Pacific oysters (Crassostrea gigas) were exposed to one of four different pCO2 levels for four weeks: 400 μatm (pH 8.0), 800 μatm (pH 7.7), 1000 μatm (pH 7.6), or 2800 μatm (pH 7.3). At the end of the four-week exposure period, oysters in all four pCO2 environments deposited new shell, but growth rate was not different among the treatments. However, micromechanical properties of the new shell were compromised by elevated pCO2. Elevated pCO2 affected neither whole body fatty acid composition, nor glycogen content, nor mortality rate associated with acute heat shock. Shotgun proteomics revealed that several physiological pathways were significantly affected by ocean acidification, including antioxidant response, carbohydrate metabolism, and transcription and translation. Additionally, the proteomic response to a second stress differed with pCO2, with numerous processes significantly affected by mechanical stimulation at high versus low pCO2 (all proteomics data are available in the ProteomeXchange under the identifier PXD000835). Oyster physiology is significantly altered by exposure to elevated pCO2, indicating changes in energy resource use. This is especially apparent in the assessment of the effects of pCO2 on the proteomic response to a second stress. The altered stress response illustrates that ocean acidification may impact how oysters respond to other changes in their environment. These data contribute to an integrative view of the effects of ocean acidification on oysters as well as physiological trade-offs during environmental stress.

  13. Survey of bacterial diversity in chronic wounds using Pyrosequencing, DGGE, and full ribosome shotgun sequencing

    Directory of Open Access Journals (Sweden)

    Wolcott Benjamin M

    2008-03-01

    Full Text Available Abstract Background Chronic wound pathogenic biofilms are host-pathogen environments that colonize and exist as a cohabitation of many bacterial species. These bacterial populations cooperate to promote their own survival and the chronic nature of the infection. Few studies have performed extensive surveys of the bacterial populations that occur within different types of chronic wound biofilms. Three separate 16S-based molecular amplifications followed by pyrosequencing, shotgun Sanger sequencing, and denaturing gradient gel electrophoresis were utilized to survey the major populations of bacteria that occur in the pathogenic biofilms of three chronic wound types: diabetic foot ulcers (D), venous leg ulcers (V), and pressure ulcers (P). Results There are specific major populations of bacteria that were evident in the biofilms of all chronic wound types, including Staphylococcus, Pseudomonas, Peptoniphilus, Enterobacter, Stenotrophomonas, Finegoldia, and Serratia spp. Each of the wound types reveals marked differences in bacterial populations, such as pressure ulcers, in which 62% of the populations were identified as obligate anaerobes. There were also populations of bacteria that were identified but not recognized as wound pathogens, such as Abiotrophia para-adiacens and Rhodopseudomonas spp. Results of molecular analyses were also compared to those obtained using traditional culture-based diagnostics. Only in one wound type did culture methods correctly identify the primary bacterial population, indicating the need for improved diagnostic methods. Conclusion If clinicians can gain a better understanding of the wound's microbiota, it will give them a greater understanding of the wound's ecology and will allow them to better manage healing of the wound, improving the prognosis of patients. This research highlights the necessity to begin evaluating, studying, and treating chronic wound pathogenic biofilms as multi-species entities in

  14. Development of a Schistosoma mansoni shotgun O-glycan microarray and application to the discovery of new antigenic schistosome glycan motifs.

    Science.gov (United States)

    van Diepen, Angela; van der Plas, Arend-Jan; Kozak, Radoslaw P; Royle, Louise; Dunne, David W; Hokke, Cornelis H

    2015-06-01

    Upon infection with Schistosoma, antibody responses are mounted that are largely directed against glycans. Over the last few years significant progress has been made in characterising the antigenic properties of N-glycans of Schistosoma mansoni. Despite also being abundantly expressed by schistosomes, much less is understood about O-glycans, and antibody responses to these have not yet been systematically analysed. Antibody binding to schistosome glycans can be analysed efficiently and quantitatively using glycan microarrays, but O-glycan array construction and exploration is lagging behind because no universal O-glycanase is available, and release of O-glycans has been dependent on chemical methods. Recently, a modified hydrazinolysis method has been developed that allows the release of O-glycans with free reducing termini and limited degradation, and we applied this method to obtain O-glycans from different S. mansoni life stages. Two-dimensional HPLC separation of 2-aminobenzoic acid-labelled O-glycans generated 362 O-glycan-containing fractions that were printed on an epoxide-modified glass slide, thereby generating the first shotgun O-glycan microarray containing naturally occurring schistosome O-glycans. Monoclonal antibodies and mass spectrometry showed that the O-glycan microarray contains well-known antigenic glycan motifs as well as numerous other, potentially novel, antibody targets. Incubations of the microarrays with sera from Schistosoma-infected humans showed substantial antibody responses to O-glycans in addition to those observed to the previously investigated N- and glycosphingolipid glycans. This underlines the importance of the inclusion of these often schistosome-specific O-glycans in glycan antigen studies and indicates that O-glycans contain novel antigenic motifs that have potential for use in diagnostic methods and studies aiming at the discovery of vaccine targets. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Quantitative phylogenetic assessment of microbial communities in diverse environments

    Energy Technology Data Exchange (ETDEWEB)

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks, T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.
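At its core, the marker-gene approach described above reduces to assigning each recovered marker sequence to a reference clade and tallying relative abundances. A minimal sketch with hypothetical best-hit assignments (the clade labels and counts are invented):

```python
from collections import Counter

# Hypothetical best-hit clade assignments for marker genes recovered
# from a shotgun dataset (one entry per marker-gene sequence).
hits = ["Proteobacteria"] * 50 + ["Firmicutes"] * 30 + ["Actinobacteria"] * 20

counts = Counter(hits)
total = sum(counts.values())
# Relative abundance per clade, free of PCR amplification bias because
# the counts come directly from shotgun reads rather than amplicons.
composition = {clade: n / total for clade, n in counts.items()}
print(composition)
```

The published method additionally places each marker gene on a reference phylogeny, so abundances can be aggregated at any taxonomic depth rather than at fixed clade labels.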

  16. Diversity of thermophiles in a Malaysian hot spring determined using 16S rRNA and shotgun metagenome sequencing.

    Science.gov (United States)

    Chan, Chia Sing; Chan, Kok-Gan; Tay, Yea-Ling; Chua, Yi-Heng; Goh, Kian Mau

    2015-01-01

    The Sungai Klah (SK) hot spring is the second hottest geothermal spring in Malaysia. This hot spring is a shallow, 150-m-long, fast-flowing stream, with temperatures varying from 50 to 110°C and a pH range of 7.0-9.0. Hidden within a wooded area, the SK hot spring is continually fed by plant litter, resulting in a relatively high degree of total organic content (TOC). In this study, a sample taken from the middle of the stream was analyzed at the 16S rRNA V3-V4 region by amplicon metagenome sequencing. Over 35 phyla were detected by analyzing the 16S rRNA data. Firmicutes and Proteobacteria represented approximately 57% of the microbiome. Approximately 70% of the detected thermophiles were strict anaerobes; however, Hydrogenobacter spp., obligate chemolithotrophic thermophiles, represented one of the major taxa. Several thermophilic photosynthetic microorganisms and acidothermophiles were also detected. Most of the phyla identified by 16S rRNA were also found using the shotgun metagenome approaches. The carbon, sulfur, and nitrogen metabolism within the SK hot spring community were evaluated by shotgun metagenome sequencing, and the data revealed diversity in terms of metabolic activity and dynamics. This hot spring has a rich diversified phylogenetic community partly due to its natural environment (plant litter, high TOC, and a shallow stream) and geochemical parameters (broad temperature and pH range). It is speculated that symbiotic relationships occur between the members of the community.

  17. Data on xylem sap proteins from Mn- and Fe-deficient tomato plants obtained using shotgun proteomics.

    Science.gov (United States)

    Ceballos-Laita, Laura; Gutierrez-Carbonell, Elain; Takahashi, Daisuke; Abadía, Anunciación; Uemura, Matsuo; Abadía, Javier; López-Millán, Ana Flor

    2018-04-01

    This article contains consolidated proteomic data obtained from xylem sap collected from tomato plants grown under Fe- and Mn-sufficient control conditions, as well as under Fe-deficient and Mn-deficient conditions. Data presented here cover proteins identified and quantified by shotgun proteomics and Progenesis LC-MS analyses: proteins identified with at least two peptides and showing changes statistically significant (ANOVA; p ≤ 0.05) and above a biologically relevant threshold (fold ≥ 2) between treatments are listed. The comparison between Fe-deficient, Mn-deficient and control xylem sap samples using a multivariate statistical data analysis (Principal Component Analysis, PCA) is also included. Data included in this article are discussed in depth in the research article entitled "Effects of Fe and Mn deficiencies on the protein profiles of tomato (Solanum lycopersicum) xylem sap as revealed by shotgun analyses" [1]. This dataset is made available to support the cited study as well as to extend analyses at a later stage.
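The selection criteria described above (at least two peptides, ANOVA p ≤ 0.05, at least a 2-fold change in either direction) can be expressed as a simple filter; the record layout and values below are hypothetical:

```python
def passes(protein, p_cut=0.05, fold_cut=2.0, min_peptides=2):
    """Apply the listing criteria: >=2 peptides, ANOVA p <= 0.05,
    and at least a 2-fold change (up or down) between treatments."""
    return (protein["peptides"] >= min_peptides
            and protein["anova_p"] <= p_cut
            and max(protein["fold"], 1 / protein["fold"]) >= fold_cut)

candidates = [
    {"id": "P1", "peptides": 3, "anova_p": 0.01, "fold": 2.5},  # keep
    {"id": "P2", "peptides": 1, "anova_p": 0.01, "fold": 3.0},  # one peptide only
    {"id": "P3", "peptides": 4, "anova_p": 0.20, "fold": 2.2},  # not significant
    {"id": "P4", "peptides": 2, "anova_p": 0.03, "fold": 0.4},  # 2.5-fold down, keep
]
kept = [p["id"] for p in candidates if passes(p)]
print(kept)  # -> ['P1', 'P4']
```

Note that the down-regulated protein P4 passes because the fold-change threshold is applied symmetrically (fold ≥ 2 or ≤ 0.5).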

  18. Effects of Fe and Mn deficiencies on the protein profiles of tomato (Solanum lycopersicum) xylem sap as revealed by shotgun analyses

    Science.gov (United States)

    The aim of this work was to study the effects of Fe and Mn deficiencies on the xylem sap proteome of tomato using a shotgun proteomic approach, with the final goal of elucidating plant response mechanisms to these stresses. This approach yielded 643 proteins reliably identified and quantified with 7...

  19. Identification of Proteins Related to Epigenetic Regulation in the Malignant Transformation of Aberrant Karyotypic Human Embryonic Stem Cells by Quantitative Proteomics

    Science.gov (United States)

    Sun, Yi; Yang, Yixuan; Zeng, Sicong; Tan, Yueqiu; Lu, Guangxiu; Lin, Ge

    2014-01-01

    Previous reports have demonstrated that human embryonic stem cells (hESCs) tend to develop genomic alterations and progress to a malignant state during long-term in vitro culture. This raises concerns about the clinical safety of using cultured hESCs. However, transformed hESCs might serve as an excellent model to determine the process of embryonic stem cell transition. In this study, iTRAQ-based tandem mass spectrometry was used to quantify proteins in normal hESCs and in aberrant karyotypic hESCs ranging from simple to more complex karyotypic abnormalities. We identified and quantified 2583 proteins, and found that the expression levels of 316 proteins, representing at least 23 functional molecular groups, differed significantly between normal and abnormal hESCs. Dysregulated expression of proteins involved in epigenetic regulation was further verified in six pairs of hESC lines at early and late passage. In summary, this study is the first large-scale quantitative proteomic analysis of the malignant transformation of aberrant karyotypic hESCs. The data generated should serve as a useful reference for stem cell-derived tumor progression. Increased expression of both HDAC2 and CTNNB1 is detected as early as the pre-neoplastic stage, and these proteins might serve as prognostic markers in the malignant transformation of hESCs. PMID:24465727

  20. Application of whole genome shotgun sequencing for detection and characterization of genetically modified organisms and derived products.

    Science.gov (United States)

    Holst-Jensen, Arne; Spilsberg, Bjørn; Arulandhu, Alfred J; Kok, Esther; Shi, Jianxin; Zel, Jana

    2016-07-01

    The emergence of high-throughput, massive or next-generation sequencing technologies has created a completely new foundation for molecular analyses. Various selective enrichment processes are commonly applied to facilitate detection of predefined (known) targets. Such approaches, however, inevitably introduce a bias and are prone to miss unknown targets. Here we review the application of high-throughput sequencing technologies and the preparation of fit-for-purpose whole genome shotgun sequencing libraries for the detection and characterization of genetically modified and derived products. The potential impact of these new sequencing technologies for the characterization, breeding selection, risk assessment, and traceability of genetically modified organisms and genetically modified products is yet to be fully acknowledged. The published literature is reviewed, and the prospects for future developments and use of the new sequencing technologies for these purposes are discussed.

  1. Use of Metagenomic Shotgun Sequencing Technology To Detect Foodborne Pathogens within the Microbiome of the Beef Production Chain.

    Science.gov (United States)

    Yang, Xiang; Noyes, Noelle R; Doster, Enrique; Martin, Jennifer N; Linke, Lyndsey M; Magnuson, Roberta J; Yang, Hua; Geornaras, Ifigenia; Woerner, Dale R; Jones, Kenneth L; Ruiz, Jaime; Boucher, Christina; Morley, Paul S; Belk, Keith E

    2016-04-01

    Foodborne illnesses associated with pathogenic bacteria are a global public health and economic challenge. The diversity of microorganisms (pathogenic and nonpathogenic) that exists within the food and meat industries complicates efforts to understand pathogen ecology. Further, little is known about the interaction of pathogens within the microbiome throughout the meat production chain. Here, a metagenomic approach and shotgun sequencing technology were used as tools to detect pathogenic bacteria in environmental samples collected from the same groups of cattle at different longitudinal processing steps of the beef production chain: cattle entry to feedlot, exit from feedlot, cattle transport trucks, abattoir holding pens, and the end of the fabrication system. The log read counts classified as pathogens per million reads for Salmonella enterica, Listeria monocytogenes, Escherichia coli, Staphylococcus aureus, Clostridium spp. (C. botulinum and C. perfringens), and Campylobacter spp. (C. jejuni, C. coli, and C. fetus) decreased over successive processing steps. Furthermore, the normalized read counts for S. enterica, E. coli, and C. botulinum were greater in the final product than at the feedlots, indicating that the proportion of these bacteria increased (the effect on absolute numbers was unknown) within the remaining microbiome. From an ecological perspective, the data indicated that shotgun metagenomics can be used to evaluate not only the microbiome but also shifts in pathogen populations during beef production. Nonetheless, there were several challenges in this analysis approach, one of the main ones being the identification of the specific pathogen from which the sequence reads originated, which makes this approach impractical for use in pathogen identification for regulatory and confirmation purposes. Copyright © 2016 Yang et al.
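The normalization referred to above (log read counts classified as pathogens per million sequenced reads) can be sketched as follows; the counts are invented for illustration:

```python
import math

def log_rpm(pathogen_reads, total_reads):
    """Log10 of pathogen-classified reads per million sequenced reads,
    a depth normalization that makes samples of different sequencing
    depth comparable."""
    return math.log10(pathogen_reads / total_reads * 1e6)

# Hypothetical counts for one pathogen at two processing steps:
feedlot = log_rpm(pathogen_reads=5000, total_reads=20_000_000)   # 250 RPM
final   = log_rpm(pathogen_reads=400,  total_reads=10_000_000)   #  40 RPM
print(round(feedlot, 2), round(final, 2))  # -> 2.4 1.6
```

Because the measure is a proportion of the sequenced reads, a rise between steps (as reported for S. enterica, E. coli and C. botulinum) says nothing about absolute cell numbers, only about the pathogen's share of the remaining microbiome.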

  2. Pigs in sequence space: A 0.66X coverage pig genome survey based on shotgun sequencing

    Directory of Open Access Journals (Sweden)

    Li Wei

    2005-05-01

    Full Text Available Abstract Background Comparative whole genome analysis of Mammalia can benefit from the addition of more species. The pig is an obvious choice due to its economic and medical importance as well as its evolutionary position in the artiodactyls. Results We have generated ~3.84 million shotgun sequences (0.66X coverage) from the pig genome. The data are hereby released (NCBI Trace repository with center name "SDJVP" and project name "Sino-Danish Pig Genome Project") together with an initial evolutionary analysis. The non-repetitive fraction of the sequences was aligned to the UCSC human-mouse alignment and the resulting three-species alignments were annotated using the human genome annotation. Ultra-conserved elements and miRNAs were identified. The results show that for each of these types of orthologous data, pig is much closer to human than mouse is. Purifying selection has been more efficient in pig compared to human, but not as efficient as in mouse, and pig seems to have an isochore structure most similar to the structure in human. Conclusion The addition of the pig to the set of species sequenced at low coverage adds to the understanding of selective pressures that have acted on the human genome by bisecting the evolutionary branch between human and mouse, with the mouse branch being approximately 3 times as long as the human branch. Additionally, the joint alignment of the shotgun sequences to the human-mouse alignment offers the investigator a rapid way of defining specific regions for analysis and resequencing.
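The relationship between read count and coverage can be checked with the Lander-Waterman expectation. In the sketch below, the mean read length and genome size are assumptions chosen to reproduce the reported ~0.66X, not values from the paper:

```python
import math

reads = 3.84e6      # shotgun reads reported above
read_len = 465      # assumed mean read length (hypothetical)
genome = 2.7e9      # approximate pig genome size (assumption)

# Coverage c = N * L / G; expected fraction of the genome hit at least
# once is 1 - e^(-c) under the Lander-Waterman model.
coverage = reads * read_len / genome
covered_fraction = 1 - math.exp(-coverage)
print(f"{coverage:.2f}X, ~{covered_fraction:.0%} of genome touched")
```

This is why a 0.66X survey still touches nearly half the genome: even at sub-1X depth, random read placement leaves most positions with a nonzero chance of being sampled.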

  3. Proteomic analysis of protein interactions between Eimeria maxima sporozoites and chicken jejunal epithelial cells by shotgun LC-MS/MS.

    Science.gov (United States)

    Huang, Jingwei; Liu, Tingqi; Li, Ke; Song, Xiaokai; Yan, Ruofeng; Xu, Lixin; Li, Xiangrui

    2018-04-04

    Eimeria maxima initiates infection by invading the jejunal epithelial cells of chicken. However, the proteins involved in invasion remain unknown. Research on the molecules that participate in the interactions between E. maxima sporozoites and host target cells will fill a gap in our understanding of the invasion system of this parasitic pathogen. In the present study, chicken jejunal epithelial cells were isolated and cultured in vitro. Western blot was employed to analyze the soluble proteins of E. maxima sporozoites that bound to chicken jejunal epithelial cells. A co-immunoprecipitation (co-IP) assay was used to separate the E. maxima proteins that bound to chicken jejunal epithelial cells. The shotgun LC-MS/MS technique was used for proteomic identification and Gene Ontology was employed for the bioinformatics analysis. The results of Western blot analysis showed that four protein bands from jejunal epithelial cells co-cultured with soluble proteins of E. maxima sporozoites were recognized by the positive sera, with molecular weights of 70, 90, 95 and 130 kDa. The co-IP eluates were analyzed by shotgun LC-MS/MS. A total of 204 proteins were identified in the E. maxima protein database using the MASCOT search engine. Thirty-five proteins, including microneme proteins 3 and 7, had more than two unique peptide counts and were annotated using Gene Ontology for molecular function, biological process and cellular localization. The results revealed that of the 35 annotated peptides, 22 (62.86%) were associated with binding activity and 15 (42.86%) were involved in catalytic activity. Our findings provide insight into the interaction between E. maxima and the corresponding host cells, which is important for understanding the molecular mechanisms underlying E. maxima invasion.

  4. Diversity of thermophiles in a Malaysian hot spring determined using 16S rRNA and shotgun metagenome sequencing

    Directory of Open Access Journals (Sweden)

    Chia Sing eChan

    2015-03-01

    Full Text Available The Sungai Klah (SK) hot spring is the second hottest geothermal spring in Malaysia. This hot spring is a shallow, 150-meter-long, fast-flowing stream, with temperatures varying from 50 to 110°C and a pH range of 7.0 to 9.0. Hidden within a wooded area, the SK hot spring is continually fed by plant litter, resulting in a relatively high degree of total organic content (TOC). In this study, a sample taken from the middle of the stream was analyzed at the 16S rRNA V3−V4 region by amplicon metagenome sequencing. Over 35 phyla were detected by analyzing the 16S rRNA data. Firmicutes and Proteobacteria represented approximately 57% of the microbiome. Approximately 70% of the detected thermophiles were strict anaerobes; however, Hydrogenobacter spp., obligate chemolithotrophic thermophiles, represented one of the major taxa. Several thermophilic photosynthetic microorganisms and acidothermophiles were also detected. Most of the phyla identified by 16S rRNA were also found using the shotgun metagenome approaches. The carbon, sulfur, and nitrogen metabolism within the SK hot spring community were evaluated by shotgun metagenome sequencing, and the data revealed diversity in terms of metabolic activity and dynamics. This hot spring has a rich diversified phylogenetic community partly due to its natural environment (plant litter, high TOC, and a shallow stream) and geochemical parameters (broad temperature and pH range). It is speculated that symbiotic relationships occur between the members of the community.

  5. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    Science.gov (United States)

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses, driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html) and is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.

  6. Shotgun metagenomic data on the human stool samples to characterize shifts of the gut microbial profile after the Helicobacter pylori eradication therapy

    Directory of Open Access Journals (Sweden)

    Eugenia A. Boulygina

    2017-10-01

    Full Text Available The shotgun sequencing data presented in this report are related to the research article entitled “Gut microbiome shotgun sequencing in assessment of microbial community changes associated with H. pylori eradication therapy” (Khusnutdinova et al., 2016) [1]. Typically, the H. pylori eradication protocol includes a prolonged two-week use of broad-spectrum antibiotics. The presented data on the whole-genome sequencing of total DNA from stool samples of patients before the start of eradication, immediately after eradication and several weeks after the end of treatment could help to profile the gut microbiota both taxonomically and functionally. The presented data, together with those described in Glushchenko et al. (2017) [2], allow researchers to characterize the metagenomic profiles in which the use of antibiotics could result in dramatic changes in the intestinal microbiota composition. We present 15 gut metagenomes from 5 patients with H. pylori infection, obtained through shotgun sequencing on the SOLiD 5500 W platform. Raw reads are deposited in the ENA under project ID PRJEB21338.

  7. Application of the whole-transcriptome shotgun sequencing approach to the study of Philadelphia-positive acute lymphoblastic leukemia

    International Nuclear Information System (INIS)

    Iacobucci, I; Ferrarini, A; Sazzini, M; Giacomelli, E; Lonetti, A; Xumerle, L; Ferrari, A; Papayannidis, C; Malerba, G; Luiselli, D; Boattini, A; Garagnani, P; Vitale, A; Soverini, S; Pane, F; Baccarani, M; Delledonne, M; Martinelli, G

    2012-01-01

    Although the pathogenesis of BCR–ABL1-positive acute lymphoblastic leukemia (ALL) is mainly related to the expression of the BCR–ABL1 fusion transcript, additional cooperating genetic lesions are supposed to be involved in its development and progression. Therefore, in an attempt to investigate the complex landscape of mutations, changes in expression profiles and alternative splicing (AS) events that can be observed in such disease, the leukemia transcriptome of a BCR–ABL1-positive ALL patient at diagnosis and at relapse was sequenced using a whole-transcriptome shotgun sequencing (RNA-Seq) approach. A total of 13.9 and 15.8 million sequence reads was generated from de novo and relapsed samples, respectively, and aligned to the human genome reference sequence. This led to the identification of five validated missense mutations in genes involved in metabolic processes (DPEP1, TMEM46), transport (MVP), cell cycle regulation (ABL1) and catalytic activity (CTSZ), two of which resulted in acquired relapse variants. In all, 6390 and 4671 putative AS events were also detected, as well as expression levels for 18 315 and 18 795 genes, 28% of which were differentially expressed in the two disease phases. These data demonstrate that RNA-Seq is a suitable approach for identifying a wide spectrum of genetic alterations potentially involved in ALL

  8. Seasonal changes in the communities of photosynthetic picoeukaryotes in Ofunato Bay as revealed by shotgun metagenomic sequencing

    KAUST Repository

    Rashid, Jonaira

    2018-04-30

    Small photosynthetic eukaryotes play important roles in oceanic food webs in coastal regions. We investigated seasonal changes in the communities of photosynthetic picoeukaryotes (PPEs) of the class Mamiellophyceae, including the genera Bathycoccus, Micromonas and Ostreococcus, in Ofunato Bay, which is located in northeastern Japan and faces the Pacific Ocean. The abundances of PPEs were assessed over a period of one year in 2015 at three sampling stations, KSt. 1 (innermost bay area), KSt. 2 (middle bay area) and KSt. 3 (bay entrance area) at depths of 1 m (KSt. 1, KSt. 2 and KSt. 3), 8 m (KSt. 1) or 10 m (KSt. 2 and KSt. 3) by employing MiSeq shotgun metagenomic sequencing. The total abundances of Bathycoccus, Ostreococcus and Micromonas were in the ranges of 42–49%, 35–49% and 13–17%, respectively. Considering all assayed sampling stations and depths, seasonal changes revealed high abundances of PPEs during the winter and summer and low abundances during late winter to early spring and late summer to early autumn. Bathycoccus was most abundant in the winter, and Ostreococcus showed a high abundance during the summer. Another genus, Micromonas, was relatively low in abundance throughout the study period. Taken together with previously suggested blooming periods of phytoplankton, as revealed by chlorophyll a concentrations in Ofunato Bay during spring and late autumn, these results for PPEs suggest that greater phytoplankton blooming has a negative influence on the seasonal occurrences of PPEs in the bay.

  9. Streamlined Membrane Proteome Preparation for Shotgun Proteomics Analysis with Triton X-100 Cloud Point Extraction and Nanodiamond Solid Phase Extraction

    Directory of Open Access Journals (Sweden)

    Minh D. Pham

    2016-05-01

    Full Text Available While mass spectrometry (MS) plays a key role in proteomics research, characterization of membrane proteins (MPs) by MS has been a challenging task because of the host of interfering chemicals present in the hydrophobic protein extraction process and the low protease digestion efficiency. We report a sample preparation protocol, two-phase separation with Triton X-100 induced by NaCl, with Coomassie blue added to visualize the detergent-rich phase, which streamlines MP preparation for SDS-PAGE analysis of intact MPs and shotgun proteomic analyses. MPs solubilized in the detergent-rich milieu were then sequentially extracted and fractionated by surface-oxidized nanodiamond (ND) at three pH values. The high MP affinity of ND enabled extensive washes to remove salts, detergents, lipids and other impurities, so that subsequent steps, notably proteolytic digestion and downstream mass spectrometric (MS) analyses, were not compromised. Starting with a typical membranous cellular lysate fraction harvested by centrifugation/ultracentrifugation, an MP purity of 70%, based on the number (not weight) of proteins identified by MS, was achieved; the weight-based purity can be expected to be much higher.

  10. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Directory of Open Access Journals (Sweden)

    Peter Husen

    Full Text Available Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1. The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  11. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    Science.gov (United States)

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
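    The "database table format" that ALEX exports is essentially a long table with one row per (sample, lipid species) pair, which is what makes flexible filtering and navigation possible. A minimal sketch of that organization and of computing within-sample relative abundances; the sample names, species and intensities below are invented for illustration:

```python
# Hypothetical long-format lipidomics table: one row per (sample, species).
rows = [
    {"sample": "WT_S1BF", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.8e6},
    {"sample": "WT_S1BF", "lipid_class": "PE", "species": "PE 38:4", "intensity": 9.2e5},
    {"sample": "KO_S1BF", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.1e6},
    {"sample": "KO_S1BF", "lipid_class": "PE", "species": "PE 38:4", "intensity": 1.3e6},
]

# Total intensity per sample, then per-row relative abundance (%).
totals = {}
for r in rows:
    totals[r["sample"]] = totals.get(r["sample"], 0.0) + r["intensity"]
for r in rows:
    r["rel_abundance_pct"] = 100 * r["intensity"] / totals[r["sample"]]
```

Because every attribute (sample, class, species) is a column rather than a header, the table can be sliced by any combination of features, which is the flexibility the abstract highlights.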

  12. Seasonal changes in the communities of photosynthetic picoeukaryotes in Ofunato Bay as revealed by shotgun metagenomic sequencing

    KAUST Repository

    Rashid, Jonaira; Kobiyama, Atsushi; Reza, Md. Shaheed; Yamada, Yuichiro; Ikeda, Yuri; Ikeda, Daisuke; Mizusawa, Nanami; Ikeo, Kazuho; Sato, Shigeru; Ogata, Takehiko; Kudo, Toshiaki; Kaga, Shinnosuke; Watanabe, Shiho; Naiki, Kimiaki; Kaga, Yoshimasa; Mineta, Katsuhiko; Bajic, Vladimir B.; Gojobori, Takashi; Watabe, Shugo

    2018-01-01

    Small photosynthetic eukaryotes play important roles in oceanic food webs in coastal regions. We investigated seasonal changes in the communities of photosynthetic picoeukaryotes (PPEs) of the class Mamiellophyceae, including the genera Bathycoccus, Micromonas and Ostreococcus, in Ofunato Bay, which is located in northeastern Japan and faces the Pacific Ocean. The abundances of PPEs were assessed over a period of one year in 2015 at three sampling stations, KSt. 1 (innermost bay area), KSt. 2 (middle bay area) and KSt. 3 (bay entrance area) at depths of 1 m (KSt. 1, KSt. 2 and KSt. 3), 8 m (KSt. 1) or 10 m (KSt. 2 and KSt. 3) by employing MiSeq shotgun metagenomic sequencing. The total abundances of Bathycoccus, Ostreococcus and Micromonas were in the ranges of 42–49%, 35–49% and 13–17%, respectively. Considering all assayed sampling stations and depths, seasonal changes revealed high abundances of PPEs during the winter and summer and low abundances during late winter to early spring and late summer to early autumn. Bathycoccus was most abundant in the winter, and Ostreococcus showed a high abundance during the summer. Another genus, Micromonas, was relatively low in abundance throughout the study period. Taken together with previously suggested blooming periods of phytoplankton, as revealed by chlorophyll a concentrations in Ofunato Bay during spring and late autumn, these results for PPEs suggest that greater phytoplankton blooming has a negative influence on the seasonal occurrences of PPEs in the bay.

  13. Discovering and differentiating new and emerging clonal populations of Chlamydia trachomatis with a novel shotgun cell culture harvest assay.

    Science.gov (United States)

    Somboonna, Naraporn; Mead, Sally; Liu, Jessica; Dean, Deborah

    2008-03-01

    Chlamydia trachomatis is the leading cause of preventable blindness and bacterial sexually transmitted diseases worldwide. Plaque assays have been used to clonally segregate laboratory-adapted C. trachomatis strains from mixed infections, but no assays have been reported to segregate clones from recent clinical samples. We developed a novel shotgun cell culture harvest assay for this purpose because we found that recent clinical samples do not form plaques. Clones were strain-typed by using outer membrane protein A and 16S rRNA sequences. Surprisingly, ocular trachoma reference strain A/SA-1 contained clones of Chlamydophila abortus. C. abortus primarily infects ruminants and pigs and has never been identified in populations where trachoma is endemic. Three clonal variants of reference strain Ba/Apache-2 were also identified. Our findings reflect the importance of clonal isolation in identifying constituents of mixed infections containing new or emerging strains and of viable clones for research to more fully understand the dynamics of in vivo strain-mixing, evolution, and disease pathogenesis.

  14. Nonoperative Management of Multiple Penetrating Cardiac and Colon Wounds from a Shotgun: A Case Report and Literature Review

    Directory of Open Access Journals (Sweden)

    Paula M. Jaramillo

    2018-01-01

    Full Text Available Introduction. Cardiac trauma is often considered fatal, and wounds of the colon are normally managed surgically because of the associated risk of sepsis; however, conservative management of many traumatic lesions of different injured organs has progressed over the years. Presentation of the Case. A 65-year-old male patient presented with multiple shotgun wounds on the left upper limb, thorax, and abdomen. On evaluation, he was hemodynamically stable with normal sinus rhythm and normal blood pressure, and had no dyspnea or abdominal pain. Computed tomography (CT) of the chest showed a hematoma around the aorta without injury to the vessel wall and an intramyocardial projectile without pericardial effusion. CT of the abdomen showed endoluminal pellets in the transverse and descending colon without extravasation of contrast medium or intra-abdominal fluid. The patient remained hemodynamically stable, and nonsurgical management was established. Discussion. Patients with asymptomatic intramyocardial projectiles can be safely managed without surgery. Nonsurgical management of colon trauma is possible only in asymptomatic, very carefully selected patients under close surveillance, since standard management is surgery. Conclusion. Nonsurgical management of cardiac trauma, as well as penetrating colon trauma, can be performed in carefully selected patients with proper clinical follow-up, imaging, and laboratory studies.

  15. Initial characterization of the large genome of the salamander Ambystoma mexicanum using shotgun and laser capture chromosome sequencing.

    Science.gov (United States)

    Keinath, Melissa C; Timoshevskiy, Vladimir A; Timoshevskaya, Nataliya Y; Tsonis, Panagiotis A; Voss, S Randal; Smith, Jeramiah J

    2015-11-10

    Vertebrates exhibit substantial diversity in genome size, and some of the largest genomes exist in species that uniquely inform diverse areas of basic and biomedical research. For example, the salamander Ambystoma mexicanum (the Mexican axolotl) is a model organism for studies of regeneration, development and genome evolution, yet its genome is ~10× larger than the human genome. As part of a hierarchical approach toward improving genome resources for the species, we generated 600 Gb of shotgun sequence data and developed methods for sequencing individual laser-captured chromosomes. Based on these data, we estimate that the A. mexicanum genome is ~32 Gb. Notably, as much as 19 Gb of the A. mexicanum genome can potentially be considered single copy, which presumably reflects the evolutionary diversification of mobile elements that accumulated during an ancient episode of genome expansion. Chromosome-targeted sequencing permitted the development of assemblies within the constraints of modern computational platforms, allowed us to place 2062 genes on the two smallest A. mexicanum chromosomes and resolves key events in the history of vertebrate genome evolution. Our analyses show that the capture and sequencing of individual chromosomes is likely to provide valuable information for the systematic sequencing, assembly and scaffolding of large genomes.
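    A shotgun-based genome size estimate of the kind reported above follows the Lander–Waterman relation: if N sequenced bases sample the genome at an average depth c, the genome size is roughly N/c. A toy sketch assuming a hypothetical mean depth (the paper's actual estimator may differ, e.g. it may be k-mer based):

```python
# Coverage-based genome size estimate: G ~= N / c, where N is the total
# number of sequenced bases and c the average sequencing depth.
total_sequenced_bases = 600e9  # 600 Gb of shotgun data (from the abstract)
assumed_mean_depth = 18.75     # hypothetical average coverage, for illustration

genome_size_gb = total_sequenced_bases / assumed_mean_depth / 1e9
print(f"Estimated genome size: {genome_size_gb:.0f} Gb")  # prints "Estimated genome size: 32 Gb"
```

With these invented depth numbers the estimate reproduces the ~32 Gb figure; in practice the depth itself must be inferred, e.g. from the modal k-mer coverage.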

  16. Shotgun proteomics deciphered age/division of labor-related functional specification of three honeybee (Apis mellifera L.) exocrine glands.

    Science.gov (United States)

    Fujita, Toshiyuki; Kozuka-Hata, Hiroko; Hori, Yutaro; Takeuchi, Jun; Kubo, Takeo; Oyama, Masaaki

    2018-01-01

    The honeybee (Apis mellifera L.) uses various chemical signals produced by the worker exocrine glands to maintain the functioning of its colony. The roles of worker postcerebral glands (PcGs), thoracic glands (TGs), and mandibular glands (MGs) and the functional changes they undergo according to the division of labor from nursing to foraging are not as well studied. To comprehensively characterize the molecular roles of these glands in workers and their changes according to the division of labor of workers, we analyzed the proteomes of PcGs, TGs, and MGs from nurse bees and foragers using shotgun proteomics technology. We identified approximately 2000 proteins from each of the nurse bee or forager glands and highlighted the features of these glands at the molecular level by semiquantitative enrichment analyses of frequently detected, gland-selective, and labor-selective proteins. First, we found the high potential to produce lipids in PcGs and MGs, suggesting their relation to pheromone production. Second, we also found the proton pumps abundant in TGs and propose some transporters possibly related to the saliva production. Finally, our data unveiled candidate enzymes involved in labor-dependent acid production in MGs.

  17. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  18. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
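    The proposed watts-per-organism framework reduces to comparing power supply against power demand per organism. A minimal sketch with invented numbers, purely to illustrate the habitability continuum described above:

```python
# Habitability in watts per organism: an environment is habitable when the
# power supply available to an organism exceeds its maintenance power demand.
# All numbers below are hypothetical illustrations, not data from the article.

def habitability_margin(supply_w_per_organism: float, demand_w_per_organism: float) -> float:
    """Surplus power per organism; negative values mean uninhabitable."""
    return supply_w_per_organism - demand_w_per_organism

# A "plush" environment: supply far exceeds demand.
plush = habitability_margin(supply_w_per_organism=1e-9, demand_w_per_organism=1e-12)
# An "extreme" environment: supply only barely exceeds demand.
extreme = habitability_margin(supply_w_per_organism=1.1e-12, demand_w_per_organism=1e-12)

print(plush, extreme)
```

Plush and extreme environments thus sit on a single quantitative scale, differing only in the size of the power surplus.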

  19. Unbiased RNA Shotgun Metagenomics in Social and Solitary Wild Bees Detects Associations with Eukaryote Parasites and New Viruses.

    Directory of Open Access Journals (Sweden)

    Karel Schoonvaere

    Full Text Available The diversity of eukaryote organisms and viruses associated with wild bees remains poorly characterized in contrast to the well-documented pathosphere of the western honey bee, Apis mellifera. Using a deliberate RNA shotgun metagenomic sequencing strategy in combination with a dedicated bioinformatics workflow, we identified the (micro-)organisms and viruses associated with two bumble bee hosts, Bombus terrestris and Bombus pascuorum, and two solitary bee hosts, Osmia cornuta and Andrena vaga. Ion Torrent semiconductor sequencing generated approximately 3.8 million high quality reads. The most significant eukaryote associations were two protozoans, Apicystis bombi and Crithidia bombi, and one nematode parasite, Sphaerularia bombi, in bumble bees. The trypanosome protozoan C. bombi was also found in the solitary bee O. cornuta. Next to the identification of three honey bee viruses (Black queen cell virus, Sacbrood virus and Varroa destructor virus-1) and four plant viruses, we describe two novel RNA viruses, Scaldis River bee virus (SRBV) and Ganda bee virus (GABV), based on their partial genomic sequences. The novel viruses belong to the class of negative-sense RNA viruses; SRBV is related to the order Mononegavirales, whereas GABV is related to the family Bunyaviridae. The potential biological role of both viruses in bees is discussed in the context of recent advances in the field of arthropod viruses. Further, fragmentary sequence evidence for other undescribed viruses is presented, among which a nudivirus in O. cornuta and an unclassified virus related to Chronic bee paralysis virus in B. terrestris. Our findings extend the current knowledge of wild bee parasites in general and add to the growing evidence of unexplored arthropod viruses in valuable insects.

  20. Characterization of Foodborne Strains of Staphylococcus aureus by Shotgun Proteomics: Functional Networks, Virulence Factors and Species-Specific Peptide Biomarkers

    Science.gov (United States)

    Carrera, Mónica; Böhme, Karola; Gallardo, José M.; Barros-Velázquez, Jorge; Cañas, Benito; Calo-Mata, Pilar

    2017-01-01

    In the present work, we applied a shotgun proteomics approach for the fast and easy characterization of 20 different foodborne strains of Staphylococcus aureus (S. aureus), one of the most recognized foodborne pathogenic bacteria. A total of 644 non-redundant proteins were identified and analyzed via an easy and rapid protein sample preparation procedure. The results allowed the differentiation of several proteome datasets from the different strains (common, accessory, and unique datasets), which were used to determine relevant functional pathways and differentiate the strains into different Euclidean hierarchical clusters. Moreover, a predicted protein-protein interaction network of the foodborne S. aureus strains was created. The whole confidence network contains 77 nodes and 769 interactions. Most of the identified proteins were surface-associated proteins that were related to pathways and networks of energy, lipid metabolism and virulence. Twenty-seven virulence factors were identified, and most of them corresponded to autolysins, N-acetylmuramoyl-L-alanine amidases, phenol-soluble modulins, extracellular fibrinogen-binding proteins and virulence factor EsxA. Potential species-specific peptide biomarkers were screened. Twenty-one species-specific peptide biomarkers, belonging to eight different proteins (nickel-ABC transporter, N-acetylmuramoyl-L-alanine amidase, autolysin, clumping factor A, gram-positive signal peptide YSIRK, cysteine protease/staphopain, transcriptional regulator MarR, and transcriptional regulator Sar-A), were proposed to identify S. aureus. These results constitute the first major dataset of peptides and proteins of foodborne S. aureus strains. This repository may be useful for further studies, for the development of new therapeutic treatments for S. aureus food intoxications and for microbial source-tracking in foodstuffs. PMID:29312172
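    Screening for species-specific peptide biomarkers, as described above, amounts to keeping peptides observed in the target species and in no other. A minimal set-based sketch with invented peptide sequences (not the study's biomarkers):

```python
# Hypothetical peptide observations per species; all sequences are invented.
peptides_by_species = {
    "S. aureus":      {"LVDFSK", "AGNTTER", "QWERTK"},
    "S. epidermidis": {"LVDFSK", "MNPQLK"},
    "E. coli":        {"AGNTTER", "TTYYRK"},
}

target = "S. aureus"
# Pool every peptide seen in any non-target species.
others = set().union(*(p for sp, p in peptides_by_species.items() if sp != target))

# Candidate biomarkers: peptides seen in the target and never elsewhere.
biomarkers = peptides_by_species[target] - others
print(sorted(biomarkers))  # prints ['QWERTK']
```

In practice the "other species" pool is a large sequence database, and candidates are further checked by BLAST-style searches before being proposed as markers.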

  1. Prediction of the neuropeptidomes of members of the Astacidea (Crustacea, Decapoda) using publicly accessible transcriptome shotgun assembly (TSA) sequence data.

    Science.gov (United States)

    Christie, Andrew E; Chi, Megan

    2015-12-01

    The decapod infraorder Astacidea is comprised of clawed lobsters and freshwater crayfish. Due to their economic importance and their use as models for investigating neurochemical signaling, much work has focused on elucidating their neurochemistry, particularly their peptidergic systems. Interestingly, no astacidean has been the subject of large-scale peptidomic analysis via in silico transcriptome mining, this despite growing transcriptomic resources for members of this taxon. Here, the publicly accessible astacidean transcriptome shotgun assembly data were mined for putative peptide-encoding transcripts; these sequences were used to predict the structures of mature neuropeptides. One hundred seventy-six distinct peptides were predicted for Procambarus clarkii, including isoforms of adipokinetic hormone-corazonin-like peptide (ACP), allatostatin A (AST-A), allatostatin B, allatostatin C (AST-C) bursicon α, bursicon β, CCHamide, crustacean hyperglycemic hormone (CHH)/ion transport peptide (ITP), diuretic hormone 31 (DH31), eclosion hormone (EH), FMRFamide-like peptide, GSEFLamide, intocin, leucokinin, neuroparsin, neuropeptide F, pigment dispersing hormone, pyrokinin, RYamide, short neuropeptide F (sNPF), SIFamide, sulfakinin and tachykinin-related peptide (TRP). Forty-six distinct peptides, including isoforms of AST-A, AST-C, bursicon α, CCHamide, CHH/ITP, DH31, EH, intocin, myosuppressin, neuroparsin, red pigment concentrating hormone, sNPF and TRP, were predicted for Pontastacus leptodactylus, with a bursicon β and a neuroparsin predicted for Cherax quadricarinatus. The identification of ACP is the first from a decapod, while the predictions of CCHamide, EH, GSEFLamide, intocin, neuroparsin and RYamide are firsts for the Astacidea. Collectively, these data greatly expand the catalog of known astacidean neuropeptides and provide a foundation for functional studies of peptidergic signaling in members of this decapod infraorder.

  2. Unbiased RNA Shotgun Metagenomics in Social and Solitary Wild Bees Detects Associations with Eukaryote Parasites and New Viruses.

    Science.gov (United States)

    Schoonvaere, Karel; De Smet, Lina; Smagghe, Guy; Vierstraete, Andy; Braeckman, Bart P; de Graaf, Dirk C

    2016-01-01

    The diversity of eukaryote organisms and viruses associated with wild bees remains poorly characterized in contrast to the well-documented pathosphere of the western honey bee, Apis mellifera. Using a deliberate RNA shotgun metagenomic sequencing strategy in combination with a dedicated bioinformatics workflow, we identified the (micro-)organisms and viruses associated with two bumble bee hosts, Bombus terrestris and Bombus pascuorum, and two solitary bee hosts, Osmia cornuta and Andrena vaga. Ion Torrent semiconductor sequencing generated approximately 3.8 million high quality reads. The most significant eukaryote associations were two protozoans, Apicystis bombi and Crithidia bombi, and one nematode parasite, Sphaerularia bombi, in bumble bees. The trypanosome protozoan C. bombi was also found in the solitary bee O. cornuta. Next to the identification of three honey bee viruses (Black queen cell virus, Sacbrood virus and Varroa destructor virus-1) and four plant viruses, we describe two novel RNA viruses, Scaldis River bee virus (SRBV) and Ganda bee virus (GABV), based on their partial genomic sequences. The novel viruses belong to the class of negative-sense RNA viruses; SRBV is related to the order Mononegavirales, whereas GABV is related to the family Bunyaviridae. The potential biological role of both viruses in bees is discussed in the context of recent advances in the field of arthropod viruses. Further, fragmentary sequence evidence for other undescribed viruses is presented, among which a nudivirus in O. cornuta and an unclassified virus related to Chronic bee paralysis virus in B. terrestris. Our findings extend the current knowledge of wild bee parasites in general and add to the growing evidence of unexplored arthropod viruses in valuable insects.

  3. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  4. Transposon fingerprinting using low coverage whole genome shotgun sequencing in cacao (Theobroma cacao L.) and related species.

    Science.gov (United States)

    Sveinsson, Saemundur; Gill, Navdeep; Kane, Nolan C; Cronk, Quentin

    2013-07-24

    Transposable elements (TEs) and other repetitive elements are a large and dynamically evolving part of eukaryotic genomes, especially in plants where they can account for a significant proportion of genome size. Their dynamic nature gives them the potential for use in identifying and characterizing crop germplasm. However, their repetitive nature makes them challenging to study using conventional methods of molecular biology. Next generation sequencing and new computational tools have greatly facilitated the investigation of TE variation within species and among closely related species. (i) We generated low-coverage Illumina whole genome shotgun sequencing reads for multiple individuals of cacao (Theobroma cacao) and related species. These reads were analysed using both an alignment/mapping approach and a de novo (graph based clustering) approach. (ii) A standard set of ultra-conserved orthologous sequences (UCOS) standardized TE data between samples and provided phylogenetic information on the relatedness of samples. (iii) The mapping approach proved highly effective within the reference species but underestimated TE abundance in interspecific comparisons relative to the de novo methods. (iv) Individual T. cacao accessions have unique patterns of TE abundance indicating that the TE composition of the genome is evolving actively within this species. (v) LTR/Gypsy elements are the most abundant, comprising c.10% of the genome. (vi) Within T. cacao the retroelement families show an order of magnitude greater sequence variability than the DNA transposon families. (vii) Theobroma grandiflorum has a similar TE composition to T. cacao, but the related genus Herrania is rather different, with LTRs making up a lower proportion of the genome, perhaps because of a massive presence (c. 20%) of distinctive low complexity satellite-like repeats in this genome. (i) Short read alignment/mapping to reference TE contigs provides a simple and effective method of investigating
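    The alignment/mapping approach in (iii) estimates the genome proportion of each TE family from the fraction of shotgun reads mapping to that family's reference contigs. A minimal sketch with hypothetical counts (the c.10% LTR/Gypsy figure is from the abstract; the read counts themselves are invented):

```python
# Hypothetical mapped-read counts per repeat family for one cacao accession.
mapped = {"LTR/Gypsy": 100_000, "LTR/Copia": 40_000, "DNA/hAT": 12_000}
total_reads = 1_000_000  # total low-coverage whole-genome shotgun reads

# Under uniform read sampling, a family's genome proportion is approximately
# its mapped-read fraction. As the abstract notes, mapping to a reference
# from another species underestimates abundance relative to de novo methods.
proportion = {fam: n / total_reads for fam, n in mapped.items()}
print({fam: f"{p:.1%}" for fam, p in proportion.items()})
```

Comparing such per-family profiles across accessions yields the "transposon fingerprints" used to distinguish germplasm.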

  5. Acoustic Shotgun System

    Science.gov (United States)

    2009-09-16

    dispersing a plurality of relatively small, supercavitating projectiles in the water over a wide spatial field at long ranges from an underwater gun ... or surface gun. From the patent's description of the prior art: one major technical challenge related to employing supercavitating projectiles against ... accordingly is more limited. A second problem common to supercavitating projectiles is the configuration of the projectile itself. The primary

  6. Source-pathway-receptor investigation of the fate of trace elements derived from shotgun pellets discharged in terrestrial ecosystems managed for game shooting

    International Nuclear Information System (INIS)

    Sneddon, Jennifer; Clemente, Rafael; Riby, Philip; Lepp, Nicholas W.

    2009-01-01

    Spent shotgun pellets may contaminate terrestrial ecosystems. We examined the fate of elements originating from shotgun pellets in pasture and woodland ecosystems. Two source-receptor pathways: i) soil-soil pore water-plant and ii) whole earthworm/worm gut contents - washed and unwashed small mammal hair were investigated. Concentrations of Pb and associated contaminants were higher in soils from shot areas than controls. Arsenic and lead concentrations were positively correlated in soils, soil pore water and associated biota. Element concentrations in biota were below statutory levels in all locations. Bioavailability of lead to small mammals, based on concentrations in washed body hair was low. Lead movement from soil water to higher trophic levels was minor compared to lead adsorbed onto body surfaces. Lead was concentrated in earthworm gut and some plants. Results indicate that managed game shooting presents minimal risk in terms of element transfer to soils and their associated biota. - Source-receptor pathway analysis of a managed game shooting site showed no environmental risk of trace element transfer.

  7. Source-pathway-receptor investigation of the fate of trace elements derived from shotgun pellets discharged in terrestrial ecosystems managed for game shooting

    Energy Technology Data Exchange (ETDEWEB)

    Sneddon, Jennifer [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom); Clemente, Rafael, E-mail: rclemente@cebas.csic.e [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom); Riby, Philip [School of Pharmacy and Chemistry, Liverpool John Moores University, Liverpool L3 3AF (United Kingdom); Lepp, Nicholas W., E-mail: n.w.lepp@ljmu.ac.u [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom)

    2009-10-15

    Spent shotgun pellets may contaminate terrestrial ecosystems. We examined the fate of elements originating from shotgun pellets in pasture and woodland ecosystems. Two source-receptor pathways: i) soil-soil pore water-plant and ii) whole earthworm/worm gut contents - washed and unwashed small mammal hair were investigated. Concentrations of Pb and associated contaminants were higher in soils from shot areas than controls. Arsenic and lead concentrations were positively correlated in soils, soil pore water and associated biota. Element concentrations in biota were below statutory levels in all locations. Bioavailability of lead to small mammals, based on concentrations in washed body hair was low. Lead movement from soil water to higher trophic levels was minor compared to lead adsorbed onto body surfaces. Lead was concentrated in earthworm gut and some plants. Results indicate that managed game shooting presents minimal risk in terms of element transfer to soils and their associated biota. - Source-receptor pathway analysis of a managed game shooting site showed no environmental risk of trace element transfer.

  8. In-Depth Analysis of Exoproteomes from Marine Bacteria by Shotgun Liquid Chromatography-Tandem Mass Spectrometry: the Ruegeria pomeroyi DSS-3 Case-Study

    Directory of Open Access Journals (Sweden)

    Jean Armengaud

    2010-07-01

    Full Text Available Microorganisms secrete into their extracellular environment numerous compounds that are required for their survival. Many of these compounds could be of great interest for biotechnology applications, and their genes could be used in synthetic biology design. The secreted proteins and the components of the translocation systems themselves can be scrutinized in depth with the most recent proteomic tools. While the secretomes of pathogens are well documented, those of non-pathogens remain largely to be established. Here, we present the analysis of the exoproteome of the marine bacterium Ruegeria pomeroyi DSS-3 grown under standard laboratory conditions. We used a shotgun approach consisting of trypsin digestion of the exoproteome and identification of the resulting peptides by liquid chromatography coupled to tandem mass spectrometry. Three different proteins with domains homologous to those observed in RTX toxins were uncovered and were semi-quantified as the most abundantly secreted proteins. One of these proteins clearly stands out from the catalogue, representing over half of the total exoproteome. We also list many soluble proteins related to ABC and TRAP transporters implicated in the uptake of nutrients. The Ruegeria pomeroyi DSS-3 case study illustrates the power of the shotgun nano-LC-MS/MS strategy to decipher the exoproteomes of marine bacteria and to contribute to environmental proteomics.

  9. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun.

  10. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiencies, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 +- 4.2D%) and patients with lymphedema (2.05 +- 2.5D%) was highly significant using praefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is by one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection demonstrated for the first time by our quantitative method provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)
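    A rough sketch of the correction-and-alignment bookkeeping described above, in Python; the reciprocal-transmission correction model, the cyclic-shift alignment and all function names are illustrative assumptions rather than the authors' method:

    ```python
    # Sketch (assumed, not the published implementation) of two steps from the
    # abstract: deriving a per-pixel absorption correction matrix from a
    # transmission scan, and superposing it on the emission scan by matching
    # intensity centres of gravity of point sources.

    def centre_of_gravity(img):
        """Intensity-weighted centroid (row, col) of a 2-D image (list of lists)."""
        total = sum(sum(row) for row in img)
        r = sum(i * sum(row) for i, row in enumerate(img)) / total
        c = sum(j * v for row in img for j, v in enumerate(row)) / total
        return r, c

    def correction_matrix(transmission, blank):
        """Per-pixel absorption correction factors (blank counts / transmitted counts)."""
        return [[b / max(t, 1e-9) for t, b in zip(t_row, b_row)]
                for t_row, b_row in zip(transmission, blank)]

    def align_to(img, reference):
        """Cyclically shift img so its centroid matches that of the reference scan."""
        dr = round(centre_of_gravity(reference)[0] - centre_of_gravity(img)[0])
        dc = round(centre_of_gravity(reference)[1] - centre_of_gravity(img)[1])
        rows = img[-dr:] + img[:-dr] if dr else img
        return [row[-dc:] + row[:-dc] if dc else list(row) for row in rows]
    ```

    In practice the centroid match would be refined by the cross-correlation of binary images mentioned in the abstract; the whole-pixel shift above is only the coarse step.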

  11. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate it to Earth processes. The book supplies essential background material to aid in understanding and using thermochronological data, provides a thorough treatise on numerical modeling of heat transport in the Earth's crust, and is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.

  12. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  13. Quantitative proteomic analysis of ibuprofen-degrading Patulibacter sp. strain I11

    DEFF Research Database (Denmark)

    Almeida, Barbara; Kjeldal, Henrik; Lolas, Ihab Bishara Yousef

    2013-01-01

    Ibuprofen is the third most consumed pharmaceutical drug in the world. Several isolates have been shown to degrade ibuprofen, but very little is known about the biochemistry of this process. This study investigates the degradation of ibuprofen by Patulibacter sp. strain I11 by quantitative proteomics: protein expression was identified and quantified by gel-based shotgun proteomics. In total, 251 unique proteins were quantitated using this approach. Biological process and pathway analysis indicated a number of proteins that were up-regulated in response to active degradation of ibuprofen, some of which are known to be involved in the degradation of aromatic compounds. Data analysis revealed that several of these proteins are likely involved in ibuprofen degradation by Patulibacter sp. strain I11.

  14. Development of 13 microsatellites for Gunnison Sage-grouse (Centrocercus minimus) using next-generation shotgun sequencing and their utility in Greater Sage-grouse (Centrocercus urophasianus)

    Science.gov (United States)

    Fike, Jennifer A.; Oyler-McCance, Sara J.; Zimmerman, Shawna J; Castoe, Todd A.

    2015-01-01

    Gunnison Sage-grouse are an obligate sagebrush species that has experienced significant population declines and has been proposed for listing under the U.S. Endangered Species Act. In order to examine levels of connectivity among Gunnison Sage-grouse leks, we identified 13 novel microsatellite loci through next-generation shotgun sequencing and tested them on the closely related Greater Sage-grouse. The number of alleles per locus ranged from 2 to 12. No loci were found to be linked, although 2 loci revealed significant departures from Hardy–Weinberg equilibrium or evidence of null alleles. While these microsatellites were designed for Gunnison Sage-grouse, they also work well for Greater Sage-grouse and could be used for numerous genetic questions including landscape and population genetics.

  15. Effects of Fe and Mn deficiencies on the protein profiles of tomato (Solanum lycopersicum) xylem sap as revealed by shotgun analyses.

    Science.gov (United States)

    Ceballos-Laita, Laura; Gutierrez-Carbonell, Elain; Takahashi, Daisuke; Abadía, Anunciación; Uemura, Matsuo; Abadía, Javier; López-Millán, Ana Flor

    2018-01-06

    The aim of this work was to study the effects of Fe and Mn deficiencies on the xylem sap proteome of tomato using a shotgun proteomic approach, with the final goal of elucidating plant response mechanisms to these stresses. This approach yielded 643 proteins reliably identified and quantified, with 70% of them predicted as secretory. Iron and Mn deficiencies caused statistically significant and biologically relevant abundance changes in 119 and 118 xylem sap proteins, respectively. In both deficiencies, the metabolic pathways most affected were protein metabolism, stress/oxidoreductases and cell wall modifications. First, results suggest that Fe deficiency elicited more stress responses than Mn deficiency, based on the changes in oxidative and proteolytic enzymes. Second, both nutrient deficiencies affect the secondary cell wall metabolism, with changes in Fe deficiency occurring via peroxidase activity, and in Mn deficiency involving peroxidase, Cu-oxidase and fasciclin-like arabinogalactan proteins. Third, the primary cell wall metabolism was affected by both nutrient deficiencies, with changes following opposite directions as judged from the abundances of several glycoside-hydrolases with endo-glycolytic activities and pectin esterases. Fourth, signaling pathways via xylem involving CLE and/or lipids as well as changes in phosphorylation and N-glycosylation also play a role in the responses to these stresses. Biological significance: In spite of being essential for the delivery of nutrients to the shoots, our knowledge of xylem responses to nutrient deficiencies is very limited. The present work applies a shotgun proteomic approach to unravel the effects of Fe and Mn deficiencies on the xylem sap proteome. Overall, Fe deficiency seems to elicit more stress in the xylem sap proteome than Mn deficiency, based on the changes measured in proteolytic and oxido-reductase proteins, whereas both nutrients exert modifications in the composition of the primary and secondary

  16. Culture-independent detection and characterisation of Mycobacterium tuberculosis and M. africanum in sputum samples using shotgun metagenomics on a benchtop sequencer

    Directory of Open Access Journals (Sweden)

    Emma L. Doughty

    2014-09-01

    Full Text Available Tuberculosis remains a major global health problem. Laboratory diagnostic methods that allow effective, early detection of cases are central to management of tuberculosis in the individual patient and in the community. Since the 1880s, laboratory diagnosis of tuberculosis has relied primarily on microscopy and culture. However, microscopy fails to provide species- or lineage-level identification and culture-based workflows for diagnosis of tuberculosis remain complex, expensive, slow, technically demanding and poorly able to handle mixed infections. We therefore explored the potential of shotgun metagenomics, sequencing of DNA from samples without culture or target-specific amplification or capture, to detect and characterise strains from the Mycobacterium tuberculosis complex in smear-positive sputum samples obtained from The Gambia in West Africa. Eight smear- and culture-positive sputum samples were investigated using a differential-lysis protocol followed by a kit-based DNA extraction method, with sequencing performed on a benchtop sequencing instrument, the Illumina MiSeq. The number of sequence reads in each sputum-derived metagenome ranged from 989,442 to 2,818,238. The proportion of reads in each metagenome mapping against the human genome ranged from 20% to 99%. We were able to detect sequences from the M. tuberculosis complex in all eight samples, with coverage of the H37Rv reference genome ranging from 0.002X to 0.7X. By analysing the distribution of large sequence polymorphisms (deletions) and the locations of the insertion element IS6110 and single nucleotide polymorphisms (SNPs), we were able to assign seven of eight metagenome-derived genomes to a species and lineage within the M. tuberculosis complex. Two metagenome-derived mycobacterial genomes were assigned to M. africanum, a species largely confined to West Africa; the others that could be assigned belonged to lineages T, H or LAM within the clade of “modern” M. tuberculosis
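    The fold-coverage figures quoted above (0.002X to 0.7X) follow from a simple calculation, sketched here under the assumption that coverage is mapped bases divided by reference length; the read counts below are invented for illustration, though the H37Rv genome length is the real value:

    ```python
    # Back-of-envelope fold-coverage estimate: total mapped bases over the
    # reference genome length. Only H37RV_LENGTH_BP is a real figure; the
    # function name and example inputs are illustrative assumptions.

    H37RV_LENGTH_BP = 4_411_532  # M. tuberculosis H37Rv reference genome size, bp

    def fold_coverage(mapped_reads, read_length_bp, genome_length_bp=H37RV_LENGTH_BP):
        """Average fold coverage (X) of a reference genome."""
        return mapped_reads * read_length_bp / genome_length_bp
    ```

    For instance, only a few thousand M. tuberculosis-derived reads in a metagenome dominated by human DNA already yield the sub-1X coverages reported here.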

  17. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    Science.gov (United States)

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, typically using reverse-phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.
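    The relative quantitation step described above can be illustrated with a minimal sketch; the median-of-peptide-ratios summary and the function name are assumptions chosen for illustration, not the integrated analysis software the authors used:

    ```python
    # Minimal sketch of relative proteome quantitation from a pooled
    # 14N/15N HILEP sample: each peptide yields a light/heavy intensity pair,
    # and a protein-level ratio is summarised as the median peptide ratio
    # (an assumed, robust summary; not the published pipeline).

    from statistics import median

    def protein_ratio(peptide_pairs):
        """Protein-level 14N/15N ratio: median of per-peptide light/heavy ratios."""
        return median(light / heavy for light, heavy in peptide_pairs)
    ```

    A ratio near 1 then indicates no abundance change between the (14)N and (15)N growth conditions, while deviations flag regulated proteins.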

  18. The Cytotoxicity Mechanism of 6-Shogaol-Treated HeLa Human Cervical Cancer Cells Revealed by Label-Free Shotgun Proteomics and Bioinformatics Analysis

    Directory of Open Access Journals (Sweden)

    Qun Liu

    2012-01-01

    Full Text Available Cervical cancer is one of the most common cancers among women in the world. 6-Shogaol is a natural compound isolated from the rhizome of ginger (Zingiber officinale. In this paper, we demonstrated that 6-shogaol induced apoptosis and G2/M phase arrest in human cervical cancer HeLa cells. Endoplasmic reticulum stress and the mitochondrial pathway were involved in 6-shogaol-mediated apoptosis. Proteomic analysis based on a label-free strategy by liquid chromatography chip quadrupole time-of-flight mass spectrometry was subsequently proposed to identify, in a non-target-biased manner, the molecular changes in cellular proteins in response to 6-shogaol treatment. A total of 287 proteins were differentially expressed in response to 24 h treatment with 15 μM 6-shogaol in HeLa cells. Significantly changed proteins were subjected to functional pathway analysis by multiple analyzing software. Ingenuity pathway analysis (IPA suggested that 14-3-3 signaling is a predominant canonical pathway involved in networks which may be significantly associated with the process of apoptosis and G2/M cell cycle arrest induced by 6-shogaol. In conclusion, this work developed an unbiased protein analysis strategy by shotgun proteomics and bioinformatics analysis. Data observed provide a comprehensive analysis of the 6-shogaol-treated HeLa cell proteome and reveal protein alterations that are associated with its anticancer mechanism.

  19. A Shotgun Proteomic Approach Reveals That Fe Deficiency Causes Marked Changes in the Protein Profiles of Plasma Membrane and Detergent-Resistant Microdomain Preparations from Beta vulgaris Roots.

    Science.gov (United States)

    Gutierrez-Carbonell, Elain; Takahashi, Daisuke; Lüthje, Sabine; González-Reyes, José Antonio; Mongrand, Sébastien; Contreras-Moreira, Bruno; Abadía, Anunciación; Uemura, Matsuo; Abadía, Javier; López-Millán, Ana Flor

    2016-08-05

    In the present study we have used label-free shotgun proteomic analysis to examine the effects of Fe deficiency on the protein profiles of highly pure sugar beet root plasma membrane (PM) preparations and detergent-resistant membranes (DRMs), the latter as an approach to study microdomains. Altogether, 545 proteins were detected, with 52 and 68 of them changing significantly with Fe deficiency in PM and DRM, respectively. Functional categorization of these proteins showed that signaling and general and vesicle-related transport accounted for approximately 50% of the differences in both PM and DRM, indicating that from a qualitative point of view changes induced by Fe deficiency are similar in both preparations. Results indicate that Fe deficiency has an impact in phosphorylation processes at the PM level and highlight the involvement of signaling proteins, especially those from the 14-3-3 family. Lipid profiling revealed Fe-deficiency-induced decreases in phosphatidic acid derivatives, which may impair vesicle formation, in agreement with the decreases measured in proteins related to intracellular trafficking and secretion. The modifications induced by Fe deficiency in the relative enrichment of proteins in DRMs revealed the existence of a group of cytoplasmic proteins that appears to be more attached to the PM in conditions of Fe deficiency.

  20. Retrospective Identification of Herpes Simplex 2 Virus-Associated Acute Liver Failure in an Immunocompetent Patient Detected Using Whole Transcriptome Shotgun Sequencing.

    Science.gov (United States)

    Ono, Atsushi; Hayes, C Nelson; Akamatsu, Sakura; Imamura, Michio; Aikata, Hiroshi; Chayama, Kazuaki

    2017-01-01

    Acute liver failure (ALF) is a severe condition in which liver function rapidly deteriorates in individuals without prior history of liver disease. While most cases result from acetaminophen overdose or viral hepatitis, in up to a third of patients, no clear cause can be identified. Liver transplantation has greatly reduced mortality among these patients, but 40% of patients recover without liver transplantation. Therefore, there is an urgent need for rapid determination of the etiology of acute liver failure. In this case report, we present a case of herpes simplex 2 virus- (HSV-) associated ALF in an immunocompetent patient. The patient recovered without LT, but the presence of HSV was not suspected at the time, precluding more effective treatment with acyclovir. To determine the etiology, stored blood samples were analyzed using whole transcriptome shotgun sequencing followed by mapping to a panel of viral reference sequences. The presence of HSV-DNA in blood samples at the time of admission was confirmed using real-time polymerase chain reaction, and, at the time of discharge, HSV-DNA levels had decreased by a factor of 10^6. Conclusions. In ALF cases of undetermined etiology, uncommon causes should be considered, especially those for which an effective treatment is available.

  1. Retrospective Identification of Herpes Simplex 2 Virus-Associated Acute Liver Failure in an Immunocompetent Patient Detected Using Whole Transcriptome Shotgun Sequencing

    Directory of Open Access Journals (Sweden)

    Atsushi Ono

    2017-01-01

    Full Text Available Acute liver failure (ALF) is a severe condition in which liver function rapidly deteriorates in individuals without prior history of liver disease. While most cases result from acetaminophen overdose or viral hepatitis, in up to a third of patients, no clear cause can be identified. Liver transplantation has greatly reduced mortality among these patients, but 40% of patients recover without liver transplantation. Therefore, there is an urgent need for rapid determination of the etiology of acute liver failure. In this case report, we present a case of herpes simplex 2 virus- (HSV-) associated ALF in an immunocompetent patient. The patient recovered without LT, but the presence of HSV was not suspected at the time, precluding more effective treatment with acyclovir. To determine the etiology, stored blood samples were analyzed using whole transcriptome shotgun sequencing followed by mapping to a panel of viral reference sequences. The presence of HSV-DNA in blood samples at the time of admission was confirmed using real-time polymerase chain reaction, and, at the time of discharge, HSV-DNA levels had decreased by a factor of 10^6. Conclusions. In ALF cases of undetermined etiology, uncommon causes should be considered, especially those for which an effective treatment is available.

  2. Rapid and automated processing of MALDI-FTICR/MS data for (15)N-metabolic labeling in a shotgun proteomics analysis.

    Science.gov (United States)

    Jing, Li; Amster, I Jonathan

    2009-10-15

    Offline high performance liquid chromatography combined with matrix-assisted laser desorption/ionization and Fourier transform ion cyclotron resonance mass spectrometry (HPLC-MALDI-FTICR/MS) provides the means to rapidly analyze complex mixtures of peptides, such as those produced by proteolytic digestion of a proteome. This method is particularly useful for making quantitative measurements of changes in protein expression by using (15)N-metabolic labeling. Proteolytic digestion of combined labeled and unlabeled proteomes produces complex mixtures with many mass overlaps when analyzed by HPLC-MALDI-FTICR/MS. A significant challenge to data analysis is the matching of pairs of peaks which represent an unlabeled peptide and its labeled counterpart. We have developed an algorithm and incorporated it into a computer program which significantly accelerates the interpretation of (15)N metabolic labeling data by automating the process of identifying unlabeled/labeled peak pairs. The algorithm takes advantage of the high resolution and mass accuracy of FTICR mass spectrometry. The algorithm is shown to successfully identify the (15)N/(14)N peptide pairs and calculate peptide relative abundance ratios in highly complex mixtures from the proteolytic digest of a whole organism protein extract.
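    The pairing idea can be sketched as follows, assuming an exhaustive search over peak masses: a labeled partner sits an integer number of (15)N-(14)N mass shifts above its unlabeled peak, within a ppm tolerance. The parameter names, limits, and O(n²) strategy are illustrative assumptions, not the published algorithm:

    ```python
    # Hedged sketch of automated 14N/15N peak pairing. For each peak, look for
    # a heavier partner whose mass difference matches n times the 15N-14N mass
    # shift (one shift per nitrogen atom in the peptide), exploiting the high
    # mass accuracy of FTICR measurements.

    DELTA_N = 0.9970349  # mass difference between 15N and 14N, in Da

    def find_pairs(masses, max_n=40, tol_ppm=5.0):
        """Return candidate (light_index, heavy_index, n_nitrogens) pairs."""
        pairs = []
        order = sorted(range(len(masses)), key=masses.__getitem__)
        for i in order:
            for j in order:
                diff = masses[j] - masses[i]
                if diff <= 0:
                    continue
                n = round(diff / DELTA_N)  # implied nitrogen count
                if 1 <= n <= max_n and abs(diff - n * DELTA_N) <= masses[j] * tol_ppm * 1e-6:
                    pairs.append((i, j, n))
        return pairs
    ```

    Once a pair is found, the relative abundance ratio is simply the intensity of the light peak over that of its heavy partner.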

  3. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  4. Quantitative metagenomics reveals unique gut microbiome biomarkers in ankylosing spondylitis.

    Science.gov (United States)

    Wen, Chengping; Zheng, Zhijun; Shao, Tiejuan; Liu, Lin; Xie, Zhijun; Le Chatelier, Emmanuelle; He, Zhixing; Zhong, Wendi; Fan, Yongsheng; Zhang, Linshuang; Li, Haichang; Wu, Chunyan; Hu, Changfeng; Xu, Qian; Zhou, Jia; Cai, Shunfeng; Wang, Dawei; Huang, Yun; Breban, Maxime; Qin, Nan; Ehrlich, Stanislav Dusko

    2017-07-27

    The assessment and characterization of the gut microbiome has become a focus of research in the area of human autoimmune diseases. Ankylosing spondylitis is an inflammatory autoimmune disease, and evidence suggests that it may be a microbiome-driven disease. To investigate the relationship between the gut microbiome and ankylosing spondylitis, a quantitative metagenomics study based on deep shotgun sequencing was performed, using gut microbial DNA from 211 Chinese individuals. A total of 23,709 genes and 12 metagenomic species were shown to be differentially abundant between ankylosing spondylitis patients and healthy controls. Patients were characterized by a form of gut microbial dysbiosis that is more prominent than previously reported cases with inflammatory bowel disease. Specifically, the ankylosing spondylitis patients demonstrated increases in the abundance of Prevotella melaninogenica, Prevotella copri, and Prevotella sp. C561 and decreases in Bacteroides spp. It is noteworthy that the Bifidobacterium genus, which is commonly used in probiotics, accumulated in the ankylosing spondylitis patients. Diagnostic algorithms were established using a subset of these gut microbial biomarkers. Alterations of the gut microbiome are associated with development of ankylosing spondylitis. Our data suggest biomarkers identified in this study might participate in the pathogenesis or development process of ankylosing spondylitis, providing new leads for the development of new diagnostic tools and potential treatments.
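    As a toy illustration of the quantitative comparison underlying such differential-abundance calls, the following sketch normalises per-sample gene counts to relative abundances and ranks genes by log2 fold change between groups; it stands in for, and is far simpler than, the statistical pipeline actually used in the study (all names and the pseudocount are assumptions):

    ```python
    # Illustrative (not the study's) differential-abundance sketch:
    # normalise raw counts per sample, then compute per-gene log2 fold change
    # of group means, with a small pseudocount to avoid log(0).

    import math

    def relative_abundance(counts):
        """Normalise raw gene counts in one sample to relative abundances."""
        total = sum(counts.values())
        return {gene: c / total for gene, c in counts.items()}

    def log2_fold_changes(patients, controls, pseudo=1e-9):
        """log2(patient mean / control mean) per gene across normalised samples."""
        genes = set().union(*patients, *controls)
        def group_mean(group, gene):
            return sum(sample.get(gene, 0.0) for sample in group) / len(group)
        return {g: math.log2((group_mean(patients, g) + pseudo)
                             / (group_mean(controls, g) + pseudo))
                for g in genes}
    ```

    A real analysis would add a significance test and multiple-testing correction on top of such fold changes before declaring any of the 23,709 genes differentially abundant.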

  5. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  6. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  7. Shotgun proteomics reveals specific modulated protein patterns in tears of patients with primary open angle glaucoma naïve to therapy.

    Science.gov (United States)

    Pieragostino, Damiana; Agnifili, Luca; Fasanella, Vincenzo; D'Aguanno, Simona; Mastropasqua, Rodolfo; Di Ilio, Carmine; Sacchetta, Paolo; Urbani, Andrea; Del Boccio, Piero

    2013-06-01

    Primary open angle glaucoma (POAG) is one of the main causes of irreversible blindness worldwide. The pathogenesis of POAG is still unclear. Alteration and sclerosis of the trabecular meshwork with changes in aqueous humor molecular composition seem to play the key role. Increased intraocular pressure is widely known to be the main risk factor for the onset and progression of the disease. Unfortunately, the early diagnosis of POAG still remains the main challenge. In order to provide insight into the pathophysiology of glaucoma, here we report a shotgun proteomics approach to tears of patients with POAG naïve to therapy. Our proteomics results showed 27 differential tear proteins in the POAG vs. CTRL comparison (25 up-regulated proteins in the POAG group and two unique proteins in the CTRL group), 16 of which were associated with inflammatory response, free radical scavenging, cell-to-cell signaling and interaction. Overall the protein modulation shown in POAG tears proves the involvement of biochemical networks linked to inflammation. Among all regulated proteins, a sub-group of 12 up-regulated proteins in naïve POAG patients were found to be down-regulated in medically controlled POAG patients treated with prostanoid analogues (PGA), as reported in our previous work (i.e., lipocalin-1, lysozyme C, lactotransferrin, proline-rich-protein 4, prolactin-inducible protein, zinc-alpha-2-glycoprotein, polymeric immunoglobulin receptor, cystatin S, Ig kappa chain C region, Ig alpha-2 chain C region, immunoglobulin J chain, Ig alpha-1 chain C region). In summary, our findings indicate that the POAG tears protein expression is a mixture of increased inflammatory proteins that could be potential biomarkers of the disease, and their regulation may be involved in the mechanism by which PGA are able to decrease the intraocular pressure in glaucoma patients.

  8. Membrane phospholipid composition may contribute to exceptional longevity of the naked mole-rat (Heterocephalus glaber): a comparative study using shotgun lipidomics.

    Science.gov (United States)

    Mitchell, Todd W; Buffenstein, Rochelle; Hulbert, A J

    2007-11-01

    Phospholipids containing highly polyunsaturated fatty acids are particularly prone to peroxidation, and membrane composition may therefore influence longevity. Phospholipid molecules, in particular those containing docosahexaenoic acid (DHA), from the skeletal muscle, heart, liver and liver mitochondria were identified and quantified using mass-spectrometry shotgun lipidomics in two similar-sized rodents that show an approximately 9-fold difference in maximum lifespan. The naked mole-rat is the longest-living rodent known, with a maximum lifespan of >28 years. Total phospholipid distribution is similar in tissues of both species; DHA is only found in phosphatidylcholines (PC), phosphatidylethanolamines (PE) and phosphatidylserines (PS), and DHA is relatively more concentrated in PE than PC. Naked mole-rats have fewer molecular species of both PC and PE than do mice. DHA-containing phospholipids represent 27-57% of all phospholipids in mice but only 2-6% in naked mole-rats. Furthermore, while mice have small amounts of di-polyunsaturated PC and PE, these are lacking in naked mole-rats. Vinyl ether-linked phospholipids (plasmalogens) are higher in naked mole-rat tissues than in mice. The lower level of DHA-containing phospholipids suggests a lower susceptibility to peroxidative damage in membranes of naked mole-rats compared to mice, whereas the high level of plasmalogens might enhance membrane antioxidant protection. Both characteristics possibly contribute to the exceptional longevity of naked mole-rats and may indicate a special role for peroxisomes in this extended longevity.

  9. Shot-gun proteome and transcriptome mapping of the jujube floral organ and identification of a pollen-specific S-locus F-box gene

    Directory of Open Access Journals (Sweden)

    Ruihong Chen

    2017-07-01

    Full Text Available The flower is a plant reproductive organ that forms part of the fruit produced as the flowering season ends. While the number and identity of proteins expressed in a jujube (Ziziphus jujuba Mill. flower is currently unknown, integrative proteomic and transcriptomic analyses provide a systematic strategy of characterizing the floral biology of plants. We conducted a shotgun proteomic analysis on jujube flowers by using a filter-aided sample preparation tryptic digestion, followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). In addition, transcriptomics analyses were performed on HiSeq2000 sequencers. In total, 7,853 proteins were identified, accounting for nearly 30% of the ‘Junzao’ gene models (27,443). Genes identified in the proteome generally showed higher RPKM (reads per kilobase per million mapped reads values than undetected genes. Gene ontology categories showed that ribosomes and intracellular organelles were the most dominant classes and accounted for 17.0% and 14.0% of the proteome mass, respectively. The top-ranking proteins with iBAQ > 10^10 included non-specific lipid transfer proteins, histones, actin-related proteins, fructose-bisphosphate aldolase, Bet v I type allergens, etc. In addition, we identified one pollen-specific S-locus F-box-like gene located on the same chromosome as the S-RNase gene. Both of these may mediate gametophytic self-incompatibility in jujube. These results reflect the protein profile of jujube flowers and contribute new information important to the jujube breeding system.

  10. Deep Illumina-based shotgun sequencing reveals dietary effects on the structure and function of the fecal microbiome of growing kittens.

    Directory of Open Access Journals (Sweden)

    Oliver Deusch

    Full Text Available Previously, we demonstrated that dietary protein:carbohydrate ratio dramatically affects the fecal microbial taxonomic structure of kittens using targeted 16S gene sequencing. The present study, using the same fecal samples, applied deep Illumina shotgun sequencing to identify the diet-associated functional potential and analyze taxonomic changes of the feline fecal microbiome. Fecal samples from kittens fed one of two diets differing in protein and carbohydrate content (high-protein, low-carbohydrate, HPLC; and moderate-protein, moderate-carbohydrate, MPMC) were collected at 8, 12 and 16 weeks of age (n = 6 per group). A total of 345.3 gigabases of sequence were generated from 36 samples, with 99.75% of annotated sequences identified as bacterial. At the genus level, 26% and 39% of reads were annotated for HPLC- and MPMC-fed kittens, with HPLC-fed cats showing greater species richness and microbial diversity. Two phyla, ten families and fifteen genera were responsible for more than 80% of the sequences at each taxonomic level for both diet groups, consistent with the previous taxonomic study. Significantly different abundances between diet groups were observed for 324 genera (56% of all genera identified), demonstrating widespread diet-induced changes in microbial taxonomic structure. Diversity was not affected over time. Functional analysis identified 2,013 putative enzyme function groups that differed (p < 0.000007) between the two dietary groups and were associated with 194 pathways, which formed five discrete clusters based on average relative abundance. Of those, ten contained more (p < 0.022) enzyme functions with significant diet effects than expected by chance. Six pathways were related to amino acid biosynthesis and metabolism, linking changes in dietary protein with functional differences of the gut microbiome. These data indicate that feline feces-derived microbiomes have large structural and functional differences relating to the dietary

  11. Effects of 28 days of resistance exercise and consuming a commercially available pre-workout supplement, NO-Shotgun®, on body composition, muscle strength and mass, markers of satellite cell activation, and clinical safety markers in males

    Directory of Open Access Journals (Sweden)

    Leutholtz Brian

    2009-08-01

    Full Text Available Abstract Purpose This study determined the effects of 28 days of heavy resistance exercise combined with the nutritional supplement, NO-Shotgun®, on body composition, muscle strength and mass, markers of satellite cell activation, and clinical safety markers. Methods Eighteen non-resistance-trained males participated in a resistance training program (3 × 10-RM, 4 times/wk) for 28 days while also ingesting 27 g/day of placebo (PL) or NO-Shotgun® (NO) 30 min prior to exercise. Data were analyzed with separate 2 × 2 ANOVA and t-tests (p ≤ 0.05). Results Total body mass was increased in both groups (p = 0.001), but without any significant increases in total body water (p = 0.77). No significant changes occurred with fat mass (p = 0.62); however, fat-free mass did increase with training (p = 0.001), and NO was significantly greater than PL (p = 0.001). Bench press strength for NO was significantly greater than PL (p = 0.003). Myofibrillar protein increased with training (p = 0.001), with NO being significantly greater than PL (p = 0.019). Serum IGF-1 (p = 0.046) and HGF (p = 0.06) were significantly increased with training, and HGF for NO was greater than PL (p = 0.002). Muscle phosphorylated c-met was increased with training for both groups (p = 0.019). Total DNA was increased in both groups (p = 0.006), while NO was significantly greater than PL (p = 0.038). For DNA/protein, PL was decreased and NO was not changed (p = 0.014). All of the myogenic regulatory factors were increased with training; however, NO was shown to be significantly greater than PL for Myo-D (p = 0.008) and MRF-4 (p = 0.022). No significant differences were located for any of the whole blood and serum clinical chemistry markers (p > 0.05). Conclusion When combined with heavy resistance training for 28 days, NO-Shotgun® is not associated with any negative side effects, nor does it abnormally impact any of the clinical chemistry markers. Rather, NO-Shotgun® effectively increases muscle strength and mass
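The study's group comparisons were run as 2 × 2 ANOVAs. As a rough stdlib-only stand-in for the between-group part, the comparison can be sketched as a Welch t statistic on pre-to-post change scores; all values below are invented, not the study's data:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Invented pre-to-post gains in fat-free mass (kg), placebo vs. supplement
pl_gain = [0.5, 0.8, 1.1, 0.4, 0.9, 0.7, 1.0, 0.6, 0.8]
no_gain = [1.6, 2.1, 1.8, 2.4, 1.9, 2.2, 1.7, 2.0, 2.3]

print(round(welch_t(no_gain, pl_gain), 2))
```

A positive t here means the supplement group gained more on average; the published analysis additionally models the within-subject (pre/post) factor, which this sketch collapses into change scores.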

  12. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book covers quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts and meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and worked examples, chelate titration, oxidation-reduction titration with an introduction, titration curves and diazotization titration, precipitation titration, electrometric titration, and quantitative analysis.

  13. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...
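The first example, Hausdorff metrics from quantitative semilattices, can be made concrete: reading a =ε b as “the Hausdorff distance between a and b is at most ε”. A small sketch for finite sets of reals (illustrative only, not the paper's construction):

```python
def hausdorff(a, b):
    """Hausdorff distance between two finite nonempty sets of reals:
    the largest distance from a point in one set to the nearest point
    in the other set."""
    nearest = lambda x, s: min(abs(x - y) for y in s)
    return max(max(nearest(x, b) for x in a),
               max(nearest(y, a) for y in b))

# {0, 1} and {0, 3}: every point of the first is within 1 of the second,
# but 3 is at distance 2 from {0, 1}, so the sets are "equal up to eps = 2"
print(hausdorff([0, 1], [0, 3]))
```

The quantitative-semilattice axioms then say, roughly, that the join operation is nonexpansive with respect to these indexed equalities.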

  14. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms

  15. Re-annotation of the physical map of Glycine max for polyploid-like regions by BAC end sequence driven whole genome shotgun read assembly

    Directory of Open Access Journals (Sweden)

    Shultz Jeffry

    2008-07-01

    Full Text Available Abstract Background Many of the world's most important food crops have either polyploid genomes or homeologous regions derived from segmental shuffling following polyploid formation. The soybean (Glycine max) genome has been shown to be composed of approximately four thousand short interspersed homeologous regions with 1, 2 or 4 copies per haploid genome by RFLP analysis, microsatellite anchors to BACs and by contigs formed from BAC fingerprints. Despite these similar regions, the genome has been sequenced by whole genome shotgun sequencing (WGS). Here the aim was to use BAC end sequences (BES) derived from three minimum tile paths (MTP) to examine the extent and homogeneity of polyploid-like regions within contigs and the extent of correlation between the polyploid-like regions inferred from fingerprinting and the polyploid-like sequences inferred from WGS matches. Results Results show that when sequence divergence was 1–10%, the copy number of homeologous regions could be identified from sequence variation in WGS reads overlapping BES. Homeolog sequence variants (HSVs) were single nucleotide polymorphisms (SNPs; 89%) and single nucleotide indels (SNIs; 10%). Larger indels were rare but present (1%). Simulations had predicted that fingerprints of homeologous regions could be separated when divergence exceeded 2%; these predictions were shown to be false. We show that a 5–10% sequence divergence is necessary to separate homeologs by fingerprinting. BES compared to WGS traces showed that polyploid-like regions with less than 1% sequence divergence exist at 2.3% of the locations assayed. Conclusion The use of HSVs like SNPs and SNIs to characterize BACs will improve contig building methods. The implications for bioinformatic and functional annotation of polyploid and paleopolyploid genomes show that a combined approach of BAC fingerprint based physical maps, WGS sequence and HSV-based partitioning of BAC clones from homeologous regions to separate contigs will allow reliable de
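The copy-number inference above rests on counting mismatches between a BES and the WGS reads overlapping it. A toy sketch of percent divergence for one aligned pair (sequences invented; substitutions only, so real data with indels would need an alignment step first):

```python
def divergence(seq1, seq2):
    """Percent divergence between two aligned, equal-length sequences,
    counting substitutions only."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for a, b in zip(seq1, seq2) if a != b)
    return 100.0 * diffs / len(seq1)

bes   = "ATGCTAGCTAGGCTAACGTTAGGCATCGATCG"
trace = "ATGCTAGCTAGGCTATCGTTAGGCATCGATCG"  # one mismatch: a putative HSV (SNP)
print(round(divergence(bes, trace), 2))
```

In the study's terms, reads clustering at around 1–10% divergence from the BES are evidence of a diverged homeologous copy rather than sequencing error.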

  16. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects
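Digitized-film interpretation of this kind typically converts pixel intensity to optical density before display. A sketch of the standard logarithmic conversion; the system's actual calibration curve is not given in the report, so the reference intensity here is an assumption:

```python
import math

def optical_density(pixel, i0=255.0):
    """Film optical density OD = -log10(I / I0) from a digitized pixel value,
    assuming i0 is the unattenuated (clear-film) intensity."""
    return -math.log10(max(pixel, 1) / i0)

# Map a scanline of digitized pixel intensities to density values
scan = [255, 200, 128, 64, 32]
densities = [round(optical_density(p), 3) for p in scan]
print(densities)
```

Pseudo-color display then amounts to binning these density values into a color lookup table, which makes subtle density variations visible that a raw radiograph hides.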

  17. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
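The protein-level-ratio approach that EBprot improves on can be sketched in a few lines: peptide ratios are collapsed to one number per protein, discarding the reproducibility information (how many peptides agree) that EBprot models explicitly. Ratios below are invented:

```python
import math
import statistics

# Hypothetical peptide-level heavy/light ratios grouped by protein
peptide_ratios = {
    "P1": [2.1, 1.9, 2.3, 2.0],  # four concordant peptides: strong evidence
    "P2": [2.2],                 # single peptide: same ratio, weaker evidence
}

def protein_log2_ratio(ratios):
    """Protein-level summary as the median of peptide log2-ratios."""
    return statistics.median(math.log2(r) for r in ratios)

for prot, ratios in peptide_ratios.items():
    print(prot, round(protein_log2_ratio(ratios), 3), f"n_peptides={len(ratios)}")
```

Both hypothetical proteins end up with a similar summary ratio here, even though P1's evidence is far more reproducible; that lost distinction is exactly what a peptide-level hierarchical model can exploit.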

  18. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid-state-device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples (FIG. 3). Methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors. The array senses the number of secondary electrons with a plurality of solid state detectors, counting them with a time-to-digital converter circuit in counter mode.

  19. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Possessing huge resources, quantitative proteomics operates with colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structure and functional homology of proteins, molecular diagnostics, etc. More than 40 various methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modifications of proteins and also metabolic and enzymatic methods of isotope labeling.

  20. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to the output of current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in the communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  1. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  2. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
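The temperature extrapolation relies on the recoil-free fraction, which suppresses a site's absorption area in proportion to its mean square displacement. A sketch of the standard relations this kind of method rests on (textbook Moessbauer physics, not taken from the abstract itself):

```latex
f(T) = \exp\!\bigl(-k^{2}\,\langle x^{2}\rangle_{T}\bigr),
\qquad k = \frac{E_{\gamma}}{\hbar c},
\qquad A(T) \propto f(T)
\;\Longrightarrow\;
\ln A(T) = \ln A_{0} - k^{2}\,\langle x^{2}\rangle_{T}
```

Fitting ln A against temperature for each site and extrapolating to ⟨x²⟩ = 0 removes the site-dependent suppression, so the extrapolated areas A₀ become directly comparable across chemically different sites, e.g. to obtain Fe³⁺/Fe²⁺ ratios.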

  3. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  4. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  5. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with ⁹⁹ᵐTc pyrophosphate and ⁹⁹ᵐTc-methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The collective of patients presented with primary skeletal tumours, metastases, inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the ''region of interest'' technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the ''region of interest'' technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de

  6. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET]

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100 g/min; right = 25.6 ± 7.0 μmol/100 g/min) was slightly reduced compared to the ipsilateral hemispheric rate (left = 30.4 ± 6.8 μmol/100 g/min; right = 29.5 ± 7.2 μmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  7. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100 g/min; right = 25.6 ± 7.0 μmol/100 g/min) was slightly reduced compared to the ipsilateral hemispheric rate (left = 30.4 ± 6.8 μmol/100 g/min; right = 29.5 ± 7.2 μmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals

  8. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
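Because QuaSSE allows the speciation rate to be an arbitrary function of the trait, one natural choice is a sigmoidal rate that shifts between two levels. A Python sketch of such a rate function, with invented parameters (the method itself is distributed as R code):

```python
import math

def sigmoid_lambda(x, y0=0.1, y1=0.2, xmid=0.0, r=1.0):
    """Speciation rate as a sigmoidal function of a quantitative trait x,
    rising smoothly from baseline y0 to y1 around the midpoint xmid.
    Parameters are illustrative, not estimates from any dataset."""
    return y0 + (y1 - y0) / (1.0 + math.exp(-r * (x - xmid)))

# Rate at small, intermediate, and large trait values
print(round(sigmoid_lambda(-10), 4), round(sigmoid_lambda(0), 4), round(sigmoid_lambda(10), 4))
```

Fitting such a curve against a constant-rate null is the kind of comparison the method uses to test for trait-dependent diversification.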

  9. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
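The dose determination described, integrating the beam current to portion matter in submicrogram quantities, reduces to a one-line relation between collected charge and the number of implanted ions. A sketch with an invented example charge:

```python
E_CHARGE = 1.602176634e-19  # elementary charge in coulombs (exact SI value)

def implanted_ions(integrated_charge_c, charge_state=1):
    """Number of implanted ions from the time-integrated beam current,
    assuming negligible resputtering loss (the small-dose regime studied)."""
    return integrated_charge_c / (charge_state * E_CHARGE)

# Example: 1 microcoulomb of singly charged ions
print(f"{implanted_ions(1e-6):.3e} ions")
```

Multiplying by the isotope's molar mass over Avogadro's number then converts the ion count to implanted mass, which is how a measured charge becomes a calibration quantity.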

  10. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes, and a reference volume). There was good correlation with anatomical findings. The end-diastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic under-estimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  11. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  12. Quantitative proteomics reveals the kinetics of trypsin-catalyzed protein digestion.

    Science.gov (United States)

    Pan, Yanbo; Cheng, Kai; Mao, Jiawei; Liu, Fangjie; Liu, Jing; Ye, Mingliang; Zou, Hanfa

    2014-10-01

    Trypsin is the most popular protease for digesting proteins into peptides in shotgun proteomics, but few studies have attempted to systematically investigate the kinetics of trypsin-catalyzed protein digestion in proteome samples. In this study, we applied quantitative proteomics via triplex stable isotope dimethyl labeling to investigate the kinetics of trypsin-catalyzed cleavage. It was found that trypsin cleaves C-terminal to lysine (K) and arginine (R) residues, with higher rates for R. Cleavage sites surrounded by neutral residues were cut quickly, while those with neighboring charged residues (D/E/K/R) or a proline residue (P) were cut slowly. In a proteome sample, a huge number of proteins with different physicochemical properties coexist. If any type of protein could be preferentially digested, then limited digestion could be applied to reduce sample complexity. However, we found that protein abundance and other physicochemical properties, such as molecular weight (Mw), grand average of hydropathicity (GRAVY), aliphatic index, and isoelectric point (pI), have no notable correlation with the digestion priority of proteins.
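The classic cleavage rule underlying this analysis (cut C-terminal to K or R, suppressed before P) is easy to sketch as an in-silico digest. Note that the abstract's kinetic findings add that neighboring D/E/K/R residues also slow cleavage, which this toy rule ignores; the sequence is invented:

```python
import re

def tryptic_peptides(sequence):
    """Simple in-silico trypsin digest: split C-terminal to K or R,
    except when the next residue is proline (the textbook rule)."""
    return [p for p in re.split(r"(?<=[KR])(?!P)", sequence) if p]

print(tryptic_peptides("MKWVTFISLLLLFSSAYSRGVFRRDTHKPSEK"))
```

Note the adjacent R/R pair yields a single-residue peptide and the KP site is skipped; in real digests such sites are the slow or missed cleavages the kinetic study quantifies.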

  13. Quantitative proteomic analysis of wheat cultivars with differing drought stress tolerance

    Directory of Open Access Journals (Sweden)

    Kristina L Ford

    2011-09-01

    Full Text Available Using a series of multiplexed experiments we studied the quantitative changes in protein abundance of three Australian bread wheat cultivars (Triticum aestivum L.) in response to a drought stress. Three cultivars differing in their ability to maintain grain yield during drought, Kukri (intolerant), Excalibur (tolerant) and RAC875 (tolerant), were grown in the glasshouse with cyclic drought treatment that mimicked conditions in the field. Proteins were isolated from leaves of mature plants and isobaric tags were used to follow changes in the relative protein abundance of 159 proteins. This is the first shotgun proteomics study in wheat, providing important insights into protein responses to drought as well as identifying the largest number of wheat proteins (1,299) in a single study. The changes in the three cultivars at the different time points reflected their differing physiological responses to drought, with the two drought tolerant varieties (Excalibur and RAC875) differing in their protein responses. Excalibur lacked significant changes in proteins during the initial onset of the water deficit in contrast to RAC875 that had a large number of significant changes. All three cultivars had changes consistent with an increase in oxidative stress metabolism and ROS scavenging capacity seen through increases in superoxide dismutases and catalases as well as ROS avoidance through the decreases in proteins involved in photosynthesis and the Calvin cycle.

  14. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  15. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent-time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability or normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)
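The fractionation-sensitivity analyses referred to are conventionally framed in the linear-quadratic model. A sketch of the standard relations (textbook radiobiology, not quoted from the review itself):

```latex
S(D) = \exp\bigl(-\alpha D - \beta D^{2}\bigr),
\qquad
\mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right)
```

Here S is the surviving fraction after total dose D delivered in n fractions of size d; the α/β ratio is the usual quantitative measure of fractionation sensitivity for tumors and normal tissues, and it underlies comparisons of non-standard fractionation schedules.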

  16. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system applied to the volumetric percentage of perlite in nodular cast irons, porosity and average grain size in high-density sintered UO₂ pellets, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those of the corresponding direct counting processes: counting of systematic points (grid) for volume measurement, and the intersections method, using a circumference of known radius, for average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and perlite. Porosity evaluation of sintered UO₂ pellets is also analyzed.

  17. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single-copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules that have been bound by at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, and their relative distances can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones, to the construction of high-resolution physical maps, to studies of stalled DNA replication and transcription.
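The unit conversion at the core of QDFM measurements is simple enough to sketch, assuming the ~2.3 kb/µm stretching factor quoted in the abstract (the function names below are illustrative, not from the paper):

```python
# Sketch: converting measured distances between hybridization signals on
# stretched DNA fibers into kilobase pairs, using the ~2.3 kb/um factor.

STRETCH_KB_PER_UM = 2.3  # DNA stretched homogeneously to about 2.3 kb per micrometer

def distance_to_kb(distance_um: float) -> float:
    """Convert a fiber distance measured in micrometers to kilobase pairs."""
    return distance_um * STRETCH_KB_PER_UM

def map_probe_positions(positions_um):
    """Convert probe positions (um from the anchored fiber end) to kb."""
    return [round(distance_to_kb(p), 1) for p in positions_um]

# Two probe signals measured 10.0 um apart correspond to ~23 kb of sequence.
print(round(distance_to_kb(10.0), 1))  # 23.0
print(map_probe_positions([0.0, 4.5, 13.2]))
```

The same conversion underlies turning measured signal-to-signal distances into physical map coordinates.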

  18. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger. INTRODUCTION. Animal molecular sexing .... 43:3-12. Ellegren H (1996). First gene on the avian W chromosome (CHD) provides a tag for universal sexing of non-ratite birds. Proc.

  19. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
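As an illustration only (not the authors' calibration procedure), mapping a semi-quantitative likelihood class to a peer-derived failure frequency and combining it with a point-value consequence might look like:

```python
# Illustrative sketch: a hypothetical calibration table mapping index classes
# to failure frequencies (failures per km-year), anchored to peer pipeline
# failure rates, plus a deterministic risk calculation for one segment.

INDEX_TO_FREQUENCY = {
    "very low": 1e-5,
    "low":      1e-4,
    "medium":   1e-3,
    "high":     1e-2,
}

def segment_risk(index_class: str, length_km: float, consequence_cost: float) -> float:
    """Expected annual loss for a segment: frequency x length x consequence."""
    freq = INDEX_TO_FREQUENCY[index_class]      # failures per km-year
    return freq * length_km * consequence_cost  # cost units per year

# A 50 km 'medium' segment with a $2M consequence per failure:
print(f"${segment_risk('medium', 50.0, 2_000_000):,.0f}/year")
```

In the paper's terms, replacing the likelihood class with a quantitative frequency and the consequence with either a cost or a qualitative category gives the mixed QRA/semi-QRA options described.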

  20. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  1. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  2. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as correction for the in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  3. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  4. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  5. Fine mapping of powdery mildew resistance genes PmTb7A.1 and PmTb7A.2 in Triticum boeoticum (Boiss.) using the shotgun sequence assembly of chromosome 7AL.

    Science.gov (United States)

    Chhuneja, Parveen; Yadav, Bharat; Stirnweis, Daniel; Hurni, Severine; Kaur, Satinder; Elkot, Ahmed Fawzy; Keller, Beat; Wicker, Thomas; Sehgal, Sunish; Gill, Bikram S; Singh, Kuldeep

    2015-10-01

    A novel powdery mildew resistance gene and a new allele of Pm1 were identified and fine mapped, and DNA markers suitable for marker-assisted selection have been identified. Powdery mildew caused by Blumeria graminis is one of the most important foliar diseases of wheat and causes significant yield losses worldwide. Diploid A-genome species are an important genetic resource for disease resistance genes. Two powdery mildew resistance genes identified in Triticum boeoticum (AbAb) accession pau5088, PmTb7A.1 and PmTb7A.2, were mapped on chromosome 7AL. In the present study, shotgun sequence assembly data for chromosome 7AL were utilised for fine mapping of these Pm resistance genes. Forty SSR, 73 resistance gene analogue-based sequence-tagged site (RGA-STS) and 36 single nucleotide polymorphism (SNP) markers were designed for fine mapping of PmTb7A.1 and PmTb7A.2. Twenty-one RGA-STS, 8 SSR and 13 SNP markers were mapped to 7AL. RGA-STS markers Ta7AL-4556232 and 7AL-4426363 were linked to PmTb7A.1 and PmTb7A.2 at genetic distances of 0.6 and 6.0 cM, respectively. The present investigation established that PmTb7A.1 is a new powdery mildew resistance gene that confers resistance to a broad range of Bgt isolates, whereas PmTb7A.2 is most probably a new allele of Pm1, based on chromosomal location and on screening with Bgt isolates showing differential reactions on lines carrying different Pm1 alleles. The markers found to be linked to the two Pm resistance genes are robust and can be used for marker-assisted introgression of these genes into hexaploid wheat.

  6. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  7. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  8. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  9. Quantitative Proteomics Reveals Temporal Proteomic Changes in Signaling Pathways during BV2 Mouse Microglial Cell Activation.

    Science.gov (United States)

    Woo, Jongmin; Han, Dohyun; Wang, Joseph Injae; Park, Joonho; Kim, Hyunsoo; Kim, Youngsoo

    2017-09-01

    The development of systematic proteomic quantification techniques in systems biology research has enabled one to perform an in-depth analysis of cellular systems. We have developed a systematic proteomic approach that encompasses the spectrum from global to targeted analysis on a single platform. We have applied this technique to an activated microglia cell system to examine changes in the intracellular and extracellular proteomes. Microglia become activated when their homeostatic microenvironment is disrupted. There are varying degrees of microglial activation, and we chose to focus on the proinflammatory reactive state that is induced by exposure to such stimuli as lipopolysaccharide (LPS) and interferon-gamma (IFN-γ). Using an improved shotgun proteomics approach, we identified 5497 proteins in the whole-cell proteome and 4938 proteins in the secretome that were associated with the activation of BV2 mouse microglia by LPS or IFN-γ. Of the differentially expressed proteins in stimulated microglia, we classified pathways that were related to immune-inflammatory responses and metabolism. Our label-free parallel reaction monitoring (PRM) approach made it possible to comprehensively measure the hyper-multiplex quantitative value of each protein by high-resolution mass spectrometry. Over 450 peptides that corresponded to pathway proteins and direct or indirect interactors via the STRING database were quantified by label-free PRM in a single run. Moreover, we performed a longitudinal quantification of secreted proteins during microglial activation, in which neurotoxic molecules that mediate neuronal cell loss in the brain are released. These data suggest that latent pathways that are associated with neurodegenerative diseases can be discovered by constructing and analyzing a pathway network model of proteins. Furthermore, this systematic quantification platform has tremendous potential for applications in large-scale targeted analyses. The proteomics data for

  10. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    Science.gov (United States)

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

    Histone PTMs play a crucial role in regulating chromatin structure and function, with impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shotgun approaches based on trypsin digestion are typically not employed for mapping histone modifications. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have recently been described to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTM profiling. The work of Paternoster V. et al. is a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results collected indicate that the replacement of methanol with isopropanol or ACN not only blocks methyl-esterification but also significantly reduces other undesired unspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that reduce the generation of various artefacts during histone propionylation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  12. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  13. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [³H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against ³H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [¹⁴C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparing densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)
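The film-calibration step implied here, converting an optical density reading into a receptor-bound ligand amount via standards exposed on the same film, can be sketched as a simple interpolation; the standard values below are invented for illustration:

```python
# Sketch of densitometric calibration: optical-density readings from the
# autoradiogram are mapped to ligand amounts using co-exposed standards of
# known radioactivity. The (OD, fmol/mg tissue) pairs are hypothetical.

STANDARDS = [(0.05, 10.0), (0.20, 40.0), (0.50, 100.0), (0.80, 160.0)]

def od_to_activity(od: float) -> float:
    """Linearly interpolate an optical density onto the standard curve."""
    pts = sorted(STANDARDS)
    if od <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if od <= x1:
            return y0 + (od - x0) * (y1 - y0) / (x1 - x0)
    return pts[-1][1]  # clamp above the highest standard

# A reading midway between the 0.20 and 0.50 standards maps midway in amount:
print(round(od_to_activity(0.35), 3))  # 70.0
```

In practice the OD-activity relationship is nonlinear over wide ranges, so real calibrations use more standards or a fitted curve rather than piecewise-linear interpolation.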

  14. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives, solar in particular, that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative, especially with nonscience students or the general public, is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.
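The "some 10,000 times" figure can be checked with exactly the kind of back-of-the-envelope arithmetic the talk advocates; the round numbers below (total sunlight intercepted by Earth and world primary energy use) are common textbook estimates assumed here, not taken from the talk:

```python
# Back-of-the-envelope check of the ~10,000x solar-to-consumption ratio.
# Assumed round numbers: sunlight intercepted by Earth ~1.73e17 W,
# world primary energy consumption ~18 TW.

SOLAR_INPUT_W = 1.73e17  # total solar power intercepted by Earth
HUMAN_USE_W   = 1.8e13   # world primary energy consumption, ~18 TW

ratio = SOLAR_INPUT_W / HUMAN_USE_W
print(f"Solar input is ~{ratio:,.0f}x human consumption")
```

The result is of order 10^4, consistent with the figure quoted in the abstract.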

  15. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  16. Shotgun microbial profiling of fossil remains

    DEFF Research Database (Denmark)

    Der Sarkissian, Clio; Ermini, Luca; Jónsson, Hákon

    2014-01-01

    the specimen of interest, but instead reflect environmental organisms that colonized the specimen after death. Here, we characterize the microbial diversity recovered from seven c. 200- to 13 000-year-old horse bones collected from northern Siberia. We use a robust, taxonomy-based assignment approach...... to identify the microorganisms present in ancient DNA extracts and quantify their relative abundance. Our results suggest that molecular preservation niches exist within ancient samples that can potentially be used to characterize the environments from which the remains are recovered. In addition, microbial...... community profiling of the seven specimens revealed site-specific environmental signatures. These microbial communities appear to comprise mainly organisms that colonized the fossils recently. Our approach significantly extends the amount of useful data that can be recovered from ancient specimens using...

  17. Genome shotgun sequencing and development of microsatellite ...

    African Journals Online (AJOL)

    Analysis of the gerbera genome DNA ('Raon') general library showed that sequences of (AT), (AG), (AAG) and (AAT) repeats appeared most often, whereas (AC), (AAC) and (ACC) were the least frequent. Primer pairs were designed for 80 loci. Only eight primer pairs produced reproducible polymorphic bands in the 28 ...

  18. Genome shotgun sequencing and development of microsatellite ...

    African Journals Online (AJOL)

    ADP

    2012-04-10

    Apr 10, 2012 ... useful for investigating genetic diversity and differentiation in gerbera. Key words: ... However, this method had a disadvantage: it could not .... PCR product. PCR was ..... advantages, SSR markers had not been developed or ...

  19. Quantitative and Selective Analysis of Feline Growth Related Proteins Using Parallel Reaction Monitoring High Resolution Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Mårten Sundberg

    Full Text Available Today immunoassays are widely used in veterinary medicine, but the lack of species-specific assays often necessitates the use of assays developed for human applications. Mass spectrometry (MS) is an attractive alternative due to its high specificity and versatility, allowing for species-independent analysis. Targeted MS-based quantification methods are valuable complements to large-scale shotgun analysis. A method referred to as parallel reaction monitoring (PRM), implemented on Orbitrap MS, has lately been presented as an excellent alternative to more traditional selected reaction monitoring/multiple reaction monitoring (SRM/MRM) methods. The insulin-like growth factor (IGF) system is not well described in the cat, but there are indications of important differences between cats and humans. In feline medicine IGF-I is mainly analyzed for the diagnosis of growth hormone disorders but also for research, while the other proteins in the IGF system are not routinely analyzed within clinical practice. Here, a PRM method for quantification of IGF-I, IGF-II, IGF binding protein (IGFBP)-3 and IGFBP-5 in feline serum is presented. Selective quantification was supported by the use of a newly launched internal standard named QPrEST™. Homology searches demonstrated the possibility of using this standard of human origin for quantification of the targeted feline proteins. Excellent quantitative sensitivity at the attomol/μL (pM) level and selectivity were obtained. As the presented approach is very generic, we show that high-resolution mass spectrometry in combination with PRM and QPrEST™ internal standards is a versatile tool for protein quantitation across multiple species.
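Quantification against a spiked heavy internal standard such as QPrEST™ reduces to a peak-area ratio times the known spiked amount; a minimal sketch with illustrative numbers (the function name and values are not from the paper):

```python
# Sketch of stable-isotope-dilution quantification as used in targeted PRM:
# the endogenous ("light") peptide amount is inferred from its peak-area
# ratio to a spiked heavy-labeled counterpart of known concentration.

def endogenous_amount(light_area: float, heavy_area: float,
                      spiked_standard_amol_per_ul: float) -> float:
    """Endogenous peptide concentration from light/heavy PRM peak areas."""
    return (light_area / heavy_area) * spiked_standard_amol_per_ul

# Light peptide area 2.4e6, heavy standard area 1.2e6, 50 attomol/uL spiked:
print(endogenous_amount(2.4e6, 1.2e6, 50.0))  # 100.0 attomol/uL
```

Because the heavy standard co-elutes and fragments identically to the light peptide, the ratio cancels most run-to-run instrument variability, which is what makes the approach species-independent.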

  20. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  1. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  2. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  3. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based

  4. Quantity in Modern Icelandic (La quantité en islandais moderne)

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of modern Icelandic. From the phonological point of view, it seems that nothing new can be expected, the theoretical possibilities having been practically exhausted, as we noted in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is without doubt the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is nevertheless still premature to speak of true quantity zones, since neither their limits nor their geographical extent are known.

  5. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to representing relationships among quantities and provide examples of problems and their solutions.

  6. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  7. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  8. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each reaction, where exponential amplification is no longer being achieved. This approach can result in different end-product (amplicon) quantities being generated. In contrast, with quantitative, or real-time, PCR, quantification of the amplicon is performed not at the end of the reaction but during exponential amplification, where theoretically each cycle results in a doubling of product. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later.
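The doubling-per-cycle reasoning described here translates directly into the standard comparative-Ct calculation; a minimal sketch (the function name, and the assumption of perfect amplification efficiency, are illustrative):

```python
# Sketch of Ct-based relative quantification: assuming each cycle doubles
# the product, a target crossing the fluorescence threshold n cycles earlier
# than a reference is 2**n times more abundant.

def relative_abundance(ct_target: float, ct_reference: float,
                       efficiency: float = 2.0) -> float:
    """Fold abundance of target relative to reference from their Ct values."""
    return efficiency ** (ct_reference - ct_target)

# A microbe detected at Ct 22 versus Ct 25 for a reference is 8x more abundant:
print(relative_abundance(22.0, 25.0))  # 8.0
```

Real assays calibrate the per-cycle efficiency (typically between 1.8 and 2.0) from a standard curve rather than assuming an exact doubling.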

  9. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.

  10. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  11. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  12. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a method of nondestructive testing that furnishes quantitative information, permitting the detection and accurate localization of defects, internal dimension measurement, and the measurement and mapping of the density distribution. CT technology is highly versatile, presenting no restrictions in relation to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  13. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    The author deduces the equation relating coupler frequency deviation Δf and coupling coefficient β, rather than only giving the adjustment direction in the process of matching the coupler, on the basis of a coupling-cavity chain equivalent-circuit model. According to this equation, automatic measurement and quantitative display are realized in a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  14. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method of quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of teaching efficiency within one group of students and comparison of teaching efficiency across two or more groups. As basic characteristics of teaching efficiency, heterogeneity, stability and total variability indices, both for a single group and for comparing different groups, are used. The method is easy to use and permits ranking of the results of teaching review which...

  15. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The sizes of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  16. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, by positron CT, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out. The greatest merit of using positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, the observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. The quantitative measurement of the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The movement model of the mapping tracer for central nervous receptors is discussed. The quantitative analysis using a steady movement model, the measurement of dopamine receptors by the reference method, the measurement of D 2 receptors using 11C-Raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  17. Experimental design and data-analysis in label-free quantitative LC/MS proteomics: A tutorial with MSqRob.

    Science.gov (United States)

    Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven

    2018-01-16

    Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adoption of advanced peptide-based models, resulting in higher-quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. If compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches, as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle, are discussed. (orig.)

  19. Quantitative Trait Loci in Inbred Lines

    NARCIS (Netherlands)

    Jansen, R.C.

    2001-01-01

    Quantitative traits result from the influence of multiple genes (quantitative trait loci) and environmental factors. Detecting and mapping the individual genes underlying such 'complex' traits is a difficult task. Fortunately, populations obtained from crosses between inbred lines are relatively

  20. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  1. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  2. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs such as:Comparative approaches (graph similarity or distance)Graph measures to characterize graphs quantitat

  3. Comprehensive and quantitative profiling of lipid species in human milk, cow milk and a phospholipid-enriched milk formula by GC and MS/MSALL.

    Science.gov (United States)

    Sokol, Elena; Ulven, Trond; Færgeman, Nils J; Ejsing, Christer S

    2015-06-01

    Here we present a workflow for in-depth analysis of milk lipids that combines gas chromatography (GC) for fatty acid (FA) profiling and a shotgun lipidomics routine termed MS/MS ALL for structural characterization of molecular lipid species. To evaluate the performance of the workflow, we performed a comparative lipid analysis of human milk, cow milk, and Lacprodan® PL-20, a phospholipid-enriched milk protein concentrate for infant formula. The GC analysis showed that human milk and Lacprodan have a similar FA profile, with higher levels of unsaturated FAs as compared to cow milk. In-depth lipidomic analysis by MS/MS ALL revealed that each type of milk sample comprised a distinct composition of molecular lipid species. Lipid class composition showed that human and cow milk contain a higher proportion of triacylglycerols (TAGs) as compared to Lacprodan. Notably, the MS/MS ALL analysis demonstrated that the similar FA profile of human milk and Lacprodan determined by GC analysis is attributable to the composition of individual TAG species in human milk and glycerophospholipid species in Lacprodan. Moreover, the analysis of TAG molecules in Lacprodan and cow milk showed a high proportion of short-chain FAs that could not be monitored by GC analysis. The results presented here show that complementary GC and MS/MS ALL analysis is a powerful approach for characterization of molecular lipid species in milk and milk products. Milk lipid analysis is routinely performed using gas chromatography. This method reports the total fatty acid composition of all milk lipids, but provides no structural or quantitative information about individual lipid molecules in milk or milk products. Here we present a workflow that integrates gas chromatography for fatty acid profiling and a shotgun lipidomics routine termed MS/MS ALL for structural analysis and quantification of molecular lipid species. We demonstrate the efficacy of this complementary workflow by a comparative analysis of

  4. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix summarizes the basic Jaffe method, as well as a modified, automated version. Also described is a high-performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 by John Wiley & Sons, Inc.

  5. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  6. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  7. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimization of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging, the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality

  8. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

    'Full-text:' In a radiograph, the value of each pixel is related to the material thickness crossed by the X-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of the object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process generates some biases which prevent obtaining the quantitative information: non-uniformity of the X-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In the first section, we propose a complete model of the radiographic image formation process taking account of these biases. In the second section, we present an inversion scheme for this model for a single-material object, which enables obtaining the thickness map of the object crossed by the X-rays. (author)

  9. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing temperature distribution. Analysis and comparison of these indices result in an evaluation of the temperature distribution pattern of the back trunk expected in subjects who are healthy with regard to spinal problems.
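    First-order statistical indices over symmetric ROIs, as described in this record, can be illustrated with a toy calculation. The particular asymmetry index below (absolute difference of ROI mean temperatures) is a hypothetical example for illustration, not one of the study's published indices:

    ```python
    from statistics import mean, stdev

    def roi_indices(temps):
        """First-order statistical descriptors of an ROI's temperatures (deg C)."""
        return {"mean": mean(temps), "std": stdev(temps),
                "min": min(temps), "max": max(temps)}

    def asymmetry_index(left_roi, right_roi):
        """Hypothetical symmetry index: absolute difference of mean
        temperatures between two symmetric ROIs."""
        return abs(mean(left_roi) - mean(right_roi))

    # Toy pixel temperatures for two symmetric lower-back ROIs.
    left = [33.1, 33.4, 33.2, 33.5]
    right = [32.6, 32.9, 32.7, 33.0]
    print(asymmetry_index(left, right))  # ~0.5 degC difference between sides
    ```

    Real thermograms would supply whole pixel arrays per ROI, but the index definitions are unchanged.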

  10. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, enabling low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  11. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.

  12. Quantitative proteomic analysis of cabernet sauvignon grape cells exposed to thermal stresses reveals alterations in sugar and phenylpropanoid metabolism.

    Science.gov (United States)

    George, Iniga S; Pascovici, Dana; Mirzaei, Mehdi; Haynes, Paul A

    2015-09-01

    Grapes (Vitis vinifera) are a valuable fruit crop and wine production is a major industry. Global warming and expanded range of cultivation will expose grapes to more temperature stresses in future. Our study investigated protein level responses to abiotic stresses, with particular reference to proteomic changes induced by the impact of four different temperature stress regimes, including both hot and cold temperatures, on cultured grape cells. Cabernet Sauvignon cell suspension cultures grown at 26°C were subjected to 14 h of exposure to 34 and 42°C for heat stress, and 18 and 10°C for cold stress. Cells from the five temperatures were harvested in biological triplicates and label-free quantitative shotgun proteomic analysis was performed. A total of 2042 non-redundant proteins were identified from the five temperature points. Fifty-five proteins were only detected in extreme heat stress conditions (42°C) and 53 proteins were only detected at extreme cold stress conditions (10°C). Gene Ontology (GO) annotations of differentially expressed proteins provided insights into the metabolic pathways that are involved in temperature stress in grape cells. Sugar metabolism displayed switching between alternative and classical pathways during temperature stresses. Additionally, nine proteins involved in the phenylpropanoid pathway were greatly increased in abundance at extreme cold stress, and were thus found to be cold-responsive proteins. All MS data have been deposited in the ProteomeXchange with identifier PXD000977 (http://proteomecentral.proteomexchange.org/dataset/PXD000977). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Quantitative proteomic analysis of human testis reveals system-wide molecular and cellular pathways associated with non-obstructive azoospermia.

    Science.gov (United States)

    Alikhani, Mehdi; Mirzaei, Mehdi; Sabbaghian, Marjan; Parsamatin, Pouria; Karamzadeh, Razieh; Adib, Samane; Sodeifi, Niloofar; Gilani, Mohammad Ali Sadighi; Zabet-Moghaddam, Masoud; Parker, Lindsay; Wu, Yunqi; Gupta, Vivek; Haynes, Paul A; Gourabi, Hamid; Baharvand, Hossein; Salekdeh, Ghasem Hosseini

    2017-06-06

    Male infertility accounts for half of the infertility problems experienced by couples. Azoospermia, having no measurable level of sperm in seminal fluid, is one of the known conditions resulting in male infertility. In order to elucidate the complex molecular mechanisms causing male azoospermia, label-free quantitative shotgun proteomics was carried out on testicular tissue specimens from patients with obstructive azoospermia and non-obstructive azoospermia, including maturation arrest (MA) and Sertoli cell only syndrome (SCOS). The abundance of 520 proteins was significantly changed across three groups of samples. We were able to identify several functional biological pathways enriched in azoospermia samples and confirm selected differentially abundant proteins, using multiple histological methods. The results revealed that cell cycle and proteolysis, and RNA splicing were the most significant biological processes impaired by the substantial suppression of proteins related to the aforementioned categories in SCOS tissues. In the MA patient testes, generation of precursor metabolites and energy as well as oxidation-reduction were the most significantly altered processes. Novel candidate proteins identified in this study include key transcription factors, many of which have not previously been shown to be associated with azoospermia. Our findings can provide substantial insights into the molecular regulation of spermatogenesis and human reproduction. The obtained data showed a drastic suppression of proteins involved in spliceosome, cell cycle and proteasome proteins, as well as energy and metabolic production in Sertoli cell only syndrome testis tissue, and to a lesser extent in maturation arrest samples. Moreover, we identified new transcription factors that are highly down-regulated in SCOS and MA patients, thus helping to understand the molecular complexity of spermatogenesis in male infertility. Our findings provide novel candidate protein targets associated

  14. Evaluation of Rice Resistance to Southern Rice Black-Streaked Dwarf Virus and Rice Ragged Stunt Virus through Combined Field Tests, Quantitative Real-Time PCR, and Proteome Analysis.

    Science.gov (United States)

    Wang, Zhenchao; Yu, Lu; Jin, Linhong; Wang, Wenli; Zhao, Qi; Ran, Longlu; Li, Xiangyang; Chen, Zhuo; Guo, Rong; Wei, Yongtian; Yang, Zhongcheng; Liu, Enlong; Hu, Deyu; Song, Baoan

    2017-02-22

    Diseases caused by southern rice black-streaked dwarf virus (SRBSDV) and rice ragged stunt virus (RRSV) considerably decrease grain yield. Therefore, determining rice cultivars with high resistance to SRBSDV and RRSV is necessary. In this study, rice cultivars with high resistance to SRBSDV and RRSV were evaluated through field trials in Shidian and Mangshi county, Yunnan province, China. SYBR Green I-based quantitative real-time polymerase chain reaction (qRT-PCR) analysis was used to quantitatively detect virus gene expression levels in different rice varieties. The following parameters were applied to evaluate rice resistance: acre yield (A.Y.), incidence of infected plants (I.I.P.), virus load (V.L.), disease index (D.I.), and insect quantity (I.Q.) per 100 clusters. Zhongzheyou1 (Z1) and Liangyou2186 (L2186) were considered the most suitable varieties, combining a higher A.Y. with lower I.I.P., V.L., D.I. and I.Q. In order to investigate the mechanism of rice resistance, comparative label-free shotgun liquid chromatography tandem-mass spectrometry (LC-MS/MS) proteomic approaches were applied to comprehensively describe the proteomic basis of the rice varieties' SRBSDV tolerance. Systemic acquired resistance (SAR)-related proteins in Z1 and L2186 may account for the superior resistance of these varieties compared with Fengyouxiangzhan (FYXZ).

  15. Quantitative evaluation of dermatological antiseptics.

    Science.gov (United States)

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
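    The EN 1276 pass criterion described above (at least a 5 log10 reduction within 5 minutes) reduces to simple arithmetic; the sketch below uses illustrative CFU/mL counts, not the study's data.

```python
import math

# Sketch of the EN 1276 pass criterion: an antiseptic passes if it achieves at
# least a 5 log10 reduction in viable count within the exposure period.
# Counts (CFU/mL) are illustrative.

def log10_reduction(count_before, count_after):
    return math.log10(count_before) - math.log10(count_after)

def passes_en1276(count_before, count_after, min_log_reduction=5.0):
    return log10_reduction(count_before, count_after) >= min_log_reduction

print(passes_en1276(1e8, 1e2))  # 6 log10 reduction -> True
print(passes_en1276(1e8, 1e5))  # 3 log10 reduction -> False
```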

  16. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
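    The observed-to-liability-scale relationship that builds on these papers can be sketched with the standard threshold-model conversion h2_liability = h2_observed * K(1 - K) / z^2, where K is the disease prevalence and z the standard normal density at the liability threshold; the numbers below are illustrative.

```python
from statistics import NormalDist

# Sketch of the liability-threshold conversion (Dempster-Lerner/Robertson form)
# built on the James/Reich results: transform heritability estimated on the
# observed 0/1 disease scale to the liability scale. Values are illustrative.

def observed_to_liability_h2(h2_observed, prevalence):
    nd = NormalDist()
    threshold = nd.inv_cdf(1.0 - prevalence)  # liability threshold t
    z = nd.pdf(threshold)                     # standard normal density at t
    return h2_observed * prevalence * (1.0 - prevalence) / z ** 2

# A disease with 10% prevalence and observed-scale h2 of 0.2:
print(round(observed_to_liability_h2(0.2, 0.1), 3))
```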

  17. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune"; "The Temperature on Titan"; "Rocks in the Early Solar System"; "Comets Hitting Planets"; "Ages of Meteorites"; "How Flat are Saturn's Rings?"; "Tides of the Sun and Moon on the Earth"; "The Gliese 581 Solar System"; "Buckets in the Rain"; "How Hot, Bright and Big is Betelgeuse?"; "Bombs and the Sun"; "What Forms Stars?"; "Lifetimes of Cars and Stars"; "The Mass of the Milky Way"; "How Old is the Universe?"; and "Is The Universe Speeding up or Slowing Down?"

  18. Quantitative patterns in drone wars

    Science.gov (United States)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  19. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  20. Quantitative variation in natural populations

    International Nuclear Information System (INIS)

    Parsons, P.A.

    1975-01-01

    Quantitative variation is considered in natural populations using Drosophila as the example. A knowledge of such variation enables its rapid exploitation in directional selection experiments as shown for scutellar chaeta number. Where evidence has been obtained, genetic architectures are in qualitative agreement with Mather's concept of balance for traits under stabilizing selection. Additive genetic control is found for acute environmental stresses, but not for less acute stresses as shown by exposure to ⁶⁰Co γ-rays. D. simulans probably has a narrower ecological niche than its sibling species D. melanogaster associated with lower genetic heterogeneity. One specific environmental stress to which D. simulans is sensitive in nature is ethyl alcohol as shown by winery data. (U.S.)

  1. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current (PEC) testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  2. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training must be considered as well. The KPMG Center of Excellence in Risk Management conference Risk Management Reloaded and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  3. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions, aiming to identify the driving forces behind them. The second Section focuses on the social and public sphere, covering recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  4. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  5. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods, such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux

  6. Quantitative organ visualization using SPECT

    International Nuclear Information System (INIS)

    Kircos, L.T.; Carey, J.E. Jr.; Keyes, J.W. Jr.

    1987-01-01

    Quantitative organ visualization (QOV) was performed using single photon emission computed tomography (SPECT). Organ size was calculated from serial, contiguous ECT images taken through the organ of interest with image boundaries determined using a maximum directional gradient edge finding technique. Organ activity was calculated using ECT counts bounded by the directional gradient, imaging system efficiency, and imaging time. The technique used to perform QOV was evaluated using phantom studies, in vivo canine liver, spleen, bladder, and kidney studies, and in vivo human bladder studies. It was demonstrated that absolute organ activity and organ size could be determined to an accuracy of about ±10%, with total imaging time restricted to less than 45 min, provided the minimum dimensions of the organ are greater than the FWHM of the imaging system and the total radioactivity within the organ of interest exceeds 15 nCi/cc for dog-sized torsos. In addition, effective half-lives of approximately 1.5 hr or greater could be determined
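    The "maximum directional gradient" edge finding mentioned above can be illustrated on a one-dimensional count profile: the organ boundary is placed where the count gradient is steepest. This is a toy sketch, not the authors' implementation; the profile and function name are invented.

```python
# Toy sketch of maximum-gradient boundary finding: along a ray of voxel counts
# crossing an organ edge, place the boundary where the magnitude of the count
# gradient (central differences) is largest. Values are invented.

def max_gradient_index(profile):
    """Index of the steepest change along a 1-D count profile."""
    grads = [abs(profile[i + 1] - profile[i - 1]) / 2.0
             for i in range(1, len(profile) - 1)]
    return 1 + grads.index(max(grads))

# Toy profile: background counts, a steep organ edge, then organ interior.
profile = [10, 12, 11, 60, 110, 112, 111, 109]
print(max_gradient_index(profile))  # 3, the position of the edge voxel
```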

  7. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for quantitative evaluation of vesicoureteral reflux was developed. It allows the determination of (a) the volume of reflux and (b) the volume of the bladder at each point in time during the examination. The QIMCU gives an insight into the dynamics of reflux, reflux volume, and actual bladder volume. The clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% as early as the first five minutes of the examination, but later in the remaining 40%. The maximal reflux was found in only 26% during the first five minutes. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in childhood as well as in adulthood. Because the radiation exposure is low, the method can be recommended for the initial examination and for follow-up studies. (Author)

  8. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos eDiuk

    2012-09-01

    Full Text Available The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology, and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.
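    As a minimal stand-in for the "statistical measure of semantic similarity" used above (the paper's actual metric is not reproduced here), a bag-of-words cosine similarity can be sketched as follows; the example sentences are invented.

```python
import math
from collections import Counter

# Bag-of-words cosine similarity as a minimal, invented stand-in for a
# text-similarity metric; the paper's actual measure is not reproduced here.

def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

print(round(cosine_similarity("know thyself and reflect",
                              "reflect and know the self"), 3))  # 0.671
```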

  9. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, generally these five apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs
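    Two of the indices listed above, facility-wide MIDE and MIDE by job classification, amount to simple grouped means; the sketch below uses invented dose records (person, job, mSv).

```python
from collections import defaultdict

# Sketch of two ALARA indices: the mean individual dose equivalent (MIDE) for
# the whole facility and MIDE broken out by job classification. Dose records
# (person, job, mSv) are invented for illustration.

records = [
    ("a", "operator", 1.2), ("b", "operator", 0.8),
    ("c", "maintenance", 3.0), ("d", "maintenance", 2.0),
    ("e", "health-physics", 0.5),
]

def mide(recs):
    return sum(dose for _, _, dose in recs) / len(recs)

def mide_by_job(recs):
    by_job = defaultdict(list)
    for _, job, dose in recs:
        by_job[job].append(dose)
    return {job: sum(d) / len(d) for job, d in by_job.items()}

print(round(mide(records), 2))              # 1.5
print(mide_by_job(records)["maintenance"])  # 2.5
```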

  10. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Formation of cognitive schemes for plant anatomy concepts relies on processing the qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test, scored with the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test following Marzano, plus the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  11. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel...... with the foot ultrasound scanner reduced precision errors by half (p quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology....

  12. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has been traditionally dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other ways to approach knowledge. Aim To discuss the complementarity of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  13. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of irradiation quality, of backscattering in sample and detector materials, and of sensitivity and fading of the detectors are considered. Furthermore, questions of quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  14. Quantitative PET of liver functions.

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance K_met (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. The standard uptake value (SUV) from a static liver 18F-FDGal PET/CT scan can replace K_met and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K_1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET. PMID:29755841
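    The SUV mentioned above is tissue tracer concentration normalized by injected activity per unit body mass; a minimal sketch, assuming 1 g/mL tissue density and illustrative patient numbers:

```python
# Standardized uptake value (SUV): tissue tracer concentration divided by
# injected activity per unit body mass, assuming 1 g/mL tissue density.
# The patient numbers are illustrative, not from the review.

def suv(tissue_conc_bq_per_ml, injected_bq, body_weight_g):
    return tissue_conc_bq_per_ml / (injected_bq / body_weight_g)

# 200 MBq injected into a 70 kg patient; a liver region reads 8 kBq/mL:
print(round(suv(8_000.0, 200e6, 70_000.0), 2))  # 2.8
```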

  16. Validating quantitative precipitation forecast for the Flood ...

    Indian Academy of Sciences (India)

    In order to issue an accurate warning for flood, a better or appropriate quantitative forecasting of precipitation is required. In view of this, the present study intends to validate the quantitative precipitation forecast (QPF) issued during southwest monsoon season for six river catchments (basin) under the flood meteorological ...

  17. 78 FR 64202 - Quantitative Messaging Research

    Science.gov (United States)

    2013-10-28

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... comments. Please submit your comments using only one method and identify that it is for the ``Quantitative...

  18. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  19. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can easily secure traceability to the SI unit system, and discussions of its accuracy have also begun. This paper focuses on the literature on advances in quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual cases of quantitative NMR analysis. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method easily prevents contamination of samples and allows samples to be recovered, many reported cases concern the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads, discussions are also progressing on how to adopt it as an official method in various countries around the world. In Japan, the method is listed in the Pharmacopoeia and Japanese Standard of Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)
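    The internal reference method described above rests on the standard qNMR purity equation; a sketch with invented masses and integrals (symbols: I = integrated area, N = proton count, M = molar mass, m = weighed mass):

```python
# Internal-reference qNMR purity equation (standard form):
#   P_s = (I_s/I_ref) * (N_ref/N_s) * (M_s/M_ref) * (m_ref/m_s) * P_ref
# I = integrated area, N = protons per signal, M = molar mass, m = weighed mass.
# All numeric values below are invented for illustration.

def qnmr_purity(i_s, i_ref, n_s, n_ref, mm_s, mm_ref, mass_s, mass_ref, purity_ref):
    return (i_s / i_ref) * (n_ref / n_s) * (mm_s / mm_ref) * (mass_ref / mass_s) * purity_ref

p = qnmr_purity(i_s=1.0, i_ref=1.0, n_s=2, n_ref=2,
                mm_s=180.16, mm_ref=180.16,
                mass_s=10.0, mass_ref=10.0, purity_ref=0.999)
print(p)  # 0.999 when every ratio is 1
```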

  20. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV X rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  1. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  2. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  3. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...
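    The classical twin design mentioned above is often introduced via Falconer's approximations, h2 = 2(rMZ - rDZ), c2 = 2rDZ - rMZ, e2 = 1 - rMZ; the correlations below are illustrative, and real analyses use the structural-equation models the abstract describes.

```python
# Falconer's approximations for the classical twin design (illustrative
# correlations; real analyses fit the structural models the abstract describes):
#   h2 = 2(rMZ - rDZ), c2 = 2*rDZ - rMZ, e2 = 1 - rMZ

def falconer(r_mz, r_dz):
    h2 = 2.0 * (r_mz - r_dz)  # additive genetic share
    c2 = 2.0 * r_dz - r_mz    # shared-environment share
    e2 = 1.0 - r_mz           # unique environment (plus measurement error)
    return h2, c2, e2

h2, c2, e2 = falconer(r_mz=0.8, r_dz=0.5)
print(round(h2, 3), round(c2, 3), round(e2, 3))  # 0.6 0.2 0.2
```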

  4. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  5. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. By setting the collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, while neutrons transmitted through the object without interaction could reach it. The image formed by purely transmitted neutrons gives quantitative information. Two honeycombs, coated with boron nitride and gadolinium oxide respectively, were prepared and evaluated for quantitative application. The relation between the neutron total cross section and the attenuation coefficient confirmed that they were in fairly good agreement. Application to quantitative computed tomography was also conducted successfully. The new neutron radiography method, which uses the neutron-absorbing honeycomb collimator to eliminate scattered neutrons, remarkably improved the quantitativeness of neutron radiography and computed tomography. (author)
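
    The relation checked in this record between the neutron total cross section and the attenuation coefficient is the standard one, mu = n * sigma_tot, with transmission following the Beer-Lambert law. A minimal sketch, using illustrative numbers rather than values from the paper:

```python
import math

def attenuation_coefficient(number_density_per_cm3, cross_section_barn):
    """Macroscopic attenuation coefficient mu = n * sigma, in 1/cm.

    1 barn = 1e-24 cm^2.
    """
    return number_density_per_cm3 * cross_section_barn * 1e-24

def transmission(mu_per_cm, thickness_cm):
    """Beer-Lambert fraction of purely transmitted (unscattered) neutrons."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative values only: a water-like molecular number density and a
# thermal-neutron total cross section of ~100 barn per molecule.
mu = attenuation_coefficient(3.3e22, 100.0)   # -> 3.3 per cm
frac = transmission(mu, 0.5)                  # ~19% of the beam survives 5 mm
```

    Only transmitted neutrons obey this law; scattered neutrons reaching the detector inflate the apparent transmission, which is exactly what the honeycomb collimator suppresses.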

  6. Induced mutations for quantitative traits in rice

    International Nuclear Information System (INIS)

    Chakrabarti, B.N.

    1974-01-01

    The characteristics and frequency of micro-mutations induced in quantitative traits by radiation treatment, and the extent of heterozygotic effects of different recessive chlorophyll-mutant genes on quantitative traits, are presented. Mutagenic treatments increased the variance for quantitative traits in all cases, although the magnitude of the increase varied depending on the treatment and the selection procedure adopted. The overall superiority of the chlorophyll-mutant heterozygotes over the corresponding wild-type homozygotes, noted in two consecutive seasons, was not observed when these were grown at a high level of nitrogen fertiliser. (author)

  7. Quantitative determination of uranium by SIMS

    International Nuclear Information System (INIS)

    Kuruc, J.; Harvan, D.; Galanda, D.; Matel, L.; Aranyosiova, M.; Velic, D.

    2008-01-01

    The paper presents results of quantitative measurements of uranium-238 by secondary ion mass spectrometry (SIMS), with alpha spectrometry used as a complementary technique. Samples with known specific activity of uranium-238 were prepared by electrodeposition from an aqueous solution of UO 2 (NO 3 ) 2 ·6H 2 O. We applied SIMS to quantitative analysis, searched for a correlation between the intensity obtained from SIMS and the activity of uranium-238 as a function of the surface weight, and assessed the possibility of using SIMS in the quantitative analysis of environmental samples. The obtained results and correlations, together with measurements of two real samples, are presented in this paper. (authors)
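
    A calibration between SIMS secondary-ion intensity and alpha-spectrometric activity, as sought in this record, is in its simplest form an ordinary least-squares line. A hedged sketch with made-up calibration points (the function and all numbers are illustrative, not from the paper):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical calibration points: SIMS ion intensity (counts/s) against
# alpha-spectrometry activity of U-238 (Bq); values are illustrative only.
intensity = [120.0, 260.0, 495.0, 1010.0]
activity = [0.5, 1.0, 2.0, 4.0]
slope, intercept = linear_fit(intensity, activity)

# Estimate the activity of an unknown sample from its measured intensity.
unknown = slope * 700.0 + intercept
```

    Once such a line is established (and its scatter understood), an activity estimate follows from an intensity measurement alone, which is the practical appeal for environmental samples.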

  8. Quantitation of isobaric phosphatidylcholine species in human plasma using a hybrid quadrupole linear ion-trap mass spectrometer

    Czech Academy of Sciences Publication Activity Database

    Žáček, Petr; Bukowski, M.; Rosenberger, T. A.; Picklo, M.

    2016-01-01

    Roč. 57, č. 12 (2016), s. 2225-2234 ISSN 0022-2275 Institutional support: RVO:61388963 Keywords : shotgun lipidomics * triple quadrupole/ion-trap * human blood plasma * phosphatidylcholines Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.810, year: 2016 http://www.jlr.org/content/57/12/2225.full

  9. Quantitative traits in wheat (Triticum aestivum L.)

    African Journals Online (AJOL)

    MSS

    2012-11-13

    Nov 13, 2012 … Of the quantitative traits in wheat, spike length, number of spikes per m², grain mass per spike, number … design with four liming variants along with three replications, in which the experimental field … The sampling was done.

  10. Quantitative Fundus Autofluorescence in Recessive Stargardt Disease

    OpenAIRE

    Burke, Tomas R.; Duncker, Tobias; Woods, Russell L.; Greenberg, Jonathan P.; Zernant, Jana; Tsang, Stephen H.; Smith, R. Theodore; Allikmets, Rando; Sparrow, Janet R.; Delori, François C.

    2014-01-01

    Quantitative fundus autofluorescence (qAF) is significantly increased in Stargardt disease, consistent with previous reports of increased RPE lipofuscin. QAF will help to establish genotype-phenotype correlations and may serve as an outcome measure in clinical trials.

  11. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  12. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy, an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both the spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include the characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. A Quantitative Technique for Beginning Microscopists.

    Science.gov (United States)

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  14. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
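
    The 5 x 5 risk matrix mentioned in this abstract can be sketched as a simple scoring function; the thresholds and register entries below are illustrative assumptions, not a standard:

```python
def risk_level(probability, consequence):
    """Bin a risk into a qualitative level from 1-5 probability and
    consequence scores, as in a conventional 5 x 5 risk matrix.
    Thresholds are illustrative only."""
    assert 1 <= probability <= 5 and 1 <= consequence <= 5
    score = probability * consequence
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# A hypothetical project risk register: name -> (probability, consequence).
register = {
    "late hardware delivery": (4, 3),
    "key engineer leaves": (2, 4),
    "minor documentation gap": (2, 2),
}
levels = {name: risk_level(p, c) for name, (p, c) in register.items()}
```

    The paper's point is precisely that decisions are made long before (and after) scores like these exist; the sketch only shows what the quantification step itself looks like.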

  15. Quantitative data extraction from transmission electron micrographs

    International Nuclear Information System (INIS)

    Sprague, J.A.

    1982-01-01

    The discussion will cover an overview of quantitative TEM, the digital image analysis process, coherent optical processing, and finally a summary of the author's views on potentially useful advances in TEM image processing

  16. Quantitative Ability as Correlates of Students' Academic ...

    African Journals Online (AJOL)

    Nekky Umera

    The introduction of quantitative topics into the secondary school economics curriculum has … since the quality of education at any level is highly dependent on the quality and dedication of … Ibadan: Constellations Books, 466-481. Anderson …

  17. Laboratory technique for quantitative thermal emissivity ...

    Indian Academy of Sciences (India)

    Emission of radiation from a sample occurs due to thermal vibration of its … Quantitative thermal emissivity measurements of geological samples … spectral mixture modeling: A new analysis of rock and soil types at the Viking …

  18. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  19. Qualitative vs. quantitative atopic dermatitis criteria

    DEFF Research Database (Denmark)

    Andersen, R M; Thyssen, J P; Maibach, H I

    2016-01-01

    This review summarizes historical aspects, clinical expression and pathophysiology leading to coining of the terms atopy and atopic dermatitis, current diagnostic criteria and further explore the possibility of developing quantitative diagnostic criteria of atopic dermatitis (AD) based on the imp...

  20. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...

  1. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  2. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  3. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships…

  4. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  5. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores by chemical analysis techniques is a lengthy process. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  6. Quantitative autoradiography of semiconductor base material

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1983-01-01

    Autoradiographic methods for the quantitative determination of elements of interest in semiconductor technology, and of their distribution in silicon, are described. Whereas the local concentration and distribution of phosphorus were determined with the aid of silver halide films, neutron-induced autoradiography was applied in the case of boron. Silicon disks containing diffused phosphorus, or implanted or diffused boron, were used as standard samples. Different possibilities for the quantitative evaluation of autoradiograms are considered and compared.

  7. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Science begins with the question: what do I want to know? Science becomes science, however, only when this question is justified and an appropriate methodology is chosen for answering it. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for the identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  8. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  9. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are regaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, the establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed using quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  10. Quantitative Appearance Inspection for Film Coated Tablets.

    Science.gov (United States)

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  11. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
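
    A core equation in this area is the linear hillslope diffusion model, dz/dt = D d²z/dx². A minimal explicit finite-difference sketch (grid, parameters, and boundary treatment are illustrative assumptions, not the book's code):

```python
def diffuse_hillslope(z, d, dx, dt, steps):
    """Explicit finite-difference solution of dz/dt = D * d2z/dx2,
    the linear hillslope diffusion model, with fixed-elevation ends.

    Stability of the explicit scheme requires D*dt/dx**2 <= 0.5.
    """
    z = list(z)
    r = d * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this D, dt, dx"
    for _ in range(steps):
        z = [z[0]] + [
            z[i] + r * (z[i + 1] - 2 * z[i] + z[i - 1])
            for i in range(1, len(z) - 1)
        ] + [z[-1]]
    return z

# A sharp scarp relaxes toward a smooth profile; parameters illustrative.
profile = [0, 0, 0, 1, 1, 1]
relaxed = diffuse_hillslope(profile, d=0.01, dx=1.0, dt=1.0, steps=200)
```

    The same pattern (discretize a conservation law, step it forward, watch stability) carries over to the fluvial and glacial models the book treats.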

  12. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question whether psychology is ready to enter the "hard science club" like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  13. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  14. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  15. Quantitative whole body scintigraphy - a simplified approach

    International Nuclear Information System (INIS)

    Marienhagen, J.; Maenner, P.; Bock, E.; Schoenberger, J.; Eilles, C.

    1996-01-01

    In this paper we present investigations on a simplified method of quantitative whole-body scintigraphy using a dual-head LFOV gamma camera and a calibration algorithm that requires no additional attenuation or scatter correction. Validation of this approach with an anthropomorphic phantom as well as in patient studies showed high accuracy in the quantification of whole-body activity (102.8% and 97.72%, respectively); by contrast, organ activities were recovered with errors of up to 12%. The described method can easily be performed using commercially available software packages and is recommended especially for quantitative whole-body scintigraphy in a clinical setting. (orig.) [de

  16. Aspects of quantitative secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Grauer, R.

    1982-05-01

    Parameters which influence the formation of secondary ions by ion bombardment of a solid matrix are discussed. Quantitative SIMS analysis with the help of calibration standards necessitates stringent control of these parameters. This is particularly true of the oxygen partial pressure, which for metal analysis has to be kept constant even under ultra-high vacuum. The performance of the theoretical LTE (Local Thermal Equilibrium) model using internal standards is compared with analysis using external standards. The LTE model does not satisfy the requirements for quantitative analysis. (Auth.)

  17. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to misinterpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant.
For the reproducibility study, a group of 9 soil scientists and 7
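
    The validation step described above, correlating visual observations with laboratory measurements and checking whether the coefficient exceeds 0.3, reduces to computing Pearson's r. A sketch with hypothetical plot data (the numbers are illustrative, not the study's):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical paired scores: visual soil-structure ratings against
# laboratory mean weight diameter (mm) for the same plots.
visual = [1, 2, 2, 3, 4, 4, 5]
lab = [0.8, 1.1, 1.4, 1.9, 2.2, 2.6, 3.1]
r = pearson_r(visual, lab)
```

    In the study's terms, a visual observation "correlates well" when r for pairs like these exceeds 0.3.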

  18. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability; probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering.

  19. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp…
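
    The pedigree-determined correlation (additive relationship) matrix referred to above is conventionally built by the tabular method; a minimal sketch, assuming animals are ordered so that parents precede offspring (this is standard quantitative-genetics bookkeeping, not the paper's MCMC code):

```python
def relationship_matrix(pedigree):
    """Additive genetic relationship matrix A by the tabular method.

    pedigree: list of (sire, dam) index pairs, None for unknown parents;
    animals must be ordered so parents precede offspring.
    """
    n = len(pedigree)
    a = [[0.0] * n for _ in range(n)]
    for i, (s, d) in enumerate(pedigree):
        # Diagonal: 1 plus half the parents' relationship (inbreeding).
        a[i][i] = 1.0 + (0.5 * a[s][d] if s is not None and d is not None else 0.0)
        # Off-diagonals: average of the relationships to the two parents.
        for j in range(i):
            val = 0.0
            if s is not None:
                val += 0.5 * a[j][s]
            if d is not None:
                val += 0.5 * a[j][d]
            a[i][j] = a[j][i] = val
    return a

# Two unrelated founders and their offspring: the offspring is related
# 0.5 to each parent and has an inbreeding coefficient of 0.
a = relationship_matrix([(None, None), (None, None), (0, 1)])
```

    It is this matrix (dense itself, but with a sparse inverse) whose dimension grows with the number of animals and motivates the MCMC strategies the paper studies.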

  20. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are: internal standard, known additions, double dilution, external standard, direct comparison, diffraction absorption, and ratio of slopes.
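
    Of the methods listed, the internal-standard method is the classic route: the intensity ratio of an analyte line to a spiked standard line is proportional to their weight-fraction ratio, I_a / I_s = k * (w_a / w_s). A sketch with illustrative intensities (not data from the paper):

```python
def weight_fraction_internal_standard(i_analyte, i_standard, w_standard, k):
    """Internal-standard quantitative phase analysis:
    I_a / I_s = k * (w_a / w_s)  =>  w_a = (I_a / I_s) * w_s / k.
    The constant k comes from a calibration mixture of known composition.
    """
    return (i_analyte / i_standard) * w_standard / k

# Calibrate k from a mixture with a known 20% analyte and 10% spike
# (peak intensities here are made up for illustration)...
k = (150.0 / 100.0) * 0.10 / 0.20
# ...then quantify an unknown spiked with the same 10% standard.
w = weight_fraction_internal_standard(90.0, 100.0, 0.10, k)
```

    The ratio cancels matrix absorption effects, which is why the internal standard survives as the reference method against which the others in the list are compared.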

  1. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
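
    The measurement described above amounts to a finite-difference estimate of the field: each voltage difference over a known antenna separation approximates E ~ -dV/dx along that dimension. A minimal sketch with illustrative readings (not from the patent):

```python
def field_components(voltages, positions):
    """Approximate the electric field along one dimension as the voltage
    difference between adjacent antennas divided by their known
    separation (E ~ -dV/dx)."""
    return [
        -(voltages[i + 1] - voltages[i]) / (positions[i + 1] - positions[i])
        for i in range(len(voltages) - 1)
    ]

# Illustrative readings: a uniform 100 V/m field sampled by three
# antennas spaced 0.5 m apart.
e = field_components([0.0, -50.0, -100.0], [0.0, 0.5, 1.0])
```

    The plurality of such quotients over the array is what quantitatively describes the field in the region.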

  2. Quantitative Mapping of Large Area Graphene Conductance

    DEFF Research Database (Denmark)

    Buron, Jonas Christian Due; Petersen, Dirch Hjorth; Bøggild, Peter

    2012-01-01

    We present quantitative mapping of large area graphene conductance by terahertz time-domain spectroscopy and micro four point probe. We observe a clear correlation between the techniques and identify the observed systematic differences to be directly related to imperfections of the graphene sheet...

  3. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…

  4. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  5. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  6. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  7. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very highdimensional but with a sp...

  8. Proteomic approaches for quantitative cancer cell signaling

    DEFF Research Database (Denmark)

    Voellmy, Franziska

    studies in an effort to contribute to the study of signaling dynamics in cancer systems. This thesis is divided into two parts. Part I begins with a brief introduction in the use of omics in systems cancer research with a focus on mass spectrometry as a means to quantitatively measure protein...

  9. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  10. Quantitative multiplex detection of pathogen biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel, structure with multi-sensor elements per channel.

  11. Quantitative angiography after directional coronary atherectomy

    NARCIS (Netherlands)

    P.W.J.C. Serruys (Patrick); V.A.W.M. Umans (Victor); B.H. Strauss (Bradley); R-J. van Suylen (Robert-Jan); M.J.B.M. van den Brand (Marcel); H. Suryapranata (Harry); P.J. de Feyter (Pim); J.R.T.C. Roelandt (Jos)

    1991-01-01

    OBJECTIVE: To assess by quantitative analysis the immediate angiographic results of directional coronary atherectomy. To compare the effects of successful atherectomy with those of successful balloon dilatation in a series of patients with matched lesions. DESIGN: Case series.

  12. Deforestation since independence: A quantitative assessment of ...

    African Journals Online (AJOL)

    Deforestation since independence: A quantitative assessment of four decades of land-cover change in Malawi. ... pressure and demographic factors are important predictors of deforestation rate within our study area. Keywords: afforestation, Africa, deforestation, drivers, land-use change, reforestation, rural, urban ...

  13. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (¹²³I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using ¹²³I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired from a phantom containing a hot sphere of ¹²³I activity in a lower-level background ¹²³I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes.
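    The ML-EM algorithm named in this abstract has a compact multiplicative update, lam <- lam * Aᵀ(y / (A·lam)) / Aᵀ1. A toy 1-D emission example follows; the system matrix and count data are invented for illustration, not the study's phantom data.

```python
# Toy 1-D ML-EM emission reconstruction (illustrative, not the authors'
# implementation). A is the system matrix, y the measured projection counts.

def mlem(A, y, iters=50):
    n = len(A[0])
    lam = [1.0] * n                      # uniform initial activity estimate
    # sensitivity image: column sums of A (A^T 1)
    sens = [sum(A[i][j] for i in range(len(A))) for j in range(n)]
    for _ in range(iters):
        # forward project current estimate
        proj = [sum(A[i][j] * lam[j] for j in range(n)) for i in range(len(A))]
        # measured-to-estimated ratio, backprojected
        ratio = [y[i] / proj[i] for i in range(len(A))]
        back = [sum(A[i][j] * ratio[i] for i in range(len(A))) for j in range(n)]
        # multiplicative update
        lam = [lam[j] * back[j] / sens[j] for j in range(n)]
    return lam

# Two pixels, two projections; true activity [3, 7] with an identity system
A = [[1.0, 0.0], [0.0, 1.0]]
y = [3.0, 7.0]
print([round(x, 3) for x in mlem(A, y)])   # [3.0, 7.0]
```

    Attenuation or scatter compensation would enter through the entries of A; the update rule itself is unchanged.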

  14. Values in Qualitative and Quantitative Research

    Science.gov (United States)

    Duffy, Maureen; Chenail, Ronald J.

    2008-01-01

    The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…

  15. 78 FR 52166 - Quantitative Messaging Research

    Science.gov (United States)

    2013-08-22

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  16. Quantitative grading of store separation trajectories

    CSIR Research Space (South Africa)

    Jamison, Kevin A

    2017-09-01

    Full Text Available. This paper describes the development of an automated analysis process and software that can run a multitude of separation scenarios. A key enabler for this software is the development of a quantitative grading algorithm that scores the outcome of each release...

  17. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  18. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  19. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied for the quantitative analysis through X-ray diffraction patterns of pigments and extenders mixtures, frequently used in paint industry. The results obtained have shown the usefulness of these methods, but still ask for improving their accuracy. (Author) [pt

  20. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  1. Quantitative multiplex detection of pathogen biomarkers

    Science.gov (United States)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel, structure with multi-sensor elements per channel.

  2. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  3. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  4. Quantitative muscle ultrasonography in amyotrophic lateral sclerosis.

    NARCIS (Netherlands)

    Arts, I.M.P.; Rooij, F.G. van; Overeem, S.; Pillen, S.; Janssen, H.M.; Schelhaas, H.J.; Zwarts, M.J.

    2008-01-01

    In this study, we examined whether quantitative muscle ultrasonography can detect structural muscle changes in early-stage amyotrophic lateral sclerosis (ALS). Bilateral transverse scans were made of five muscles or muscle groups (sternocleidomastoid, biceps brachii/brachialis, forearm flexor group,

  5. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Pieters, W.; Arnold, F.; Stoelinga, M.I.A.

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Therefore, penetration testing has thus far been used as a qualitative research method. To enable quantitative approaches to security risk management,
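    The item-response-theory idea behind this line of work can be sketched with the simplest IRT model, the logistic (Rasch) curve: the probability that an attacker of ability theta defeats a control of difficulty b. The parameter values below are illustrative, not taken from the paper.

```python
# Rasch-model sketch of quantitative penetration testing: success
# probability as a logistic function of attacker ability minus control
# difficulty. Parameter values are invented for illustration.
import math

def p_success(theta, b):
    """Rasch model: P(success) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A skilled attacker (theta = 2.0) vs. a weak control (b = 0.0)
print(round(p_success(2.0, 0.0), 3))   # 0.881
# The same attacker vs. a hardened control (b = 3.0)
print(round(p_success(2.0, 3.0), 3))   # 0.269
```

    Fitting theta and b from observed attack outcomes is what turns penetration test results into quantitative risk inputs.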

  6. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    and A G DE WET. Department of Mathematical Statistics, University of Port Elizabeth. Accepted: May 1978. ABSTRACT. Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both.

  7. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  8. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code in coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  9. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND, B.M.; BENNETT, P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ≈ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
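    A common way such gel data are reduced to lesion frequencies is via number-average fragment lengths under a random-breakage assumption: the lesion frequency is the difference of reciprocal average lengths between treated and control DNA. This is a hedged sketch of that general approach, not the authors' exact procedure, and the lengths below are invented.

```python
# Hedged sketch: lesion frequency from number-average fragment lengths
# (random-breakage model). Values are illustrative, not study data.

def lesions_per_mb(ln_treated_kb, ln_control_kb):
    """Lesion frequency = 1/Ln(treated) - 1/Ln(control), per kb,
    converted to lesions per Mb."""
    per_kb = 1.0 / ln_treated_kb - 1.0 / ln_control_kb
    return per_kb * 1000.0

# Control DNA averages 500 kb; after treatment it averages 200 kb
print(round(lesions_per_mb(200.0, 500.0), 2))   # 3.0
```

    The control term corrects for background breaks already present before treatment.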

  10. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  11. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  12. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  13. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  14. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior, which allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior

  15. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  16. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  17. Engaging Business Students in Quantitative Skills Development

    Science.gov (United States)

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  18. Leaderless Covert Networks : A Quantitative Approach

    NARCIS (Netherlands)

    Husslage, B.G.M.; Lindelauf, R.; Hamers, H.J.M.

    2012-01-01

    Abstract: Lindelauf et al. (2009a) introduced a quantitative approach to investigate optimal structures of covert networks. This approach used an objective function which is based on the secrecy versus information trade-off these organizations face. Sageman (2008) hypothesized that covert networks

  19. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent difference between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patient's non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures: lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2, p quantitative measurements was obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE indicating the potential for detecting and following renal disease at an earlier stage than the conventional qualitative imaging techniques.
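    The reproducibility metric reported above, the mean absolute percent difference between two scans taken ≥24 h apart, is straightforward to sketch. The exact normalization is not stated in the abstract; the sketch below assumes the difference is taken relative to the mean of the two measurements, and all numbers are invented, not study data.

```python
# Sketch of scan-rescan reproducibility as mean absolute percent
# difference. Normalization by the two-scan mean is an assumption;
# the values are illustrative.

def abs_percent_diff(a, b):
    """Absolute difference as a percentage of the mean of the two scans."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def mean_apd(scan1, scan2):
    """Average the per-subject percent differences."""
    diffs = [abs_percent_diff(a, b) for a, b in zip(scan1, scan2)]
    return sum(diffs) / len(diffs)

# e.g. ADC values (x10^-3 mm^2/s) for three subjects on two visits
visit1 = [2.10, 2.30, 2.05]
visit2 = [2.00, 2.35, 2.15]
print(round(mean_apd(visit1, visit2), 2))
```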

  20. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  1. Bringing quality and meaning to quantitative data - Bringing quantitative evidence to qualitative observation

    DEFF Research Database (Denmark)

    Karpatschof, Benny

    2007-01-01

    Based on the author's methodological theory defining the distinctive properties of quantitative and qualitative method the article demonstrates the possibilities and advantages of combining the two types of investigation in the same research project. The project being an effect study...

  2. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope based on a smartphone, using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool which can in future be adopted in remote healthcare and medical diagnosis.

  3. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available. In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interest in class discussion and advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  4. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in situations where qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using the Analytic Hierarchy Process (AHP). It then became possible to estimate the detection probability under integrated safeguards that has deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
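    The standard AHP computation referenced here derives priority weights as the principal eigenvector of a pairwise comparison matrix. A minimal sketch by power iteration; the comparison matrix is a generic illustrative example, not the payoff quantification from the paper.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix
# via power iteration (principal eigenvector). The matrix is illustrative.

def ahp_weights(M, iters=100):
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        # multiply by M, then renormalize to sum to 1
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Three criteria; criterion 0 judged 3x as important as 1, 5x as 2, etc.
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [1/5.0, 1/3.0, 1.0]]
print([round(x, 3) for x in ahp_weights(M)])  # approximately [0.637, 0.258, 0.105]
```

    A consistency check (Saaty's consistency ratio) would normally accompany this before the weights are trusted.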

  5. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  6. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available. The mission of state authorities in supervising credit institutions is mostly assimilated with systemic risk prevention. At present, this mission is oriented towards analyzing the risk profile of credit institutions and the mechanisms and systems that, as management tools, provide banks with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of assuring these objectives and thus success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can induce subjectivism and heterogeneity into the rating. The problem can be addressed by complementary use of quantitative techniques such as DEA (Data Envelopment Analysis).

  7. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to counts immediately after exercise. This value is necessary for evaluating redistribution to the ischemic areas in serial imaging, to correct for Tl-201 washout from the myocardium under the assumption that washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial imaging after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
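    The three indices defined in this abstract are simple count ratios, which makes them easy to sketch. The count values below are invented for illustration, not scintigraphic data.

```python
# Sketch of the three Tl-201 indices as defined in the abstract.
# All count values are illustrative.

def washout_factor(counts_delayed, counts_initial):
    """Counts at a delayed time over counts immediately after exercise."""
    return counts_delayed / counts_initial

def vitality_index(counts_roi, counts_max):
    """Tl-201 uptake in a region of interest relative to the maximum uptake."""
    return counts_roi / counts_max

def redistribution_factor(roi_delayed, roi_initial):
    """Delayed vs. initial uptake in the same region of interest."""
    return roi_delayed / roi_initial

print(round(washout_factor(60.0, 100.0), 2))        # 0.6
print(round(vitality_index(45.0, 90.0), 2))         # 0.5
print(round(redistribution_factor(55.0, 50.0), 2))  # 1.1
```

    In practice the redistribution factor would be corrected by the washout factor, as the abstract notes, before interpreting a region as ischemic.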

  8. Nanostructured surfaces investigated by quantitative morphological studies

    International Nuclear Information System (INIS)

    Perani, Martina; Carapezzi, Stefania; Mutta, Geeta Rani; Cavalcoli, Daniela

    2016-01-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed to this end: analysis of the height–height correlation function and determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiO_xN_y, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size. (paper)
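    The height-height correlation function used in such analyses is, in its simplest 1-D form, the mean squared height difference at a given lateral lag, H(r) = <(h(x+r) - h(x))²>. A minimal sketch on a synthetic profile (not the paper's AFM data):

```python
# Sketch of a 1-D height-height correlation function:
# H(r) = <(h(x + r) - h(x))^2>, averaged along the profile.
# The sawtooth profile is synthetic, for illustration only.

def hhcf(profile, r):
    """Mean squared height difference at lag r (in samples)."""
    pairs = [(profile[i + r] - profile[i]) ** 2
             for i in range(len(profile) - r)]
    return sum(pairs) / len(pairs)

# A synthetic sawtooth "surface" with period 4
h = [float(i % 4) for i in range(100)]
print(round(hhcf(h, 1), 3))   # 2.939
print(round(hhcf(h, 4), 3))   # 0.0 (lag equals the period)
```

    On real AFM maps, the lag at which H(r) saturates gives the lateral correlation length that the abstract compares against the mean grain size.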

  9. Quantitative phosphoproteomics to characterize signaling networks

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2012-01-01

    Reversible protein phosphorylation is involved in the regulation of most, if not all, major cellular processes via dynamic signal transduction pathways. During the last decade quantitative phosphoproteomics have evolved from a highly specialized area to a powerful and versatile platform for analyzing protein phosphorylation at a system-wide scale and have become the intuitive strategy for comprehensive characterization of signaling networks. Contemporary phosphoproteomics use highly optimized procedures for sample preparation, mass spectrometry and data analysis algorithms to identify and quantify thousands of phosphorylations, thus providing extensive overviews of the cellular signaling networks. As a result of these developments quantitative phosphoproteomics have been applied to study processes as diverse as immunology, stem cell biology and DNA damage. Here we review the developments...

  10. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1979-01-01

    Although the overall aim of radiobiology is to understand the biological effects of radiation, it also has the implied practical purpose of developing rational measures for the control of radiation exposure in man. The emphasis in this presentation is to show that the enormous effort expended over the years to develop quantitative dose-effect relationships in biochemical and cellular systems, animals, and human beings now seems to be paying off. The pieces appear to be falling into place, and a framework is evolving to utilize these data. Specifically, quantitative risk assessments will be discussed in terms of the cellular, animal, and human data on which they are based; their use in the development of radiation protection standards; and their present and potential impact and meaning in relation to the quantity dose equivalent and its special unit, the rem

  11. Quantitative sputter profiling at surfaces and interfaces

    International Nuclear Information System (INIS)

    Kirschner, J.; Etzkorn, H.W.

    1981-01-01

    The key problem in quantitative sputter profiling, that of a sliding depth scale, has been solved by combined Auger/X-ray microanalysis. By means of this technique and for the model system Ge/Si (amorphous) the following questions are treated quantitatively: the shape of the sputter profiles when sputtering through an interface and the origin of their asymmetry; the precise location of the interface plane on the depth profile; broadening effects due to the limited depth of information and their correction; the origin and amount of bombardment-induced broadening for different primary ions and energies; the depth dependence of the broadening; and basic limits to depth resolution. Comparisons are made to recent theoretical calculations based on recoil mixing in the collision cascade and very good agreement is found

  12. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  13. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have been used only to qualitatively map whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how exactly elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  14. Quantitative indicators of fruit and vegetable consumption

    OpenAIRE

    Dagmar Kozelová; Dana Országhová; Milan Fiľa; Zuzana Čmiková

    2015-01-01

    The quantitative research of the market is often based on surveys and questionnaires which are finding out the behavior of customers in observed areas. Before purchasing process consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity of goods. Consumers' behavior is affected by the factors as: regional gastronomic traditions, price, product appearance, aroma, place of buying, own experience and knowledge, taste preferences as well as specific heal...

  15. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  16. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of the recent developments in quantification of X-ray photoelectron spectroscopy, or ESCA, is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and the precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  17. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both with and without blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  18. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  19. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  20. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources in Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (> 5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)
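The storage-versus-recharge comparison in the abstract can be checked with simple arithmetic. The storage estimate (0.66 million km³) is from the abstract; the annual renewable freshwater figure (~3,900 km³/yr) is an assumed illustrative value, not taken from the record.

```python
# Sketch of the ">100 times annual renewable freshwater" claim.
storage_km3 = 0.66e6          # total groundwater storage, central estimate (abstract)
annual_renewable_km3 = 3.9e3  # assumed annual renewable freshwater for Africa

ratio = storage_km3 / annual_renewable_km3
print(f"storage is ~{ratio:.0f}x annual renewable freshwater")
assert ratio > 100  # consistent with the abstract's claim
```

Under this assumed recharge figure the ratio comes out near 170, comfortably above the abstract's factor of 100.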

  1. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography,defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  2. Radioimmunoassay to quantitatively measure cell surface immunoglobulins

    International Nuclear Information System (INIS)

    Krishman, E.C.; Jewell, W.R.

    1975-01-01

    A radioimmunoassay technique developed to quantitatively measure the presence of immunoglobulins on the surface of cells is described. The amount of immunoglobulins found on different tumor cells varied from 200 to 1140 ng/10⁶ cells. Determination of immunoglobulins on the peripheral lymphocytes obtained from different cancer patients varied between 340 and 1040 ng/10⁶ cells. Cultured tumor cells, on the other hand, were found to contain negligible quantities of human IgG [pt

  3. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the concentration of zinc were derived; then the concentration of other elements was obtained by regarding zinc as an internal standard. As a result, values of the concentration of sulphur for 40 samples agree well with the average value for a typical Japanese and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method, too. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both energy loss of the projectile and self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method
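The internal-standard step described in this record can be sketched as a simple intensity-ratio scaling: once the zinc concentration is known, other elements follow from their X-ray yields relative to zinc. The sensitivity-factor form and all numbers below are illustrative assumptions, not values or constants from the paper.

```python
# Minimal sketch of internal-standard quantification (hypothetical values).
def conc_via_internal_standard(c_zn, i_zn, i_x, k_zn, k_x):
    """Concentration of element x from its intensity ratio to the Zn standard.

    c_zn       : known Zn concentration (e.g. ug/g)
    i_zn, i_x  : measured X-ray intensities for Zn and element x
    k_zn, k_x  : assumed relative sensitivity factors (counts per unit conc.)
    """
    return c_zn * (i_x / i_zn) * (k_zn / k_x)

# Hypothetical hair measurement: Zn at 180 ug/g used to quantify sulphur.
c_s = conc_via_internal_standard(c_zn=180.0, i_zn=9000.0, i_x=45000.0,
                                 k_zn=1.0, k_x=0.02)
print(f"S ~ {c_s:.0f} ug/g")  # → S ~ 45000 ug/g
```

The point of the design is that absolute beam current and detector geometry cancel out of the ratio, which is why only one element (zinc) needs an independently derived concentration.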

  4. Quantitative evaluation of dysphagia using scintigraphy

    International Nuclear Information System (INIS)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae

    1998-01-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration. We studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium at three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively. We calculated 7 quantitative parameters from scintigraphy. According to the videofluoroscopic findings, we divided patients into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration groups. The result of videofluoroscopy revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration could be reduced in all of 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE were also improved after the position change. Aspiration and laryngeal penetration occurred more frequently in thin liquid swallowing than in thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed. We could decrease the chance of aspiration by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes


  6. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  7. Rational quantitative safety goals: a summary

    International Nuclear Information System (INIS)

    Unwin, S.D.; Hayns, M.R.

    1984-08-01

    We introduce the notion of a Rational Quantitative Safety Goal. Such a goal reflects the imprecision and vagueness inherent in any reasonable notion of adequate safety and permits such vagueness to be incorporated into the formal regulatory decision-making process. A quantitative goal of the form "the parameter x, characterizing the safety level of the nuclear plant, shall not exceed the value x₀", for example, is of a non-rational nature in that it invokes a strict binary logic in which the parameter space underlying x is cut sharply into two portions: that containing those values of x that comply with the goal and that containing those that do not. Here, we utilize an alternative form of logic which, in accordance with any intuitively reasonable notion of safety, permits a smooth transition of a safety-determining parameter between the adequately safe and inadequately safe domains. Fuzzy set theory provides a suitable mathematical basis for the formulation of rational quantitative safety goals. The decision-making process proposed here is compatible with current risk assessment techniques and produces results in a transparent and useful format. Our methodology is illustrated with reference to the NUS Corporation risk assessment of the Limerick Generating Station
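The smooth transition between "adequately safe" and "inadequately safe" that this abstract contrasts with a sharp cutoff can be illustrated with a fuzzy membership function. The logistic form, the parameter values, and the choice of core-damage frequency as the safety parameter are all illustrative assumptions, not the authors' formulation.

```python
import math

# Sketch of a fuzzy (non-binary) safety goal: membership in the "adequately
# safe" set falls off smoothly around the nominal goal x0 instead of
# switching from 1 to 0 at a sharp boundary.
def safe_membership(x, x0, width):
    """Degree (0..1) to which safety parameter x satisfies the goal x <~ x0."""
    return 1.0 / (1.0 + math.exp((x - x0) / width))

x0 = 1e-6  # hypothetical goal on, say, core-damage frequency per year
for x in (0.5e-6, 1.0e-6, 2.0e-6):
    print(x, round(safe_membership(x, x0, width=0.25e-6), 3))
```

At x = x0 the membership is exactly 0.5; values well below the goal approach full membership and values well above it approach zero, rather than flipping discontinuously as a crisp criterion would.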

  8. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
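One of the simpler normalization methods discussed in reviews of this kind is total-signal ("sum") normalization, which removes variation in total sample amount by scaling each sample so its summed intensity is equal. A minimal sketch, with an invented two-sample intensity matrix:

```python
import numpy as np

# Total-signal normalization: scale each sample (row) so its summed
# metabolite intensity equals the across-sample mean total.
def sum_normalize(intensity):
    """intensity: (n_samples, n_metabolites) array -> normalized copy."""
    totals = intensity.sum(axis=1, keepdims=True)
    return intensity / totals * totals.mean()

X = np.array([[10.0, 30.0, 60.0],    # sample 1, total 100
              [30.0, 90.0, 180.0]])  # sample 2, same profile, total 300
Xn = sum_normalize(X)
print(Xn)  # the rows become identical once total-amount variation is removed
```

The two samples have identical metabolite profiles but a 3-fold difference in total amount; after normalization the apparent 3-fold concentration differences disappear, which is exactly the artifact the review warns about when normalization is skipped.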

  9. Quantitative trait loci and metabolic pathways

    Science.gov (United States)

    McMullen, M. D.; Byrne, P. F.; Snook, M. E.; Wiseman, B. R.; Lee, E. A.; Widstrom, N. W.; Coe, E. H.

    1998-01-01

    The interpretation of quantitative trait locus (QTL) studies is limited by the lack of information on metabolic pathways leading to most economic traits. Inferences about the roles of the underlying genes with a pathway or the nature of their interaction with other loci are generally not possible. An exception is resistance to the corn earworm Helicoverpa zea (Boddie) in maize (Zea mays L.) because of maysin, a C-glycosyl flavone synthesized in silks via a branch of the well characterized flavonoid pathway. Our results using flavone synthesis as a model QTL system indicate: (i) the importance of regulatory loci as QTLs, (ii) the importance of interconnecting biochemical pathways on product levels, (iii) evidence for “channeling” of intermediates, allowing independent synthesis of related compounds, (iv) the utility of QTL analysis in clarifying the role of specific genes in a biochemical pathway, and (v) identification of a previously unknown locus on chromosome 9S affecting flavone level. A greater understanding of the genetic basis of maysin synthesis and associated corn earworm resistance should lead to improved breeding strategies. More broadly, the insights gained in relating a defined genetic and biochemical pathway affecting a quantitative trait should enhance interpretation of the biological basis of variation for other quantitative traits. PMID:9482823

  10. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way that vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies in English learning. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied. The results suggest that quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and segmented strategies outperform other strategies. The results provide opportunities for researchers and practitioners to reconsider the way of teaching English and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
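The idea of combining word frequency with network structure can be sketched on a toy graph, in the spirit of the "optimized-weight-first" strategy: a word's learning weight grows with its usage frequency and with how many already-learned neighbours it has. The tiny graph, the frequencies, and the weighting formula are all invented for illustration.

```python
# Toy frequency-plus-network learning order (all data hypothetical).
freq = {"the": 100, "cat": 20, "sat": 15, "quantum": 2}
edges = {("the", "cat"), ("cat", "sat"), ("the", "sat")}

def neighbours(w):
    return {b for a, b in edges if a == w} | {a for a, b in edges if b == w}

def next_word(learned):
    def weight(w):
        support = len(neighbours(w) & learned)  # learned neighbours boost a word
        return freq[w] * (1 + support)
    candidates = [w for w in freq if w not in learned]
    return max(candidates, key=weight)

order, learned = [], set()
while len(learned) < len(freq):
    w = next_word(learned)
    order.append(w)
    learned.add(w)
print(order)  # → ['the', 'cat', 'sat', 'quantum']
```

Note how "sat" overtakes the isolated "quantum" despite both being rare relative to "the": its two learned neighbours raise its weight, which is the dynamic-updating effect the abstract describes.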

  11. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract. Background: Increased interest in health care cost containment is focusing attention on the reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, and this has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools, such as definitions, risk estimation, and tracking of patients, for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness, as well as patient-specific spreadsheets for tracking target populations and for evaluating the impact of interventions. Conclusions: The study demonstrated that quantitative tools, including definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, could contribute to the improvement of patient outcomes in hospitals.
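Risk estimation stratified by age and severity of illness, as described in this study, amounts to computing readmission rates per stratum from counts of initial admissions and readmissions. The strata and counts below are invented for illustration; they are not the study's data.

```python
# Sketch of a stratified readmission-rate table (hypothetical counts).
counts = {
    # (age_band, severity): (initial_admissions, readmissions)
    ("18-64", "minor"): (400, 24),
    ("18-64", "major"): (150, 27),
    ("65+",   "minor"): (300, 33),
    ("65+",   "major"): (120, 30),
}

rates = {k: readm / init for k, (init, readm) in counts.items()}
for k, r in sorted(rates.items()):
    print(k, f"{100 * r:.1f}%")
```

Such a table is the basis for the patient-specific tracking spreadsheets the study describes: each new admission is assigned the observed rate of its stratum as its estimated readmission risk.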

  12. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles represented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. In the first article the possible use of the proton spin-lattice relaxation time T₁ in tissue characterization, tumor recognition and monitoring tissue response to radiotherapy is explored. The next article addresses the question whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T₁, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T₁ pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)

  13. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. The technique was developed using MATLAB (Version 6.1)-based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of fibroglandular tissue area divided by the total breast area in the mammogram. The technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I–V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p<0.05). 71.3% of the subjective classification was correctly classified using the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
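The density measure defined in this abstract, fibroglandular area as a percentage of total breast area, can be sketched with simple thresholding on a toy image. The thresholds and the image are assumptions for illustration; the study's actual pipeline (gradient correction plus segmentation in MATLAB) is more involved.

```python
import numpy as np

# Percent density = fibroglandular pixels / breast pixels, on a toy image.
def percent_density(img, breast_thr, fibro_thr):
    breast = img > breast_thr  # breast region vs dark background
    fibro = img > fibro_thr    # bright (fibroglandular) pixels within it
    return 100.0 * fibro.sum() / breast.sum()

img = np.array([[0,  0,  50,  50],
                [0, 60, 200, 210],
                [0, 55, 220,  65]], dtype=float)
print(f"{percent_density(img, breast_thr=10, fibro_thr=150):.1f}%")  # → 37.5%
```

Here 8 pixels exceed the breast threshold and 3 of them exceed the fibroglandular threshold, giving 3/8 = 37.5% density.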

  14. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  15. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. 
Image deconvolution, a very common image-processing algorithm, is one such preprocessing step and should therefore be validated before quantitative analysis.
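The restoration approach described in this abstract, finding an object that reproduces the observed image when convolved with the point-spread function, can be sketched with the classic Richardson-Lucy iteration. The chapter does not name a specific algorithm; Richardson-Lucy is one common choice, shown here as a minimal NumPy/SciPy sketch with toy data:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Iteratively estimate an object that, convolved with the PSF,
    reproduces the observed image (restoration-type deconvolution)."""
    estimate = np.full_like(image, 0.5)
    psf_flipped = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf_flipped, mode="same")
    return estimate

# Toy example: blur a point source with a Gaussian PSF, then restore it.
x = np.arange(15) - 7
g = np.exp(-0.5 * (x / 1.5) ** 2)
psf = np.outer(g, g)
psf /= psf.sum()
obj = np.zeros((31, 31))
obj[15, 15] = 1.0
img = fftconvolve(obj, psf, mode="same")
restored = richardson_lucy(img, psf, iterations=50)
```

Each iteration compares the blurred current estimate with the observed image and multiplicatively reweights the estimate, which is why restoration algorithms can approximately preserve the relative signal levels that quantitative assays depend on.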

  16. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (Head and Neck cancer, Prostate, Breast, Brain, Lung, Liver, Colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  17. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072
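One way to handle the linearity issue the authors raise is to characterize the camera's intensity response against known exposures and invert it before quantification. The power-law (gamma) response model below is an illustrative assumption, not the paper's actual calibration procedure:

```python
import numpy as np

def estimate_gamma(exposures, pixel_values):
    """Fit a power-law response V = k * E**gamma in log-log space
    from a series of known relative exposures."""
    gamma, log_k = np.polyfit(np.log(exposures), np.log(pixel_values), 1)
    return gamma, np.exp(log_k)

def linearize(values, gamma):
    """Invert the power-law response so corrected values scale
    linearly with exposure, as quantitative analysis requires."""
    return np.asarray(values, dtype=float) ** (1.0 / gamma)
```

A usage sketch: photograph a calibrated step target, fit `gamma` from the exposure series, then apply `linearize` to raw pixel values before measuring brightness ratios.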

  18. Quantitative Imaging in Cancer Evolution and Ecology

    Science.gov (United States)

    Grove, Olya; Gillies, Robert J.

    2013-01-01

Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations have recently been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral evolutionary dynamics.

  19. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  20. Quantitative safety goals for the regulatory process

    International Nuclear Information System (INIS)

    Joksimovic, V.; O'Donnell, L.F.

    1981-01-01

The paper offers a brief summary of the current regulatory background in the USA, emphasizing nuclear, related to the establishment of quantitative safety goals as a way to respond to the key issue of 'how safe is safe enough'. General Atomic has taken a leading role in advocating the use of probabilistic risk assessment techniques in the regulatory process. This has led to understanding of the importance of quantitative safety goals. The approach developed by GA is discussed in the paper. It is centred around the definition of quantitative safety regions, termed: design basis, safety margin or design capability, and safety research. The design basis region is bounded by a frequency of 10⁻⁴/reactor-year and consequences of no identifiable public injury; 10⁻⁴/reactor-year is associated with the total projected lifetime of a commercial US nuclear power programme. Events which have a 50% chance of happening are included in the design basis region. In the safety margin region, which extends below the design basis region, protection is provided against some events whose probability of not happening during the expected course of the US nuclear power programme is within the range of 50 to 90%. Setting the lower mean frequency of this region to 10⁻⁵/reactor-year is equivalent to offering 90% assurance that an accident of given severity will not happen. Rare events with a mean frequency below 10⁻⁵ can be predicted to occur. However, accidents predicted to have a probability of less than 10⁻⁶ are 99% certain not to happen at all, and are thus not anticipated to affect public health and safety. The area between 10⁻⁵ and 10⁻⁶ defines the frequency portion of the safety research region. Safety goals associated with individual risk to a maximum-exposed member of the public, general societal risk and property risk are proposed in the paper.
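The assurance figures in this abstract follow from the standard Poisson relation P(at least one event) = 1 − exp(−f·T). A small sketch, assuming the quoted programme lifetime of roughly 10⁴ reactor-years:

```python
import math

def prob_at_least_one(frequency_per_ry, reactor_years):
    """Probability that an event with the given mean frequency occurs
    at least once over the programme lifetime (Poisson model)."""
    return 1.0 - math.exp(-frequency_per_ry * reactor_years)

lifetime = 1e4  # projected commercial US programme lifetime, reactor-years

p_research = prob_at_least_one(1e-5, lifetime)  # about 0.095: ~90% assurance
p_rare = prob_at_least_one(1e-6, lifetime)      # about 0.01: ~99% certain not to occur
```

With f = 10⁻⁵/reactor-year the chance of at least one occurrence over the programme is about 10%, i.e. roughly 90% assurance that no such event happens, matching the boundary of the safety margin region described above.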

  1. Quantitative imaging of turbulent and reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Paul, P.H. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

Quantitative digital imaging, using planar laser light scattering techniques, is being developed for the analysis of turbulent and reacting flows. Quantitative image data, implying both a direct relation to flowfield variables as well as sufficient signal and spatial dynamic range, can be readily processed to yield two-dimensional distributions of flowfield scalars and in turn two-dimensional images of gradients and turbulence scales. Much of the development of imaging techniques to date has concentrated on understanding the requisite molecular spectroscopy and collision dynamics in order to determine how flowfield variable information is encoded into the measured signal. From this standpoint the image is seen as a collection of single-point measurements. The present effort aims at realizing necessary improvements in signal and spatial dynamic range, signal-to-noise ratio and spatial resolution in the imaging system, as well as developing excitation/detection strategies which provide a quantitative measure of particular flowfield scalars. The standard camera used for the study is an intensified CCD array operated in a conventional video format. The design of the system was based on detailed modeling of signal and image transfer properties of fast UV imaging lenses, image intensifiers and CCD detector arrays. While this system is suitable for direct scalar imaging, derived quantities (e.g. temperature or velocity images) require an exceptionally wide dynamic range imaging detector. To apply these diagnostics to reacting flows also requires a very fast shuttered camera. The authors have developed and successfully tested a new type of gated low-light-level detector. This system relies on fast switching of a proximity-focused image diode which is directly fiber-optic coupled to a cooled CCD array. Tests on this new detector show significant improvements in detection limit, dynamic range and spatial resolution as compared to microchannel plate intensified arrays.

  2. Quantitative microanalysis with a nuclear microprobe

    International Nuclear Information System (INIS)

    Themner, Klas.

    1989-01-01

The analytical techniques of particle-induced X-ray emission (PIXE) and Rutherford backscattering (RBS), together with the nuclear microprobe, form a very powerful tool for performing quantitative microanalysis of biological material. Calibration of the X-ray detection system in the microprobe set-up has been performed and the accuracy of the quantitative procedure using RBS for determination of the areal mass density was investigated. The accuracy of the analysis can be affected by alterations in the elemental concentrations during irradiation due to the radiation damage induced by the very intense beams of ionizing radiation. Loss of matrix elements from freeze-dried tissue sections and polymer films has been studied during proton and photon irradiation and the effect on the accuracy discussed. Scanning the beam over an area of the target, e.g. with 32x32 pixels, to produce an elemental map yields a large amount of information, and accurate quantification requires a fast algorithm based on descriptions of the different spectral contributions. The production of continuum X-rays by 2.55 MeV protons has been studied and absolute cross-sections for the bremsstrahlung production from thin carbon and some polymer films determined. For determination of the bremsstrahlung background, knowledge of the amounts of the matrix elements is important, and a fast program for the evaluation of spectra of proton back- and forward scattering from biological samples has been developed. Quantitative microanalysis with the nuclear microprobe has been performed on brain tissue from rats subjected to different pathological conditions. Increases in calcium levels and decreases in potassium levels for animals subjected to cerebral ischaemia and for animals suffering from epileptic seizures were observed coincidentally with or, in some cases, before visible signs of cell necrosis. (author)

  3. Quantitative transmission electron microscopy at atomic resolution

    International Nuclear Information System (INIS)

    Allen, L J; D'Alfonso, A J; Forbes, B D; Findlay, S D; LeBeau, J M; Stemmer, S

    2012-01-01

In scanning transmission electron microscopy (STEM) it is possible to operate the microscope in bright-field mode under conditions which, by the quantum mechanical principle of reciprocity, are equivalent to those in conventional transmission electron microscopy (CTEM). The results of such an experiment, which are in excellent quantitative agreement with theory for specimens up to 25 nm thick, will be presented. This is at variance with the large contrast mismatch (typically a factor of two to five) noted in equivalent CTEM experiments. The implications of this will be discussed.

  4. Quantitative spectrographic determination of zirconium minerals

    International Nuclear Information System (INIS)

    Rocal Adell, M.; Alvarez Gonzalez, F.; Fernandez Cellini, R.

    1958-01-01

The method described in the following report permits the quantitative determination of zirconium in minerals and rocks over a ZrO₂ concentration range of 0.02-100%. Excitation is carried out with a 10 ampere direct-current arc between carbon electrodes, with the sample placed in a crater 2 mm deep. For low concentrations, the sample is diluted with an equal weight of carbon powder and with 1/25 of its weight of Co₃O₄ (internal standard). The lines Zr 2571.4, Co 2585.3 and Co 2587.2 are used. (Author) 6 refs

  5. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  6. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability

  7. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    Science.gov (United States)

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties.
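MR fingerprinting assigns T1 and T2 by matching each measured signal evolution against a precomputed dictionary, typically via a normalized inner product. The sketch below uses toy exponential curves rather than a Bloch-equation simulation, so the dictionary entries and parameter values are illustrative only:

```python
import numpy as np

def match_fingerprint(signal, dictionary, params):
    """Return the (T1, T2) pair whose dictionary entry has the largest
    normalized inner product with the measured signal evolution."""
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signal / np.linalg.norm(signal)
    return params[int(np.argmax(d @ s))]

# Toy dictionary: synthetic "evolutions" for a small grid of (T1, T2) values.
t = np.linspace(0, 2, 200)
params = [(t1, t2) for t1 in (0.3, 0.8, 1.7) for t2 in (0.03, 0.06, 0.15)]
dictionary = np.array([np.exp(-t / t2) - np.exp(-t / t1) for t1, t2 in params])
```

Because the match uses a normalized inner product, it is insensitive to overall signal scaling, which is one reason dictionary matching is robust in practice.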

  8. Chinese legal texts – Quantitative Description

    Directory of Open Access Journals (Sweden)

    Ľuboš GAJDOŠ

    2017-06-01

Full Text Available The aim of the paper is to provide a quantitative description of legal Chinese. This study adopts the approach of corpus-based analysis and shows basic statistical parameters of legal texts in Chinese, namely sentence length, the proportions of parts of speech, etc. The research is conducted on the Chinese monolingual corpus Hanku. The paper also discusses issues of statistical data processing from various corpora, e.g. tokenisation and part-of-speech tagging, and their relevance to the study of register variation.

  9. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  10. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ ' precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys
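The diffuse-interface idea, an order parameter varying smoothly between two phases, can be illustrated with a minimal 1D Allen-Cahn relaxation. This is a generic sketch, not the DICTRA sharp-interface model or the Ni-Al phase-field models used in the paper:

```python
import numpy as np

def allen_cahn_1d(phi, dx=0.5, dt=0.05, kappa=1.0, steps=2000):
    """Explicit-Euler relaxation of a 1D Allen-Cahn equation
    dphi/dt = kappa * d2phi/dx2 - df/dphi with f = (phi^2 - 1)^2 / 4,
    so df/dphi = phi^3 - phi and the bulk phases sit at phi = ±1."""
    phi = phi.copy()
    for _ in range(steps):
        lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
        phi += dt * (kappa * lap - (phi**3 - phi))
        phi[0], phi[-1] = -1.0, 1.0  # pin the two bulk phases at the ends
    return phi

x = np.linspace(-10, 10, 41)
profile = allen_cahn_1d(np.sign(x))  # sharp step relaxes to a smooth interface
```

The initially sharp step relaxes to a tanh-like profile whose width is set by the gradient-energy coefficient `kappa`, which is exactly the diffuse interface that the order-parameter models above track instead of an explicit sharp boundary.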

  11. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing quantitative assessment of exposure risks in the human being, and the several problems that accompany the assessment and introduction of the risk of exposure to high and low LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application to one agent alone, of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents. (orig.) [de

  12. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

"Quantitative Methods for Management and Economics" is specially prepared for MBA students in India and all over the world. It starts from the basics, such that even a beginner without much mathematical sophistication can grasp the ideas, and then moves forward to more complex and professional problems. Thus, both ordinary students and "above average", i.e., bright and sincere, students benefit equally from this book. Since most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  13. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  14. A Quantitative Scale of Oxophilicity and Thiophilicity

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2016-01-01

Oxophilicity and thiophilicity are widely used concepts with no quantitative definition. In this paper, a simple, generic scale is developed that solves issues with reference states and system dependencies and captures empirically known tendencies toward oxygen. This enables a detailed analysis... Ionic bonding is stronger to metals of low electronegativity. Left-side d-block elements with low effective nuclear charges and electronegativities are thus highly oxophilic, as are the f-block elements, not because of their hardness, which is normal, but as a result of their small ionization energies...

  15. Path to development of quantitative safety goals

    International Nuclear Information System (INIS)

    Joksimovic, V.; Houghton, W.J.

    1980-04-01

There is a growing interest in defining numerical safety goals for nuclear power plants as exemplified by an ACRS recommendation. This paper proposes a lower frequency limit of approximately 10⁻⁴/reactor-year for design basis events. Below this frequency, down to a small frequency such as 10⁻⁵/reactor-year, safety margin can be provided by, say, site emergency plans. Accident sequences below 10⁻⁵ should not impact public safety, but it is prudent that safety research programs examine sequences with significant consequences. Once tentatively agreed upon, quantitative safety goals together with associated implementation tools would be factored into regulatory and design processes

  16. Experimental Studies of quantitative evaluation using HPLC

    Directory of Open Access Journals (Sweden)

    Ki Rok Kwon

    2005-06-01

Full Text Available Methods: This study was conducted to carry out quantitative evaluation using HPLC. Content analysis was done using HPLC. Results: According to HPLC analysis, each BVA-1 needle contained approximately 0.36 ㎍ of melittin, and each BVA-2 needle approximately 0.54 ㎍. But the volume of coating was so minute that slight differences exist between individual needles. Conclusion: The above results indicate that bee venom acupuncture can complement the shortcomings of syringe usage as a part of Oriental medicine treatment, but extensive research should be done for further verification.

  17. Quantitative Assessment of the IT Agile Transformation

    Directory of Open Access Journals (Sweden)

    Orłowski Cezary

    2017-03-01

Full Text Available The aim of this paper is to present the quantitative perspective of the agile transformation processes in IT organisations. Agile transformation is a complex challenge for an IT organisation, since it has not been analysed in detail so far: there is no research on the readiness of IT organisations to realise agile transformation processes, and such processes tend to be uncontrolled. Therefore, to minimise the risk of failure in the realisation of transformation processes, it is necessary to monitor them. It is also necessary to identify and analyse such processes to ensure their continuous character.

  18. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Full Text Available Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  19. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  20. Developing quantitative tools for measuring aspects of prisonization

    DEFF Research Database (Denmark)

    Kjær Minke, Linda

    2013-01-01

The article describes and discusses the preparation and completion of a quantitative study among prison officers and prisoners.

  1. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  2. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

This theme issue provides an overview of basic quantitative methods, an in-depth discussion of cutting-edge quantitative analysis approaches, and their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  3. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
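Once the 2x2 counts are extracted, pooled sensitivity and specificity follow directly. The sketch below uses simple aggregation of counts with made-up numbers; the study's actual pooling likely used more sophisticated (e.g. hierarchical bivariate) models:

```python
def pooled_accuracy(studies):
    """Pool 2x2 counts (TP, FP, TN, FN) across studies and return overall
    sensitivity and specificity (naive pooling, not a hierarchical model)."""
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    tn = sum(s[2] for s in studies)
    fn = sum(s[3] for s in studies)
    sensitivity = tp / (tp + fn)  # true-positive rate
    specificity = tn / (tn + fp)  # true-negative rate
    return sensitivity, specificity

# Two hypothetical studies; counts are illustrative only.
sens, spec = pooled_accuracy([(45, 10, 80, 5), (38, 6, 60, 9)])
```

Naive pooling weights each study by its size and ignores between-study heterogeneity, which is why formal meta-analyses prefer hierarchical models; the arithmetic, however, is the same at the per-study level.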

  4. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, a sample's elemental content and concentration can be determined; the analysis is known as Quantitative Spectrographic Analysis. The Quantitative Spectrographic Analysis is carried out in 3 steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit is applied to the data obtained in order to produce a graph. It is then possible to relate the density of a dark spectral line to the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentration of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation and reporting of results is done by means of a computer (PC) and a computer program. The signal-conditioning circuits have the function of delivering TTL (Transistor Transistor Logic) levels to make the communication between the microphotometer and the computer possible. Data calculation is done using a computer program.
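    The calibration and working-curve steps above amount to a least-squares line fit followed by inversion of the fitted curve. A sketch with idealized hypothetical data (a real emulsion response is nonlinear; a straight line is used here only to keep the example short):

```python
def fit_line(xs, ys):
    # Ordinary least-squares fit of y = a*x + b (the calibration "adjustment").
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def concentration(signal, a, b):
    # Invert the working curve to read off an unknown concentration.
    return (signal - b) / a

# Hypothetical working-curve data: intensity reading vs. known concentration.
conc = [1.0, 2.0, 4.0, 8.0]
signal = [0.3, 0.6, 1.2, 2.4]
a, b = fit_line(conc, signal)
```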

  5. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
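    Two of the quantitative criteria discussed, the statistical distance of the attacker's distribution on K from uniform and her maximum probability of identifying the whole key, can be computed directly from that distribution. A minimal sketch of the standard definitions (not tied to any particular protocol or to the paper's notation; the example distributions are hypothetical):

```python
def distance_from_uniform(p):
    # Statistical (total-variation) distance between the attacker's
    # distribution p over the key space and the uniform distribution.
    n = len(p)
    return 0.5 * sum(abs(x - 1.0 / n) for x in p)

def max_guess_probability(p):
    # Probability that the attacker's single best guess identifies all of K.
    return max(p)

uniform = [0.25, 0.25, 0.25, 0.25]  # ideal key distribution
skewed = [0.85, 0.05, 0.05, 0.05]   # hypothetical leaky distribution
```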

  6. Quantitative image fusion in infrared radiometry

    Science.gov (United States)

    Romm, Iliya; Cukurel, Beni

    2018-05-01

    Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
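    At a single pixel, the 'fuse-then-subtract' idea reduces to a weighted least-squares fit of the linear model signal ≈ photoquantity × exposure time, done separately for the dark and scene frames. A toy per-pixel sketch (the uniform weights and exposure values are hypothetical; the paper's actual weighting and bias-frame reconstruction are more involved):

```python
def fuse_pixel(signals, exposures, weights):
    # Weighted least-squares estimate of the photoquantity q under the model
    # signal_i = q * exposure_i; a zero weight masks a saturated frame.
    num = sum(w * s * t for w, s, t in zip(weights, signals, exposures))
    den = sum(w * t * t for w, t in zip(weights, exposures))
    return num / den

def fuse_then_subtract(scene, dark, exposures, weights):
    # Fuse scene and dark frames independently, then subtract in the
    # photoquantity domain.
    return (fuse_pixel(scene, exposures, weights)
            - fuse_pixel(dark, exposures, weights))
```

With exposures [1, 2, 4], scene signals [12, 24, 48] and dark signals [2, 4, 8], the fused estimate is 12 - 2 = 10.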

  7. QUANTITATIVE INDICATORS OF THE SECURITIZATION OF ASSETS

    Directory of Open Access Journals (Sweden)

    Denis VOSTRICOV

    2018-02-01

    Full Text Available Securitization is instrumental in increasing return on capital through the removal of lending activities from the balance sheet, accompanied by off-balance-sheet income flows from fees, which are less capital-intensive. The purpose of this paper is to analyze the quantitative indicators characterizing the securitization of assets. For drafting this article, the method of analysis, the method of synthesis, the logical-dialectical method, the normative method, the study of statistical samples and time series of expert evaluations (Standard and Poor's), personal observations, and monographic studies have been used. The main difference between the securitization of assets and traditional ways of financing lies in the achievement of a number of secondary goals when attracting financial resources, which can play a significant role in choosing between the securitization of assets and other types of financing. In particular, it makes it possible to write off the assets from the balance sheet along with the relevant obligations under the securities, and to expand the range of potential investors while reducing credit risk, interest rate risk and liquidity risk, as well as improving the quality of the management of assets, liabilities and risks. All of these secondary effects are achieved by isolating the selected assets from the total credit risk of the enterprise raising funds, which forms the pivotal relevance and significance of asset securitization. The article contains demonstrations of quantitative and qualitative indicators characterizing the securitization of assets.

  8. Quantitating cellular immune responses to cancer vaccines.

    Science.gov (United States)

    Lyerly, H Kim

    2003-06-01

    While the future of immunotherapy in the treatment of cancer is promising, it is difficult to compare the various approaches because monitoring assays have not been standardized in approach or technique. Common assays for measuring the immune response need to be established so that these assays can one day serve as surrogate markers for clinical response. Assays that accurately detect and quantitate T-cell-mediated, antigen-specific immune responses are particularly desired. However, to date, increases in the number of cytotoxic T cells through immunization have not been correlated with clinical tumor regression. Ideally, then, a T-cell assay not only needs to be sensitive, specific, reliable, reproducible, simple, and quick to perform, it must also demonstrate close correlation with clinical outcome. Assays currently used to measure T-cell response are delayed-type hypersensitivity testing, flow cytometry using peptide major histocompatibility complex tetramers, lymphoproliferation assay, enzyme-linked immunosorbent assay, enzyme-linked immunospot assay, cytokine flow cytometry, direct cytotoxicity assay, measurement of cytokine mRNA by quantitative reverse transcriptase polymerase chain reaction, and limiting dilution analysis. The purpose of this review is to describe the attributes of each test and compare their advantages and disadvantages.

  9. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, the software used in the nuclear safety field has been ensured through development, validation, safety analysis, and quality assurance activities throughout the entire process life cycle from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve the quality of software. However, there are limits to how far quality can be improved in this way. Therefore, efforts continue to calculate the reliability of software quantitatively instead of evaluating it qualitatively. In this paper, we propose a quantitative calculation method for the software to be used for a specific operation of the digital controller in an NPP. After injecting random faults in the internal space of a developed controller and calculating the ability to detect the injected faults using diagnostic software, we can evaluate the software reliability of a digital controller in an NPP. We tried to calculate the software reliability of the controller in an NPP using a new method that differs from the traditional one: it calculates the fault detection coverage after injecting faults into the software memory space, rather than assessing activities throughout the life cycle process. We attempt differentiation by creating a new definition of the fault, imitating software faults using hardware, and assigning considerations and weights to the injected faults.
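    The core quantity of the proposed method, fault detection coverage, is the fraction of injected faults that the diagnostic software flags. A toy sketch that injects every possible single-bit flip into a small memory image guarded by an additive checksum (the memory layout, detector, and monitored region are all illustrative assumptions, not the paper's implementation):

```python
def detection_coverage(memory, monitored):
    # Exhaustively inject single-bit flips into each byte of `memory` and
    # report the fraction detected by a checksum over `monitored` indices.
    reference = sum(memory[i] for i in monitored) & 0xFF
    injected = detected = 0
    for i in range(len(memory)):
        for bit in range(8):
            faulty = list(memory)
            faulty[i] ^= 1 << bit          # single-bit fault injection
            injected += 1
            if sum(faulty[j] for j in monitored) & 0xFF != reference:
                detected += 1
    return detected / injected

# 8-byte memory image; the diagnostic only monitors the first 6 bytes,
# so coverage is 6/8 = 0.75.
coverage = detection_coverage(list(range(8)), list(range(6)))
```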

  10. Quantitative stratification of diffuse parenchymal lung diseases.

    Directory of Open Access Journals (Sweden)

    Sushravya Raghunath

    Full Text Available Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients.

  11. Quantitative Stratification of Diffuse Parenchymal Lung Diseases

    Science.gov (United States)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.

    2014-01-01

    Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019

  12. The Quantitative Nature of Autistic Social Impairment

    Science.gov (United States)

    Constantino, John N.

    2011-01-01

    Autism, like intellectual disability, represents the severe end of a continuous distribution of developmental impairments that occur in nature, that are highly inherited, and that are orthogonally related to other parameters of development. A paradigm shift in understanding the core social abnormality of autism as a quantitative trait rather than as a categorically-defined condition has key implications for diagnostic classification, the measurement of change over time, the search for underlying genetic and neurobiologic mechanisms, and public health efforts to identify and support affected children. Here a recent body of research in genetics and epidemiology is presented to examine a dimensional reconceptualization of autistic social impairment—as manifested in clinical autistic syndromes, the broader autism phenotype, and normal variation in the general population. It illustrates how traditional categorical approaches to diagnosis may lead to misclassification of subjects (especially girls and mildly affected boys in multiple-incidence autism families), which can be particularly damaging to biological studies, and proposes continued efforts to derive a standardized quantitative system by which to characterize this family of conditions. PMID:21289537

  13. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on the practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N/sub 2/O/sub 4/ as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, the quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be at their optimal settings to eliminate errors, particularly when computation is done by machine. A higher sweep velocity is preferable in view of the S/N ratio, but it should be limited to about 30 Hz/s. The relative error in the measurement of area is generally 1%, but for dilute samples whose signals are integrated, the error becomes smaller by about one order of magnitude. If impurities are treated carefully, the water content in N/sub 2/O/sub 4/ can be determined with an accuracy of about 0.002%. Comparison of peak heights is as accurate as comparison of areas when the magnetic field homogeneity and T/sub 2/ are not in question. When the chemical shift varies with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T/sub 1/ with 90 deg pulses.
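    The area-comparison determinations reviewed above rest on the basic quantitative-NMR relation: peak area is proportional to concentration times the number of contributing nuclei, so areas normalized per nucleus give concentrations relative to an internal standard. A sketch of that standard relation, with hypothetical values:

```python
def qnmr_concentration(area_x, n_x, area_std, n_std, conc_std):
    # Internal-standard qNMR: c_x = c_std * (A_x / n_x) / (A_std / n_std),
    # where n is the number of nuclei contributing to each signal.
    return conc_std * (area_x / n_x) / (area_std / n_std)

# Hypothetical example: a CH3 signal (3 protons) against a CH2 standard
# (2 protons) at 10 mM.
c = qnmr_concentration(area_x=30.0, n_x=3, area_std=20.0, n_std=2, conc_std=10.0)
```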

  14. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions, nanogram quantities of antigen can be detected, or antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography. Antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen. Reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss the possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  15. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in a large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  16. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to derive formulas for calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
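    As a crisp baseline for the connection models named above, stage reliabilities combine multiplicatively in series and redundantly in parallel; the paper's contribution is to replace point values like those below with expert-graded fuzzy numbers. A sketch of the classical (non-fuzzy) formulas only, with hypothetical stage values:

```python
def series_reliability(stages):
    # A series-connected process succeeds only if every stage succeeds.
    r = 1.0
    for s in stages:
        r *= s
    return r

def parallel_reliability(stages):
    # A redundant (parallel) connection fails only if every stage fails.
    q = 1.0
    for s in stages:
        q *= 1.0 - s
    return 1.0 - q
```

Two 0.9-reliable stages give 0.81 in series but 0.99 in parallel, which is why redundancy is placed around the weakest processes.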

  17. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method, the expense in apparatus, personnel and financial resources, and the comparability of results. Nuclide absorptiometry, and in the future perhaps computed tomography, represent the most accurate methods for determining the mineral content of bones. Their application is the clinics' prerogative because of the costs. Morphometry provides quantitative information, in particular in the monitoring of disease course, and enables an objective judgement of visual findings. It requires little expenditure and should be combined with microradioscopy. Direct comparability of the findings of different working groups is easiest in morphometry; it depends on the equipment in computed tomography and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for the determination of the mineral salt concentration in bones. Instead, there is rather a trend towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnosis. (orig.) [de

  18. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan

  19. Quantitative fluorescence angiography for neurosurgical interventions.

    Science.gov (United States)

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--as an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
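    From a recorded signal-intensity curve, simple perfusion descriptors such as time-to-peak and the steepest inflow slope can be extracted. An illustrative sketch (these simplified definitions and the sample curve are assumptions for the example, not the software tool's actual parameter set):

```python
def perfusion_parameters(times, intensities):
    # Time-to-peak and maximum inflow slope of a fluorescence intensity curve.
    peak = max(range(len(intensities)), key=intensities.__getitem__)
    time_to_peak = times[peak] - times[0]
    slopes = [(intensities[i + 1] - intensities[i]) / (times[i + 1] - times[i])
              for i in range(len(times) - 1)]
    return time_to_peak, max(slopes)

# Hypothetical indocyanine green curve sampled once per second.
ttp, max_slope = perfusion_parameters([0, 1, 2, 3, 4], [0, 10, 40, 60, 55])
```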

  20. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    A study of the range of validity of the formulae for corrections used with massive specimen analysis is made. The method used is original; we have shown that it was possible to use a property of invariability of corrected intensity ratios for standards. This invariance property provides a test for the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined. The correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relative to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in applications of electron probe instruments related to the use of computers and the present development of fully automated instruments are reviewed. The statistics necessary for measurements of X-ray count data are studied. Estimation procedures and tests are developed. These methods are used to perform a statistical check of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived. Formulae were also obtained to optimize the counting time in order to obtain the best precision in a minimum amount of time [fr
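    The counting statistics referred to above are Poisson: the standard deviation of N accumulated counts is sqrt(N), so relative precision improves as 1/sqrt(N). A sketch of the corresponding estimates (using the normal approximation for the interval; not the paper's own estimator):

```python
import math

def relative_error(n_counts):
    # Poisson statistics: sigma = sqrt(N), so relative error is 1/sqrt(N).
    return 1.0 / math.sqrt(n_counts)

def count_confidence_interval(n_counts, z=1.96):
    # Approximate 95% confidence interval on the true mean count.
    sigma = math.sqrt(n_counts)
    return n_counts - z * sigma, n_counts + z * sigma
```

Accumulating 10,000 counts gives a 1% relative error; quadrupling the counting time halves it, which is the trade-off the counting-time optimization exploits.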

  1. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to its complementary methods such as X-ray or synchrotron diffraction, but it is accepted as highly reliable only under limited conditions or for certain samples. Neutron diffraction has strong capability especially for oxides due to the scattering cross-section of oxygen, and it can become an even stronger tool for the analysis of industrial materials with these quantitative phase analysis techniques. By doing this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability to analyze neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
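    Rietveld-based neutron QPA of this kind conventionally converts refined scale factors into weight fractions via the Hill-Howard relation; a sketch of that standard formula with hypothetical two-phase values (the abstract does not state which refinement scheme was used):

```python
def weight_fractions(scales, zmv):
    # Hill-Howard relation: w_i = S_i*(ZMV)_i / sum_j S_j*(ZMV)_j, where S is
    # the Rietveld scale factor, Z the formula units per cell, M the formula
    # mass and V the unit-cell volume.
    prods = [s * x for s, x in zip(scales, zmv)]
    total = sum(prods)
    return [p / total for p in prods]

# Hypothetical two-phase mixture: equal S*(ZMV) products give 50/50 by weight.
w = weight_fractions([1.0, 2.0], [100.0, 50.0])
```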

  2. Quantitative evaluations of male pattern baldness.

    Science.gov (United States)

    Tsuji, Y; Ishino, A; Hanzawa, N; Uzuka, M; Okazaki, K; Adachi, K; Imamura, S

    1994-07-01

    Several methods for the evaluation of hair growth have been reported; however, none of the hitherto reported methods are satisfactory as unbiased double blind studies to evaluate the efficacy of hair growth agents. In the present paper, we describe quantitative evaluation methods for hair growth by measuring the anagen ratio and hair diameters in 56 Japanese subjects aged 23-56 for 3 years. The average anagen ratio decreased by 3.8% in 3 years. The average hair diameters showed a statistically significant decrease each year totalling 3.4 microns. Subjects were sorted according to their anagen ratio into 4 groups. Each group showed different distribution patterns of hair diameters. The higher anagen ratio group has a high frequency peak at thicker hair diameters and the lower anagen ratio group has a high frequency peak at thinner hair diameters. The number of thicker hairs decreased and the high frequency peak shifted to thinner hair diameters in 3 years. These methods are useful to evaluate both the progression of male pattern baldness and the effects of hair growth agents with double blind studies in an unbiased quantitative fashion.

  3. Technological innovation in neurosurgery: a quantitative study.

    Science.gov (United States)

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

    Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for articles published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.

  4. Quantitative assessment of growth plate activity

    International Nuclear Information System (INIS)

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.
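
    A minimal sketch of the ratio-based profile described above, with hypothetical region names and counts (the authors' ten-region layout is not reproduced):

```python
# Sketch of a "growth plate profile": compare segmental physeal counts to
# total plate activity, to the contralateral joint, and to adjacent bone.
# Region names and counts are illustrative only.

def plate_profile(segments, contralateral, adjacent_bone):
    total = sum(segments.values())
    total_c = sum(contralateral.values())
    profile = {}
    for region, counts in segments.items():
        profile[region] = {
            "fraction_of_plate": counts / total,
            "vs_contralateral": (counts / total) / (contralateral[region] / total_c),
            "vs_adjacent_bone": counts / adjacent_bone,
        }
    return profile

abnormal = {"medial": 1200, "central": 1500, "lateral": 600}   # counts
normal   = {"medial": 1100, "central": 1400, "lateral": 1150}
prof = plate_profile(abnormal, normal, adjacent_bone=400)
# A segmental ratio well below 1 vs the contralateral side flags focal
# loss of physeal activity even when total plate uptake looks normal.
print(f"lateral vs contralateral: {prof['lateral']['vs_contralateral']:.2f}")
```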

  5. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O' HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
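
    The abstract mentions a formula relating conductivity data to phase volume fractions without giving it; one commonly used mapping for a non-conducting dispersed phase in a conducting liquid is Maxwell's relation, sketched below (the paper's modified formula is not reproduced here):

```python
# One standard conductivity-to-volume-fraction mapping for a non-conducting
# dispersed phase (gas or solid) in a conducting liquid is Maxwell's
# relation: sigma_mix / sigma_liquid = 2(1 - phi) / (2 + phi).

def maxwell_volume_fraction(sigma_mix, sigma_liquid):
    """Invert Maxwell's relation to get the dispersed-phase fraction phi."""
    s = sigma_mix / sigma_liquid
    return 2.0 * (1.0 - s) / (2.0 + s)

# A 10% drop in mixture conductivity implies roughly a 7% dispersed phase.
phi = maxwell_volume_fraction(sigma_mix=0.9, sigma_liquid=1.0)
print(f"volume fraction ~ {phi:.3f}")
```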

  6. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

    Full Text Available The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to incur low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without need for recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small stressed plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  7. Allometric Trajectories and "Stress": A Quantitative Approach.

    Science.gov (United States)

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to incur low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.
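
    The idea of quantifying distance from a common scaling relationship can be sketched as a log-log regression whose residuals score each individual; the sapwood and leaf areas below are synthetic, for illustration only:

```python
# Minimal sketch of "distance from an optimal scaling relationship":
# fit a log-log allometry (leaf area vs sapwood area) across a population,
# then score each individual by its residual. Synthetic data.
import math

sapwood = [1.0, 2.0, 4.0, 8.0, 16.0]          # cm^2, hypothetical
leaf    = [50.0, 98.0, 205.0, 410.0, 500.0]   # cm^2; last plant leaf-deficient

lx = [math.log(x) for x in sapwood]
ly = [math.log(y) for y in leaf]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
slope = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / sum((x - mx) ** 2 for x in lx)
intercept = my - slope * mx

# Residual = vertical distance from the common scaling line in log space;
# a strongly negative residual marks an individual with "too little" leaf
# area for its sapwood (e.g. after herbivory), without invoking "stress".
residuals = [y - (intercept + slope * x) for x, y in zip(lx, ly)]
worst = min(residuals)
print(f"slope ~ {slope:.2f}, largest leaf deficit (log units): {worst:.2f}")
```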

  8. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  9. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstracts Service Registry Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
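
    The paper's probabilistic model is not reproduced here, but the rank-based idea can be illustrated with a simplified deterministic stand-in: assign non-increasing weight fractions to an ordered ingredient list, constrained so that they and an assumed unreported remainder sum to 1:

```python
# Simplified stand-in for the rank-based idea (not the paper's actual
# probabilistic model): give an ordered ingredient list non-increasing
# weight fractions that, together with an assumed total for unreported
# ingredients, sum to 1. The geometric decay rate is an assumption.

def rank_weight_fractions(n_ingredients, unreported_total=0.05, decay=0.6):
    """Geometric-decay guess: w_i proportional to decay**i, renormalized."""
    raw = [decay ** i for i in range(n_ingredients)]
    scale = (1.0 - unreported_total) / sum(raw)
    return [w * scale for w in raw]

w = rank_weight_fractions(5)
print([round(x, 3) for x in w], "sum =", round(sum(w) + 0.05, 3))
```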

  10. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    Science.gov (United States)

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  11. Quantitative Literacy Courses as a Space for Fusing Literacies

    Science.gov (United States)

    Tunstall, Samuel Luke; Matz, Rebecca L.; Craig, Jeffrey C.

    2016-01-01

    In this article, we examine how students in a general education quantitative literacy course reason with public issues when unprompted to use quantitative reasoning. Michigan State University, like many institutions, not only has a quantitative literacy requirement for all undergraduates but also offers two courses specifically for meeting the…

  12. Whole-genome shotgun optical mapping of rhodospirillumrubrum

    Energy Technology Data Exchange (ETDEWEB)

    Reslewic, Susan; Zhou, Shiguo; Place, Mike; Zhang, Yaoping; Briska, Adam; Goldstein, Steve; Churas, Chris; Runnheim, Rod; Forrest,Dan; Lim, Alex; Lapidus, Alla; Han, Cliff S.; Roberts, Gary P.; Schwartz,David C.

    2004-07-01

    Rhodospirillum rubrum is a phototrophic purple non-sulfur bacterium known for its unique and well-studied nitrogen fixation and carbon monoxide oxidation systems, and as a source of hydrogen and biodegradable plastics production. To better understand this organism and to facilitate assembly of its sequence, three whole-genome restriction maps (Xba I, Nhe I, and Hind III) of R. rubrum strain ATCC 11170 were created by optical mapping. Optical mapping is a system for creating whole-genome ordered restriction maps from randomly sheared genomic DNA molecules extracted directly from cells. During the sequence finishing process, all three optical maps confirmed a putative error in sequence assembly, while the Hind III map acted as a scaffold for high resolution alignment with sequence contigs spanning the whole genome. In addition to highlighting optical mapping's role in the assembly and validation of genome sequence, our work underscores the unique niche in resolution occupied by the optical mapping system. With a resolution ranging from 6.5 kb (previously published) to 45 kb (reported here), optical mapping advances a "molecular cytogenetics" approach to solving problems in genomic analysis.

  13. Whole-genome shotgun optical mapping of Rhodospirillum rubrum

    Energy Technology Data Exchange (ETDEWEB)

    Reslewic, S. [Univ. Wisc.-Madison; Zhou, S. [Univ. Wisc.-Madison; Place, M. [Univ. Wisc.-Madison; Zhang, Y. [Univ. Wisc.-Madison; Briska, A. [Univ. Wisc.-Madison; Goldstein, S. [Univ. Wisc.-Madison; Churas, C. [Univ. Wisc.-Madison; Runnheim, R. [Univ. Wisc.-Madison; Forrest, D. [Univ. Wisc.-Madison; Lim, A. [Univ. Wisc.-Madison; Lapidus, A. [Univ. Wisc.-Madison; Han, C. S. [Univ. Wisc.-Madison; Roberts, G. P. [Univ. Wisc.-Madison; Schwartz, D. C. [Univ. Wisc.-Madison

    2005-09-01

    Rhodospirillum rubrum is a phototrophic purple nonsulfur bacterium known for its unique and well-studied nitrogen fixation and carbon monoxide oxidation systems and as a source of hydrogen and biodegradable plastic production. To better understand this organism and to facilitate assembly of its sequence, three whole-genome restriction endonuclease maps (XbaI, NheI, and HindIII) of R. rubrum strain ATCC 11170 were created by optical mapping. Optical mapping is a system for creating whole-genome ordered restriction endonuclease maps from randomly sheared genomic DNA molecules extracted from cells. During the sequence finishing process, all three optical maps confirmed a putative error in sequence assembly, while the HindIII map acted as a scaffold for high-resolution alignment with sequence contigs spanning the whole genome. In addition to highlighting optical mapping's role in the assembly and confirmation of genome sequence, this work underscores the unique niche in resolution occupied by the optical mapping system. With a resolution ranging from 6.5 kb (previously published) to 45 kb (reported here), optical mapping advances a "molecular cytogenetics" approach to solving problems in genomic analysis.

  14. Videodensitometric quantitative angiography after coronary balloon angioplasty, compared to edge-detection quantitative angiography and intracoronary ultrasound imaging

    NARCIS (Netherlands)

    Peters, R. J.; Kok, W. E.; Pasterkamp, G.; von Birgelen, C.; Prins, M. [=Martin H.; Serruys, P. W.

    2000-01-01

    AIMS: To assess the value of videodensitometric quantification of the coronary lumen after angioplasty by comparison to two other techniques of coronary artery lumen quantification. METHODS AND RESULTS: Videodensitometric quantitative angiography, edge detection quantitative angiography and 30 MHz

  15. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?
We hope that our guide to good practices for conducting and presenting bias analyses will encourage
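
    As one concrete instance of the practice described, a simple bias analysis corrects an observed 2x2 table for nondifferential exposure misclassification under assumed sensitivity and specificity; all counts and bias parameters below are illustrative:

```python
# Minimal quantitative bias analysis: back-correct a 2x2 exposure-disease
# table for nondifferential exposure misclassification, given assumed
# sensitivity (se) and specificity (sp) of exposure classification.
# Counts and bias parameters are illustrative, not from any study.

def correct_misclassification(a, b, se, sp):
    """Return true exposed/unexposed counts for one disease stratum.

    a, b: observed exposed / unexposed counts. Uses the identity
    observed_exposed = se*A + (1 - sp)*(n - A), solved for A.
    """
    n = a + b
    true_exposed = (a - n * (1.0 - sp)) / (se + sp - 1.0)
    return true_exposed, n - true_exposed

a1, b1 = correct_misclassification(45, 94, se=0.85, sp=0.95)    # cases
a0, b0 = correct_misclassification(257, 945, se=0.85, sp=0.95)  # controls
or_observed = (45 * 945) / (94 * 257)
or_corrected = (a1 * b0) / (b1 * a0)
print(f"observed OR = {or_observed:.2f}, bias-corrected OR = {or_corrected:.2f}")
```

    With these assumed bias parameters the corrected odds ratio moves away from the null, illustrating how nondifferential misclassification typically biases an observed association towards 1.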

  16. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which have supported rapid advances in biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, covering both relative and absolute quantification, and then give our opinions on the outlook of proteome quantification methods.
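
    A small illustration of relative quantification with isobaric labels (e.g. iTRAQ 4-plex): reporter-ion channels are normalized by their column totals to correct loading differences, then expressed as ratios to a reference channel. All intensities are invented:

```python
# Illustration of relative quantification from isobaric reporter-ion
# intensities (iTRAQ/TMT style): normalize each channel by its total to
# correct loading differences, then report ratios to a reference channel.
# Intensities are invented, for illustration only.

channels = ["114", "115", "116", "117"]          # iTRAQ 4-plex reporters
psms = [                                          # reporter intensities per PSM
    [1000.0, 1100.0, 2100.0, 950.0],
    [400.0, 380.0, 820.0, 410.0],
    [2500.0, 2600.0, 5200.0, 2450.0],
]

totals = [sum(psm[i] for psm in psms) for i in range(len(channels))]
median_total = sorted(totals)[len(totals) // 2]
norm = [[v * median_total / totals[i] for i, v in enumerate(psm)] for psm in psms]

# Protein-level ratio vs channel 114 = mean of PSM-level ratios.
for i, ch in enumerate(channels):
    ratios = [psm[i] / psm[0] for psm in norm]
    print(f"{ch}/114 = {sum(ratios) / len(ratios):.2f}")
```

    Note that total-intensity normalization assumes most proteins are unchanged between channels; a channel that is globally doubled for biological reasons would be flattened by this step.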

  17. Real time quantitative amplification detection on a microarray: towards high multiplex quantitative PCR.

    NARCIS (Netherlands)

    Pierik, A.; Boamfa, M.; van Zelst, M.; Clout, D.; Stapert, H.; Dijksman, Johan Frederik; Broer, D.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  18. Real time quantitative amplification detection on a microarray : towards high multiplex quantitative PCR

    NARCIS (Netherlands)

    Pierik, Anke; Boamfa, M.; Zelst, van M.; Clout, D.; Stapert, H.R.; Dijksman, J.F.; Broer, D.J.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  19. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  20. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography methods that we developed, using wavepath correction algorithms and compensation procedures, is presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.

  1. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film; Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with offset values greater than 1 mm were identified. In addition, when the method developed here was compared with the one previously studied, it was observed that the data obtained are very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
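
    The quantitative core of such a method reduces to measuring, on each portal image, the distance between the radiation field centre and the mechanical (ball-bearing) isocentre and flagging deviations above a 1 mm tolerance; the coordinates below are illustrative:

```python
# Sketch of the quantitative step in a Winston-Lutz analysis: the deviation
# is the 2D distance between the radiation field centre and the
# ball-bearing (mechanical isocentre) centre on a portal image, flagged
# when it exceeds a 1 mm tolerance. Coordinates (mm) are illustrative.
import math

def wl_deviation(field_centre, ball_centre, tolerance_mm=1.0):
    dx = field_centre[0] - ball_centre[0]
    dy = field_centre[1] - ball_centre[1]
    d = math.hypot(dx, dy)
    return d, d > tolerance_mm

images = {  # gantry/collimator combination -> (field centre, ball centre)
    "G0_C0":  ((100.2, 99.8), (100.0, 100.0)),
    "G90_C0": ((101.1, 100.6), (100.0, 100.0)),
}
for name, (fc, bc) in images.items():
    d, fail = wl_deviation(fc, bc)
    print(f"{name}: deviation {d:.2f} mm {'FAIL' if fail else 'ok'}")
```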

  2. Quantitative recurrence for free semigroup actions

    Science.gov (United States)

    Carvalho, Maria; Rodrigues, Fagner B.; Varandas, Paulo

    2018-03-01

    We consider finitely generated free semigroup actions on a compact metric space and obtain quantitative information on Poincaré recurrence, average first return time and hitting frequency for the random orbits induced by the semigroup action. Besides, we relate the recurrence to balls with the rates of expansion of the semigroup generators and the topological entropy of the semigroup action. Finally, we establish a partial variational principle and prove an ergodic optimization for this kind of dynamical action. MC has been financially supported by CMUP (UID/MAT/00144/2013), which is funded by FCT (Portugal) with national (MEC) and European structural funds (FEDER) under the partnership agreement PT2020. FR and PV were partially supported by BREUDS. PV has also benefited from a fellowship awarded by CNPq-Brazil and is grateful to the Faculty of Sciences of the University of Porto for the excellent research conditions.

  3. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful in quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily using a relatively simple diagram to deduce prognosis.

  4. Quantitative radiation monitors for containment and surveillance

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1983-01-01

    Quantitative radiation monitors make it possible to differentiate between shielded and unshielded nuclear materials. The hardness of the gamma-ray spectrum is the attribute that characterizes bare or shielded material. Separate high- and low-energy gamma-ray regions are obtained from a single-channel analyzer through its window and discriminator outputs. The monitor counts both outputs and computes a ratio of the high- and low-energy region counts whenever an alarm occurs. The ratio clearly differentiates between shielded and unshielded nuclear material so that the net alarm count may be identified with a small quantity of unshielded material or a large quantity of shielded material. Knowledge of the diverted quantity helps determine whether an inventory should be called to identify the loss
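
    The two-region ratio logic described above can be sketched as follows; the hardness threshold is illustrative, as a real monitor's value would be calibrated against bare-source measurements:

```python
# Sketch of the two-region ratio logic: the single-channel analyzer's
# window (low-energy) and discriminator (high-energy) outputs are counted,
# and a hardness ratio is computed on alarm. Shielding preferentially
# absorbs low-energy gamma rays, so a harder spectrum (larger ratio)
# points to shielded material. The threshold below is illustrative.

def classify_alarm(high_counts, low_counts, hardness_threshold=0.8):
    ratio = high_counts / low_counts
    label = "shielded" if ratio > hardness_threshold else "unshielded"
    return ratio, label

ratio, label = classify_alarm(high_counts=450, low_counts=1500)
print(f"hardness ratio {ratio:.2f} -> {label}")
ratio2, label2 = classify_alarm(high_counts=900, low_counts=700)
print(f"hardness ratio {ratio2:.2f} -> {label2}")
```

    Combined with the net alarm count, the label distinguishes a small quantity of unshielded material from a large quantity of shielded material, as the abstract describes.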

  5. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  6. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  7. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  8. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  9. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  10. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation

  11. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced main control rooms (MCRs), various automation systems are applied to enhance human performance and reduce human errors in industrial fields. Automation is expected to provide greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it has become necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation will be achieved by calculating the failure probability of human performance related to the cognitive activities.

  12. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which reflects the effects of network structure, traffic demand, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the proposed method portrays the real operating situation of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of demand and network structure in the transportation network.
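    A minimal sketch of a demand-weighted efficiency measure and of component importance taken as the relative efficiency drop when a node or link is degraded. This is an assumed, simplified formulation for illustration; the paper's actual redefinition of network efficiency differs in detail.

```python
# Hedged sketch: network efficiency as demand-weighted reciprocal travel
# cost over origin-destination pairs, and an efficiency-oriented
# importance measure. OD data are illustrative.

def network_efficiency(od_pairs):
    """od_pairs: list of (demand, travel_cost); returns sum(q/c) / sum(q)."""
    total_q = sum(q for q, _ in od_pairs)
    return sum(q / c for q, c in od_pairs) / total_q

def component_importance(od_pairs, od_pairs_degraded):
    """Relative efficiency drop when a component is removed/degraded."""
    e0 = network_efficiency(od_pairs)
    e1 = network_efficiency(od_pairs_degraded)
    return (e0 - e1) / e0

# Toy example: degrading one link raises the travel cost of one OD pair.
base = [(100, 10), (50, 20)]
degraded = [(100, 15), (50, 20)]
print(component_importance(base, degraded))  # ≈ 0.27
```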

  13. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    Science.gov (United States)

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as an application-driven stress, because the tilt movement is a natural environment for devices used in automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10^-7 h^-1.

  14. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.
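    The likelihood-ratio idea behind functional mapping can be sketched by comparing one shared trajectory (no QTL) against genotype-specific trajectories. The sketch below reduces the smoothing step to per-time-point means; the paper itself uses B-splines within a mixed model. All data and names are illustrative.

```python
# Toy sketch of the likelihood-ratio test behind functional QTL mapping,
# with per-time-point means standing in for the B-spline fit.
import math

def rss(groups):
    """Residual sum of squares around each group's per-time-point mean."""
    total = 0.0
    for trajs in groups:                       # trajs: list of trajectories
        for t in range(len(trajs[0])):
            vals = [y[t] for y in trajs]
            m = sum(vals) / len(vals)
            total += sum((v - m) ** 2 for v in vals)
    return total

def lrt_statistic(group_a, group_b):
    """2*log LR for Gaussian errors reduces to n * log(RSS_null / RSS_alt)."""
    n = sum(len(y) for y in group_a + group_b)   # total observations
    rss_null = rss([group_a + group_b])          # one shared trajectory
    rss_alt = rss([group_a, group_b])            # genotype-specific curves
    return n * math.log(rss_null / rss_alt)

# Simulated trajectories for two marker genotype classes:
ga = [[1.0, 2.1, 3.0], [0.9, 1.9, 3.1]]
gb = [[2.0, 3.1, 4.0], [2.1, 2.9, 3.9]]
print(lrt_statistic(ga, gb))  # large and positive → evidence for a QTL
```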

  15. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and an optimal concentration of antibody, to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37°C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA-containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  16. Quantitative MFM on superconducting thin films

    Energy Technology Data Exchange (ETDEWEB)

    Stopfel, Henry; Vock, Silvia; Shapoval, Tetyana; Neu, Volker; Wolff, Ulrike; Haindl, Silvia; Engelmann, Jan; Schaefer, Rudolf; Holzapfel, Bernhard; Schultz, Ludwig [IFW Dresden, Institute for Metallic Material (Germany); Inosov, Dmytro S. [Max Planck Institute for Solid State Research, Stuttgart (Germany)

    2012-07-01

    Quantitative interpretation of magnetic force microscopy (MFM) data is a challenge, because the measured signal is a convolution between the magnetization of the tip and the stray field emanated by the sample. It was established theoretically that the field distribution just above the surface of the superconductor can be well approximated by the stray field of a magnetic monopole. The description of the MFM tip, however, needs a second approximation. The temperature-dependent vortex-distribution images on a NbN thin film were fitted using two different tip models. Firstly, the magnetic tip was assumed to be a monopole that leads to the simple monopole-monopole model for the tip-sample interaction force. Performing a 2D fitting of the data with this model, we extracted λ, Δ and the vortex pinning force. Secondly, a geometrical model was applied to calculate the tip-transfer-function of the MFM tip using the numerical BEM method.
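    The monopole-monopole approximation described above can be sketched as a least-squares fit of F(z) = A/(z + z0)^2 to a force-distance curve. The grid-search fitting procedure and parameter names below are illustrative assumptions, not the authors' 2D fitting code.

```python
# Hedged sketch: fitting an MFM tip-sample force curve with a
# monopole-monopole model. A lumps the tip and vortex "charges"; z0 lumps
# the effective monopole position and field decay length (assumed names).

def monopole_force(z, A, z0):
    return A / (z + z0) ** 2

def fit_monopole(zs, forces, A_grid, z0_grid):
    """Brute-force least-squares fit over a parameter grid."""
    best = None
    for A in A_grid:
        for z0 in z0_grid:
            sse = sum((monopole_force(z, A, z0) - f) ** 2
                      for z, f in zip(zs, forces))
            if best is None or sse < best[0]:
                best = (sse, A, z0)
    return best[1], best[2]

# Synthetic data generated from known parameters, then recovered:
zs = [0.1 * i for i in range(1, 6)]
forces = [monopole_force(z, 2.0, 0.5) for z in zs]
A, z0 = fit_monopole(zs, forces, [1.5, 2.0, 2.5], [0.4, 0.5, 0.6])
print(A, z0)  # → 2.0 0.5
```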

  17. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  18. Safety culture management and quantitative indicator evaluation

    International Nuclear Information System (INIS)

    Mandula, J.

    2002-01-01

    This report discusses the relationship between safety culture and the evaluation of quantitative indicators. It shows how systematic use of generally shared operational safety indicators may contribute to the formation and reinforcement of safety culture characteristics in routine plant operation. The report also briefly describes the system of operational safety indicators used at the Dukovany plant. It is a PC database application enabling effective work with the indicators and providing all users with an efficient tool for making synoptic overviews of indicator values in their links and hierarchical structure. Using color coding, the system allows quick indicator evaluation against predefined limits, considering indicator value trends. The system, which resulted from several years of development, was fully established at the plant during the years 2001 and 2002. (author)

  19. Geomorphology: now a more quantitative science

    International Nuclear Information System (INIS)

    Lal, D.

    1995-01-01

    Geomorphology, one of the oldest branches of planetary science, is now growing into a quantitative field with the development of a nuclear method capable of providing numeric time controls on a great variety of superficial processes. The method complements the conventional dating methods, e.g. 40 K/ 40 Ar and 87 Rb/ 87 Sr, by providing information on geomorphic processes: e.g., the dwell times of rocks on the earth's surface with strict geometrical constraints, rates of physical and chemical weathering in the past, and the chronology of events associated with glaciation. This article attempts to discuss the new possibilities that now exist for studying a wide range of geomorphic processes, with examples of some specific isotopic changes that allow one to model glacial chronology and the evolutionary histories of alluvial fans and sand dunes. (author). 9 refs., 3 figs., 4 tabs

  20. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
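    A minimal sketch of a risk-adjusted net present value, assuming the yearly benefit of the fire safety system is the expected reduction in fire loss (probability times consequence) discounted over the system lifetime. All figures are invented for illustration and are not from the case study.

```python
# Hedged sketch: risk-adjusted NPV of investing in a fire safety system.
# Inputs (all illustrative): annual fire probability, monetary loss with
# and without the system, lifetime in years, and discount rate.

def risk_adjusted_npv(investment, p_fire, loss_without, loss_with,
                      years, rate):
    annual_benefit = p_fire * (loss_without - loss_with)  # expected saving/yr
    present_value = sum(annual_benefit / (1 + rate) ** t
                        for t in range(1, years + 1))
    return present_value - investment

npv = risk_adjusted_npv(100_000, 0.01, 5_000_000, 1_000_000, 20, 0.05)
print(npv)  # ≈ 398,000 → positive, so worthwhile under these assumptions
```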

  1. Quantitative measurement of the cerebral blood flow

    International Nuclear Information System (INIS)

    Houdart, R.; Mamo, H.; Meric, P.; Seylaz, J.

    1976-01-01

    The value of the cerebral blood flow measurement (CBF) is outlined, its limits are defined and some future prospects discussed. The xenon 133 brain clearance study is at present the most accurate quantitative method to evaluate the CBF in different regions of the brain simultaneously. The method and the progress it has led to in the physiological, physiopathological and therapeutic fields are described. The major disadvantage of the method is shown to be the need to puncture the internal carotid for each measurement. Prospects are discussed concerning methods derived from the same general principle but using a simpler, non-traumatic way to introduce the radio-tracer, either by inhalation into the lungs or intravenously [fr

  2. Quantitative Susceptibility Mapping in Parkinson's Disease.

    Science.gov (United States)

    Langkammer, Christian; Pirpamer, Lukas; Seiler, Stephan; Deistung, Andreas; Schweser, Ferdinand; Franthal, Sebastian; Homayoon, Nina; Katschnig-Winter, Petra; Koegl-Wallner, Mariella; Pendl, Tamara; Stoegerer, Eva Maria; Wenzel, Karoline; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen Rainer; Schmidt, Reinhold; Schwingenschuh, Petra

    2016-01-01

    Quantitative susceptibility mapping (QSM) and R2* relaxation rate mapping have demonstrated increased iron deposition in the substantia nigra of patients with idiopathic Parkinson's disease (PD). However, the findings in other subcortical deep gray matter nuclei are conflicting, and the sensitivity of QSM and R2* for morphological changes and their relation to clinical measures of disease severity have so far been investigated only sparsely. The local ethics committee approved this study and all subjects gave written informed consent. 66 patients with idiopathic Parkinson's disease and 58 control subjects underwent quantitative MRI at 3T. Susceptibility and R2* maps were reconstructed from a spoiled multi-echo 3D gradient echo sequence. Mean susceptibilities and R2* rates were measured in subcortical deep gray matter nuclei and compared between patients with PD and controls as well as related to clinical variables. Compared to control subjects, patients with PD had increased R2* values in the substantia nigra. QSM also showed higher susceptibilities in patients with PD in the substantia nigra, nucleus ruber, thalamus, and globus pallidus. Magnetic susceptibility of several of these structures was correlated with the levodopa-equivalent daily dose (LEDD) and clinical markers of motor and non-motor disease severity (total MDS-UPDRS, MDS-UPDRS-I and II). Disease severity as assessed by the Hoehn & Yahr scale was correlated with magnetic susceptibility in the substantia nigra. The established finding of higher R2* rates in the substantia nigra was extended by QSM showing superior sensitivity for PD-related tissue changes in nigrostriatal dopaminergic pathways. QSM additionally reflected the levodopa dosage and disease severity. These results suggest a more widespread pathologic involvement and QSM as a novel means for its investigation, more sensitive than current MRI techniques.

  3. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr) including 14 with prior myocardial infarction who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel diseases. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  4. Reproducibility of quantitative planar thallium-201 scintigraphy: quantitative criteria for reversibility of myocardial perfusion defects

    International Nuclear Information System (INIS)

    Sigal, S.L.; Soufer, R.; Fetterman, R.C.; Mattera, J.A.; Wackers, F.J.

    1991-01-01

    Fifty-two paired stress/delayed planar 201 Tl studies (27 exercise studies, 25 dipyridamole studies) were processed twice by seven technologists to assess inter- and intraobserver variability. The reproducibility was inversely related to the size of 201 Tl perfusion abnormalities. Intraobserver variability was not different between exercise and dipyridamole studies for lesions of similar size. Based upon intraobserver variability, objective quantitative criteria for reversibility of perfusion abnormalities were defined. These objective criteria were tested prospectively in a separate group of 35 201 Tl studies and compared with the subjective interpretation of quantitative circumferential profiles. Overall, exact agreement existed in 78% of images (kappa statistic k = 0.66). We conclude that quantification of planar 201 Tl scans is highly reproducible, with acceptable inter- and intraobserver variability. Objective criteria for lesion reversibility correlated well with analysis by experienced observers
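    The kappa statistic quoted above measures agreement beyond chance between two readings. A standard two-rater Cohen's kappa can be computed as follows (generic implementation, not the authors' code; category labels are invented):

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).

def cohens_kappa(reads_a, reads_b):
    n = len(reads_a)
    categories = set(reads_a) | set(reads_b)
    p_obs = sum(a == b for a, b in zip(reads_a, reads_b)) / n
    p_exp = sum((reads_a.count(c) / n) * (reads_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Toy example: two readings of four lesions as reversible vs. fixed.
a = ['reversible', 'fixed', 'reversible', 'fixed']
b = ['reversible', 'fixed', 'fixed', 'fixed']
print(cohens_kappa(a, b))  # → 0.5
```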

  5. Evolutionary Quantitative Genomics of Populus trichocarpa.

    Directory of Open Access Journals (Sweden)

    Ilga Porth

    Full Text Available Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show

  6. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  7. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  8. Progress in quantitative GPR development at CNDE

    Energy Technology Data Exchange (ETDEWEB)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott [Center for Nondestructive Evaluation, Iowa State University, 1915 Scholl Road, Ames, IA 50011-3042 (United States)

    2014-02-18

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  9. Quantitative Ultrasound in the assessment of Osteoporosis

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Terlizzi, Francesca de

    2009-01-01

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increasing interest among clinicians, QUS has been applied to several fields of bone investigation, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  10. Quantitation of vitamin K in human milk

    International Nuclear Information System (INIS)

    Canfield, L.M.; Hopkinson, J.M.; Lima, A.F.; Martin, G.S.; Sugimoto, K.; Burr, J.; Clark, L.; McGee, D.L.

    1990-01-01

    A quantitative method was developed for the assay of vitamin K in human colostrum and milk. The procedure combines preparative and analytical chromatography on silica gel in a nitrogen atmosphere followed by reversed phase high performance liquid chromatography (HPLC). Two HPLC steps were used: gradient separation with ultraviolet (UV) detection followed by isocratic separation detected electrochemically. Due to co-migrating impurities, UV detection alone is insufficient for identification of vitamin K. Exogenous vitamin K was shown to equilibrate with endogenous vitamin K in the samples. A statistical method was incorporated to control for experimental variability. Vitamin K1 was analyzed in 16 pooled milk samples from 7 donors and in individual samples from 15 donors at 1 month post-partum. Vitamin K1 was present at 2.94 +/- 1.94 and 3.15 +/- 2.87 ng/mL in pools and in individuals, respectively. Menaquinones, the bacterial form of the vitamin, were not detected. The significance of experimental variation to studies of vitamin K in individuals is discussed

  11. Quantitative theory of driven nonlinear brain dynamics.

    Science.gov (United States)

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Quantitative risk assessment of drinking water contaminants

    International Nuclear Information System (INIS)

    Cothern, C.R.; Coniglio, W.A.; Marcus, W.L.

    1986-01-01

    The development of criteria and standards for the regulation of drinking water contaminants involves a variety of processes, one of which is risk estimation. This estimation process, called quantitative risk assessment, involves combining data on the occurrence of the contaminant in drinking water and its toxicity. The human exposure to a contaminant can be estimated from occurrence data. Usually the toxicity or number of health effects per concentration level is estimated from animal bioassay studies using the multistage model. For comparison, other models will be used including the Weibull, probit, logit and quadratic ones. Because exposure and toxicity data are generally incomplete, assumptions need to be made and this generally results in a wide range of certainty in the estimates. This range can be as wide as four to six orders of magnitude in the case of the volatile organic compounds in drinking water and a factor of four to five for estimation of risk due to radionuclides in drinking water. As examples of the differences encountered in risk assessment of drinking water contaminants, discussions are presented on benzene, lead, radon and alachlor. The lifetime population risk estimates for these contaminants are, respectively, in the ranges of: <1 - 3000, <1 - 8000, 2000-40,000 and <1 - 80. 11 references, 1 figure, 1 table
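    The multistage extrapolation mentioned above can be sketched with a first-order multistage model, P(d) = 1 - exp(-(q0 + q1·d)), and the extra risk over background. The coefficients below are illustrative, not fitted to any bioassay data.

```python
# Hedged sketch: low-dose risk extrapolation with the multistage model.
# q = [q0, q1, ...] are stage coefficients (illustrative values only).
import math

def multistage_response(dose, q):
    """P(response at dose) = 1 - exp(-sum_k q_k * dose**k)."""
    s = sum(qk * dose ** k for k, qk in enumerate(q))
    return 1.0 - math.exp(-s)

def extra_risk(dose, q):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = multistage_response(0.0, q)
    return (multistage_response(dose, q) - p0) / (1.0 - p0)

# At low doses the extra risk is approximately linear in dose (≈ q1*d):
print(extra_risk(1e-4, [0.01, 2.0]))  # ≈ 2.0e-4
```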

  13. Progress in quantitative GPR development at CNDE

    Science.gov (United States)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-02-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  14. Progress in quantitative GPR development at CNDE

    International Nuclear Information System (INIS)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-01-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability

  15. Quantitative phenotyping via deep barcode sequencing.

    Science.gov (United States)

    Smith, Andrew M; Heisler, Lawrence E; Mellor, Joseph; Kaper, Fiona; Thompson, Michael J; Chee, Mark; Roth, Frederick P; Giaever, Guri; Nislow, Corey

    2009-10-01

    Next-generation DNA sequencing technologies have revolutionized diverse genomics applications, including de novo genome sequencing, SNP detection, chromatin immunoprecipitation, and transcriptome analysis. Here we apply deep sequencing to genome-scale fitness profiling to evaluate yeast strain collections in parallel. This method, Barcode analysis by Sequencing, or "Bar-seq," outperforms the current benchmark barcode microarray assay in terms of both dynamic range and throughput. When applied to a complex chemogenomic assay, Bar-seq quantitatively identifies drug targets, with performance superior to the benchmark microarray assay. We also show that Bar-seq is well-suited for a multiplex format. We completely re-sequenced and re-annotated the yeast deletion collection using deep sequencing, found that approximately 20% of the barcodes and common priming sequences varied from expectation, and used this revised list of barcode sequences to improve data quality. Together, this new assay and analysis routine provide a deep-sequencing-based toolkit for identifying gene-environment interactions on a genome-wide scale.
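The core of the Bar-seq readout is counting how often each strain's barcode appears in the pooled sequencing reads and comparing abundances between conditions. A toy sketch of that counting step (the barcode sequences, strain names and pseudocount choice are invented for illustration, not taken from the paper):

```python
import math
from collections import Counter

def count_barcodes(reads, barcode_to_strain):
    """Tally sequencing reads per strain; reads matching no barcode go to None."""
    counts = Counter()
    for read in reads:
        counts[barcode_to_strain.get(read)] += 1
    return counts

def log2_ratio(treated, control, strain, pseudo=1):
    """Relative abundance of a strain after treatment (pseudocounts avoid log 0)."""
    return math.log2((treated[strain] + pseudo) / (control[strain] + pseudo))

# Invented 8-mer barcodes for two hypothetical deletion strains
barcodes = {"ACGTACGT": "yfg1", "TTGACCAA": "yfg2"}
control = count_barcodes(["ACGTACGT"] * 8 + ["TTGACCAA"] * 8, barcodes)
treated = count_barcodes(["ACGTACGT"] * 1 + ["TTGACCAA"] * 8, barcodes)
# yfg1 dropped out under treatment -> negative log2 ratio, i.e. a fitness defect
```

In a real chemogenomic screen the same comparison is made genome-wide, and the re-annotated barcode list mentioned above determines the `barcode_to_strain` mapping.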

  16. Qualitative and quantitative descriptions of glenohumeral motion.

    Science.gov (United States)

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as in predicting the outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain the computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of the intended application is required. Clinical applicability demands both descriptive and predictive output potential, and hence a high level of validation. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation of a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and the helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and the classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.

  17. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and the normalized elastic recovery factor was defined in terms of the measured deflections. Experimentally, the elastic recovery factor has been shown to depend on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through it, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and cost than the conventional method.

  18. Limits of qualitative detection and quantitative determination

    International Nuclear Information System (INIS)

    Curie, L.A.

    1976-01-01

    The fact that one can find a series of disagreeing and mutually limiting definitions of the detection limit motivates a reinvestigation of the problems of signal detection and signal processing in analytical and nuclear chemistry. Three cut-off levels are fixed: Lsub(C), the net signal level (the sensitivity of the equipment) above which an observed signal can be reliably recognized as 'detected'; Lsub(D), the 'true' net signal level at which detection can be expected a priori; and Lsub(Q), the level at which the measuring precision is sufficient for quantitative determination. Exact defining equations as well as a series of working formulae are given for the general analytical case and for radioactivity measurements. Since the measured radioactivity is assumed to follow a Poisson distribution, precise limits can be derived for short-lived and long-lived radionuclides, with and without interference. The fundamentals are illustrated by simple examples from spectrophotometry and radioactivity measurement, and by a more complicated example from activation analysis in which one must choose between alternative nuclear reactions. (orig./LH) [de
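The three levels admit simple working formulas. A sketch using the widely cited forms for paired blank counting (here L_C, L_D, L_Q stand in for Lsub(C), Lsub(D), Lsub(Q); the constants correspond to the conventional error rates alpha = beta = 0.05 and a 10% relative standard deviation at the quantification level, which may differ from the exact equations of this report):

```python
import math

def currie_limits(sigma0, k=1.645, k_q=10.0):
    """Working formulas in the Currie style for paired blank counting.
    sigma0 : standard deviation of the net signal when its true value is zero
    k      : one-sided normal quantile for alpha = beta = 0.05
    k_q    : inverse of the required relative standard deviation at L_Q (10%)
    Returns (L_C, L_D, L_Q) in net-count units."""
    L_C = k * sigma0                        # decision level: "reliably detected"
    L_D = k ** 2 + 2.0 * L_C                # level detectable a priori (~2.71 + 2 L_C)
    # Solve sigma^2(L_Q) = sigma0^2 + L_Q together with sigma(L_Q) = L_Q / k_q:
    L_Q = (k_q ** 2 / 2.0) * (1.0 + math.sqrt(1.0 + 4.0 * sigma0 ** 2 / k_q ** 2))
    return L_C, L_D, L_Q

# Example: mean blank count mu_B = 100 with paired blanks -> sigma0 = sqrt(2 * mu_B)
sigma0 = math.sqrt(2 * 100)
L_C, L_D, L_Q = currie_limits(sigma0)
```

With these numbers the three levels come out roughly 23, 49 and 200 counts, illustrating how much larger a signal must be for quantification than for mere detection.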

  19. Precision of different quantitative ultrasound densitometers

    International Nuclear Information System (INIS)

    Pocock, N.A.; Harris, N.D.; Griffiths, M.R.

    1998-01-01

    Full text: Quantitative ultrasound (QUS) of the calcaneus, which measures speed of sound (SOS) and broadband ultrasound attenuation (BUA), is predictive of the risk of osteoporotic fracture. However, the utility of QUS for predicting fracture risk or for monitoring treatment efficacy depends on its precision and reliability. Published results and manufacturers' data vary significantly owing to differences in statistical methodology. We have assessed the precision of the current models of the Lunar Achilles and McCue Cuba QUS densitometers, the most commonly used QUS machines in Australia. Twenty-seven subjects had duplicate QUS measurements performed on the same day on both machines. These data were used to calculate the within-pair standard deviation (SD), the coefficient of variation (CV) and the standardised coefficient of variation (sCV), which is corrected for the dynamic range. In addition, the coefficient of reliability (R) was calculated as an index of reliability that is independent of the population mean value and of the dynamic range of the measurements; R ranges from 0 (no reliability) to 1 (perfect reliability). The results indicate that the precision of QUS depends on the dynamic range and the instrument. Furthermore, they suggest that while QUS is a useful predictor of fracture risk, at present it has limited clinical value in monitoring short-term age-related bone loss of 1-2% per year.
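For duplicate measurements such as these, the precision statistics can be computed directly. A minimal sketch (the exact definitions of sCV and R used in the study are not given in the abstract; here sCV rescales the within-pair SD by the observed dynamic range, and R is one minus the ratio of within-pair variance to total variance, both plausible but assumed forms):

```python
import statistics as st

def precision_stats(pairs):
    """Precision indices from same-day duplicate measurements.
    pairs : list of (first, second) measurements, one tuple per subject."""
    n = len(pairs)
    # Within-pair SD for duplicates: sqrt( sum(d_i^2) / (2n) )
    sd_within = (sum((a - b) ** 2 for a, b in pairs) / (2 * n)) ** 0.5
    subject_means = [(a + b) / 2 for a, b in pairs]
    grand_mean = st.mean(subject_means)
    cv = 100.0 * sd_within / grand_mean               # coefficient of variation, %
    dyn_range = max(subject_means) - min(subject_means)
    scv = 100.0 * sd_within / dyn_range               # standardised CV (assumed form)
    total_var = st.variance(subject_means) + sd_within ** 2
    r = 1.0 - sd_within ** 2 / total_var              # reliability: 0 (none) to 1 (perfect)
    return sd_within, cv, scv, r

# Illustrative SOS duplicates (m/s) for four subjects
pairs = [(1520, 1535), (1550, 1548), (1602, 1610), (1570, 1565)]
sd, cv, scv, r = precision_stats(pairs)
```

Note how R, unlike CV, is unaffected by shifting all measurements by a constant, which is why it is independent of the population mean.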

  20. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave...... the ability to estimate a large velocity range, or alternatively measure at two sites to find e.g. stenosis degree in a vessel. The mean angle at the vessel center was estimated to 90.9◦±8.2◦ indicating a laminar flow from a turbulence index being close to zero (0.1 ±0.1). Volume flow was 1.29 ±0.26 mL/stroke...... (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently...

  1. Quantitative NDE of Composite Structures at NASA

    Science.gov (United States)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over long periods. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided-wave, and phase-sensitive ultrasonic methods, infrared thermography, and X-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper focuses on current research activities related to large-area NDE for rapidly characterizing aerospace composites.

  2. Neuropathic pain: is quantitative sensory testing helpful?

    Science.gov (United States)

    Krumova, Elena K; Geber, Christian; Westermann, Andrea; Maier, Christoph

    2012-08-01

    Neuropathic pain arises as a consequence of a lesion or disease affecting the somatosensory system and is characterised by a combination of positive and negative sensory symptoms. Quantitative sensory testing (QST) examines sensory perception after application of different mechanical and thermal stimuli of controlled intensity, and the function of both large (A-beta) and small (A-delta and C) nerve fibres, including the corresponding central pathways. QST can be used to determine detection and pain thresholds and stimulus-response curves, and can thus detect both negative and positive sensory signs, the latter not being assessed by other methods. Like all other psychophysical tests, QST requires standardised examination, instructions and data evaluation to obtain valid and reliable results. Since normative data are available, QST can also contribute to the individual diagnosis of neuropathy, especially in the case of isolated small-fibre neuropathy, in contrast to conventional electrophysiology, which assesses only large myelinated fibres. For example, detection of early stages of subclinical neuropathy in symptomatic or asymptomatic patients with diabetes mellitus can help optimise treatment and identify the diabetic foot at risk of ulceration. QST assesses the individual's sensory profile and can thus be valuable for evaluating the underlying pain mechanisms, which occur with different frequencies even within the same neuropathic pain syndromes. Furthermore, assessing the exact sensory phenotype by QST might be useful in the future to identify responders to certain treatments in accordance with the underlying pain mechanisms.

  3. Quantitative topographic differentiation of the neonatal EEG.

    Science.gov (United States)

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. Twenty-one healthy, full-term newborns were examined polygraphically during sleep (EEG-8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal account largely for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method is promising for application in clinical practice.

  4. Marketing communications: Qualitative and quantitative paradigm

    Directory of Open Access Journals (Sweden)

    Uzelac Nikola

    2005-01-01

    Full Text Available This paper focuses on key issues in the choice of the basic language of communication in marketing as a practical and academic field. Marketing managers generally prefer descriptive modes of expression, but they should make much greater use of the advantages of the language of numbers. By doing so, they will improve decision-making and communication with finance and top management. In this regard, models offered by the academic community can be helpful. This especially pertains to positive or normative verbal approaches and models in which mathematical and statistical solutions are embedded, as well as to those that emphasize financial criteria in decision-making. Concerning the process of creation and verification of scientific knowledge, the choice between the languages of words and numbers is part of a much wider dimension, because it is inseparable from the decision on basic research orientation. The quantitative paradigm is more appropriate for hypothesis testing, while the qualitative paradigm contributes more to hypothesis generation. Competition could become the key driver of changes through which the existing "parallel worlds" of the main paradigms would be integrated, for the sake of advancing disciplinary knowledge.

  5. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments of the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task that involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of eddy current, GMR and thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor in detection specificity, while retaining the same level of sensitivity.

  6. Individual patient dosimetry using quantitative SPECT imaging

    International Nuclear Information System (INIS)

    Gonzalez, J.; Oliva, J.; Baum, R.; Fisher, S.

    2002-01-01

    An approach is described to provide individual patient dosimetry for routine clinical use. Accurate quantitative SPECT imaging was achieved using appropriate methods. The volume of interest (VOI) was defined semi-automatically using a fixed threshold value obtained from phantom studies. The calibration factor to convert voxel counts from SPECT images into activity values was determined from a calibrated point source using the same threshold value as in the phantom studies. For the selected radionuclide, the dose within and outside a sphere of voxel dimension at different distances was computed through dose point-kernels to obtain a discrete absorbed-dose kernel representation around a volume source with uniform activity distribution. The spatial activity distribution from SPECT imaging was convolved with this kernel representation using the discrete Fourier transform method to yield the three-dimensional absorbed dose rate distribution. The accuracy of the dose rate calculation was validated with software phantoms. The absorbed dose was determined by integrating the dose rate distribution for each VOI. Parameters for treatment optimization, such as dose rate volume histograms and dose rate statistics, are provided. A patient example illustrates the dosimetric calculations.
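The convolution of the SPECT activity distribution with the discrete dose kernel can be sketched with NumPy's FFT routines (the toy activity map and uniform kernel are illustrative assumptions; a real kernel would be tabulated from dose point-kernels for the selected radionuclide):

```python
import numpy as np

def dose_rate_from_activity(activity, kernel):
    """Convolve a 3-D activity map (e.g. from quantitative SPECT) with a
    discrete absorbed-dose kernel via FFTs. The arrays are zero-padded to the
    full linear-convolution size so circular wrap-around does not contaminate
    the result, then cropped back to the activity grid."""
    shape = [a + k - 1 for a, k in zip(activity.shape, kernel.shape)]
    F = np.fft.rfftn(activity, shape) * np.fft.rfftn(kernel, shape)
    full = np.fft.irfftn(F, shape)
    # Crop so the kernel origin (its centre voxel) aligns with each source voxel
    start = [k // 2 for k in kernel.shape]
    slices = tuple(slice(s, s + n) for s, n in zip(start, activity.shape))
    return full[slices]

# Toy example: a single unit-activity voxel and a uniform 3x3x3 kernel
act = np.zeros((8, 8, 8))
act[4, 4, 4] = 1.0
ker = np.ones((3, 3, 3))
dose = dose_rate_from_activity(act, ker)
```

For a point source the output simply reproduces the kernel centred on the source voxel, which is a convenient sanity check before applying the routine to measured activity maps.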

  7. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith used complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three perspectives: (1) absorbance at 3877 cm-1 as a function of pressure for 100% HF; (2) absorbance at 3877 cm-1 as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm-1 for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm-1 can be quantitatively analyzed via infrared methods.

  8. Hepatic iron overload: Quantitative MR imaging

    International Nuclear Information System (INIS)

    Gomori, J.M.; Horev, G.; Tamary, H.; Zandback, J.; Kornreich, L.; Zaizov, R.; Freud, E.; Krief, O.; Ben-Meir, J.; Rotem, H.

    1991-01-01

    Iron deposits demonstrate characteristically shortened T2 relaxation times. Several previously published studies reported poor correlation between in vivo hepatic 1/T2 measurements made with midfield magnetic resonance (MR) units and the hepatic iron content of iron-overloaded patients. In this study, the authors assessed the use of in vivo 1/T2 measurements obtained by MR imaging at 0.5 T using short echo times (13.4 and 30 msec) and single-echo sequences, as well as computed tomographic (CT) attenuation, as measures of liver iron concentration in 10 severely iron-overloaded patients with beta-thalassemia major. The iron concentrations in surgical wedge biopsy samples of the liver, which varied between 3 and 9 mg/g of wet weight (normal, less than or equal to 0.5 mg/g), correlated well (r = .93, P less than or equal to .0001) with the preoperative in vivo hepatic 1/T2 measurements. The CT attenuation did not correlate with liver iron concentration. Quantitative MR imaging is a readily available noninvasive method for assessing hepatic iron concentration in iron-overloaded patients, reducing the need for needle biopsies of the liver.

  9. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and general public safety. If the individual or societal risks are found to be intolerable compared with international standards, measures are recommended to mitigate the risk associated with the operation to levels compatible with best practices in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This methodology centres on defining the frequencies of occurrence of events according to databases representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of distance to the pipe. (author)

  10. A Quantitative Index of Forest Structural Sustainability

    Directory of Open Access Journals (Sweden)

    Jonathan A. Cale

    2014-07-01

    Full Text Available Forest health is a complex concept including many ecosystem functions, interactions and values. We develop a quantitative system applicable to many forest types to assess tree mortality with respect to stable forest structure and composition. We quantify the impacts of observed tree mortality on structure by comparison to baseline mortality, and then develop a system that distinguishes between structurally stable and unstable forests. An empirical multivariate index of structural sustainability and a threshold value (70.6), derived from 22 non-tropical tree species' datasets, differentiated structurally sustainable from unsustainable diameter distributions. Twelve of the 22 species populations were sustainable, with a mean score of 33.2 (median = 27.6). Ten species populations were unsustainable, with a mean score of 142.6 (median = 130.1). Among the latter, the unsustainability of Fagus grandifolia, Pinus lambertiana, P. ponderosa, and Nothofagus solandri populations was attributable to known disturbances, whereas that of Abies balsamea, Acer rubrum, Calocedrus decurrens, Picea engelmannii, P. rubens, and Prunus serotina populations was not. This approach provides the ecological framework for rational management decisions using routine inventory data to objectively determine the scope and direction of change in structure and composition, assess excessive or insufficient mortality, compare disturbance impacts in time and space, and prioritize management needs and the allocation of scarce resources.

  11. Quantitative Ultrasound in the assessment of Osteoporosis

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, Giuseppe [Department of Radiology, University of Foggia, Viale L. Pinto, 71100 Foggia (Italy); Department of Radiology, Scientific Institute Hospital, San Giovanni Rotondo (Italy)], E-mail: g.guglielmi@unifg.it; Terlizzi, Francesca de [IGEA srl, Via Parmenide 10/A 41012 Carpi, MO (Italy)], E-mail: f.deterlizzi@igeamedical.com

    2009-09-15

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue associated with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and its lack of ionizing radiation have led to its use worldwide. With increasing clinical interest, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other areas. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of its ability to monitor therapies has been reported; its usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, in both the clinical and the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for use in clinical practice.

  12. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. Several studies have shown that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during spin-locking are the major barriers to establishing these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurement of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
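In practice, R1ρ maps are obtained by fitting a mono-exponential signal decay against the spin-lock time (TSL) in each voxel. A minimal sketch of that standard fit on synthetic, noise-free data (the model and the numbers are illustrative, not specific to this review):

```python
import math

def fit_r1rho(tsl, signal):
    """Linearised least-squares fit of S(TSL) = S0 * exp(-TSL * R1rho),
    i.e. a straight line through ln(S) versus TSL. Returns (S0, R1rho)."""
    ys = [math.log(s) for s in signal]
    n = len(tsl)
    mx = sum(tsl) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(tsl, ys))
             / sum((x - mx) ** 2 for x in tsl))
    return math.exp(my - slope * mx), -slope

# Synthetic voxel: S0 = 1000 a.u., R1rho = 20 s^-1, spin-lock times in seconds
tsl = [0.01, 0.02, 0.04, 0.06, 0.08]
signal = [1000.0 * math.exp(-20.0 * t) for t in tsl]
S0, r1rho = fit_r1rho(tsl, signal)
```

The long acquisition times mentioned above arise precisely because several TSL values must be acquired per voxel to make this fit well conditioned; with noisy data a weighted or non-linear fit would be preferred over the log-linearised one.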

  13. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed, as is the role of a degree-of-belief type probability in risk decision making.

  14. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves a higher risk of complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, assessing damage to the liner due to in vivo exposure is very important; failures are often due to excessive polyethylene wear. Glenoid liners are complex and hemispherical in shape and present challenges in assessing the damage. Therefore, analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed, and multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners retrieved from revision surgery, two of whom had an engineering background and two of whom did not. Associated human-factor mechanisms are reported. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods, and the quantitative assessments made by the several observers were analyzed. A new composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by the four analysts showed a high standard deviation; however, only the two analysts with engineering backgrounds performed the tests in significantly similar ways.

  15. Quantitative magnetotail characteristics of different magnetospheric states

    Directory of Open Access Journals (Sweden)

    M. A. Shukhtina

    2004-03-01

    Full Text Available Quantitative relationships allowing one to compute the lobe magnetic field, flaring angle and tail radius, and to evaluate magnetic flux based on solar wind/IMF parameters and spacecraft position, are obtained for the middle magnetotail, X=(–15,–35) RE, using 3.5 years of simultaneous Geotail and Wind spacecraft observations. For the first time this was done separately for different states of the magnetotail, including the substorm onset (SO) epoch, steady magnetospheric convection (SMC) and quiet periods (Q). In the explored distance range the magnetotail parameters appeared to be similar (within the error bar) for the Q and SMC states, whereas at SO their values are considerably larger. In particular, the tail radius is larger by 1–3 RE at substorm onset than during the Q and SMC states, for which the radius is close to previous magnetopause model values. The calculated lobe magnetic flux at substorm onset is ~1 GWb, exceeding that of the Q and SMC states by ~50%. The model magnetic flux values at substorm onset and SMC show little dependence on the solar wind dynamic pressure and distance in the tail, so the magnetic flux value can serve as an important discriminator of the state of the middle magnetotail. Key words. Magnetospheric physics (solar wind-magnetosphere interactions, magnetotail, storms and substorms)


  17. Quantitative assessment of integrated phrenic nerve activity.

    Science.gov (United States)

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluation of the repeatability/reliability of such recordings has been made among animals, even when the recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity recorded from many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity within each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right ratios of integrated phrenic activity are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.
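
    The two normalizations mentioned above, percent change from baseline and the left-to-right ratio from bilateral recordings, can be sketched as follows (the amplitude values used in the test of the sketch are hypothetical):

```python
def percent_of_baseline(burst_amplitudes, baseline):
    """Express integrated burst amplitudes as percent change from the
    baseline value, the within-animal normalization described above."""
    return [100.0 * (a - baseline) / baseline for a in burst_amplitudes]

def left_right_ratio(left_activity, right_activity):
    """Ratio of integrated activity from bilateral recordings; values
    near 1.0 indicate symmetric phrenic motor output."""
    return left_activity / right_activity
```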

  18. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches to the airport location problem are presented in this article: optimizing the choice of site, and selecting a location from a predefined set of candidates. The former formulates the problem as a mathematical programming (optimization) task; the latter ranks the candidate locations. Given their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one that currently sees practical application. Results: Based on real-life examples, the authors present a multi-stage procedure that makes it possible to solve the airport location problem. Conclusions: Based on the overview of the literature on the subject, the authors point to three types of approach to the airport location problem that could enable further development of the currently applied methods.
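
    The second approach, selecting a location from a predefined set by ranking, can be illustrated with a simple additive-weighting sketch. The criteria, weights, and site scores below are invented for illustration and do not come from the article:

```python
# Hypothetical criteria weights (summing to 1.0) and candidate site scores
# on a 0-10 scale; higher is better for every criterion.
WEIGHTS = {"land_cost": 0.25, "accessibility": 0.35, "noise_impact": 0.15, "demand": 0.25}

CANDIDATES = {
    "Site A": {"land_cost": 6, "accessibility": 8, "noise_impact": 5, "demand": 7},
    "Site B": {"land_cost": 8, "accessibility": 5, "noise_impact": 7, "demand": 6},
}

def weighted_score(scores):
    """Simple additive weighting: weighted sum of criterion scores."""
    return sum(WEIGHTS[criterion] * s for criterion, s in scores.items())

# Rank the predefined candidates from best to worst.
ranking = sorted(CANDIDATES, key=lambda site: weighted_score(CANDIDATES[site]), reverse=True)
```

    Practical ranking methods (e.g. AHP or TOPSIS) refine this idea with pairwise comparisons or distance to an ideal solution, but the structure, scoring a fixed candidate set against weighted criteria, is the same.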

  19. Quantitative biological measurement in Transmission Electron Tomography

    International Nuclear Information System (INIS)

    Mantell, Judith M; Verkade, Paul; Arkill, Kenton P

    2012-01-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood, and anisotropic in-plane shrinkage has largely been ignored. The intention of this study is to explore shrinkage in a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials employed in electron tomography. In brief: following a low-dose philosophy on the section, a separate area was used for focusing before a virgin study area received a series of known exposures on a tilted sample. The shrinkage was determined from measurements on the gold beads from both sides of the section, as determined by a confirmatory tomogram. It was found that the shrinkage in area (to approximately 90-95% of the original) and in thickness (to approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage occurred in the first minute, and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven, the end result was consistent. It was observed, in general, that thinner samples showed a greater percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required, then the protocol described should be used for all areas studied.
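
    The shrinkage figures above reduce to a simple remaining-fraction calculation once the fiducial-derived dimensions are known. A minimal sketch, with hypothetical bead-derived measurements (not values from the paper):

```python
def remaining_percent(initial, final):
    """Remaining fraction of the original dimension, in percent."""
    return 100.0 * final / initial

# Hypothetical fiducial-derived measurements (nm): section thickness from
# the z-separation of gold beads on the top and bottom surfaces, in-plane
# area from bead-to-bead distances, each before and after beam exposure.
thickness_before, thickness_after = 300.0, 195.0
area_before, area_after = 1.0e6, 0.92e6

thickness_remaining = remaining_percent(thickness_before, thickness_after)
area_remaining = remaining_percent(area_before, area_after)
```

    With these illustrative numbers, the thickness retains 65% of its original value and the area 92%, the same order as the shrinkage reported in the abstract.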
