WorldWideScience

Sample records for hochaufgeloeste quantitative mr-tomografische

  1. Imaging of the lumbosacral plexus. Diagnostics and treatment planning with high-resolution procedures; Bildgebung des Plexus lumbosacralis. Diagnostik und Therapieplanung mithilfe hochaufgeloester Verfahren

    Energy Technology Data Exchange (ETDEWEB)

    Jengojan, S.; Schellen, C.; Bodner, G.; Kasprian, G. [Medizinische Universitaet Wien, Universitaetsklinik fuer Radiologie und Nuklearmedizin, Wien (Austria)]

    2017-03-15

    Pathologies in the region of the lumbosacral plexus and its peripheral nerve branches. High-resolution ultrasound neurography (HRUS) is particularly suited to assessing superficially located structures of the lumbosacral plexus. Depending on the examiner's experience, anatomical course variants of the sciatic nerve (e.g. in piriformis syndrome) as well as subtler changes such as neuritis can be visualized and documented sonographically. With MRI, mainly deeper-lying nerve structures, such as the pudendal and femoral nerves, are diagnostically accessible. Modern MRI methods such as peripheral nerve tractography additionally enable a three-dimensional depiction of the spatial relationship between nerves and local tumorous or traumatic changes, which can be helpful for treatment planning. The anatomy and pathology of the lumbosacral plexus can be reliably depicted by a sensible combination of high-resolution MRI and ultrasound neurography. (orig.)

  2. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.
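
The measurement ideas above (error, reliability, validity) can be made concrete with a small numeric sketch. The data below are invented for illustration, and test-retest reliability is estimated here as the Pearson correlation between two administrations of the same scale; this is one common estimator, not the only one.

```python
import math

def pearson(x, y):
    """Pearson correlation, used here as a test-retest reliability estimate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented scores: the same scale administered twice to five subjects.
time1 = [10.0, 12.0, 9.0, 15.0, 11.0]
time2 = [11.0, 12.5, 9.5, 14.0, 11.5]

reliability = pearson(time1, time2)
print(round(reliability, 3))  # test-retest reliability, approximately 0.976
```

High values (close to 1) indicate that the instrument gives consistent readings; a large measurement error would drag this correlation down.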

  3. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by {Y. Yomdin} in \\cite{Y1}. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  4. Quantitative lithofacies palaeogeography

    Institute of Scientific and Technical Information of China (English)

    Feng, Zeng-Zhao; Zheng, Xiu-Juan; Bao, Zhi-Dong; Jin, Zhen-Kui; Wu, Sheng-He; He, You-Bin; Peng, Yong-Min; Yang, Yu-Qing; Zhang, Jia-Qiang; Zhang, Yong-Sheng

    2014-01-01

    Quantitative lithofacies palaeogeography is an important discipline of palaeogeography. It developed on the foundation of traditional lithofacies palaeogeography and palaeogeography, and its core is the quantitative lithofacies palaeogeographic map. "Quantitative" means that, in the palaeogeographic map, the division and identification of each palaeogeographic unit are supported by quantitative data and quantitative fundamental maps. Our lithofacies palaeogeographic maps are quantitative or mainly quantitative. A great number of quantitative lithofacies palaeogeographic maps have been published, and articles and monographs on quantitative lithofacies palaeogeography have appeared successively; thus quantitative lithofacies palaeogeography has been formed and established as an important development in lithofacies palaeogeography. In composing quantitative lithofacies palaeogeographic maps, the key measure is the single-factor analysis and multifactor comprehensive mapping method, the methodology of quantitative lithofacies palaeogeography. In this paper, the authors use two case studies, one from the Early Ordovician of South China and the other from the Early Ordovician of Ordos, North China, to explain how this methodology is used to compose quantitative lithofacies palaeogeographic maps, and to discuss the palaeogeographic units in these maps. Finally, three characteristics of quantitative lithofacies palaeogeographic maps, i.e. quantification, multiple orders and multiple types, are discussed.

  5. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  6. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  7. Quantitative Autonomic Testing

    OpenAIRE

    Novak, Peter

    2011-01-01

    Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for the evaluation of these disorders, both in the clinic and in research. There are a number of autonomic tests; however, only a few have been validated clinically or are quantitative. Here, a fully quantitative and clinically validated protocol for testing of autonomic functions is presented. As a bare minimum, the clinical autonomic laboratory shoul...

  8. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that "a is approximately equal to b up to an error of ε". We have four interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...
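
For orientation, the indexed equality satisfies metric-like inference rules. The sketch below lists the core rules as they appear in the quantitative-algebra literature of Mardare, Panangaden and Plotkin, reconstructed from memory and to be checked against the paper:

```latex
% Core rules of quantitative equational logic (sketch):
\begin{align*}
\textsf{(Refl)}   &\quad \vdash a =_0 a \\
\textsf{(Symm)}   &\quad a =_\varepsilon b \vdash b =_\varepsilon a \\
\textsf{(Triang)} &\quad a =_\varepsilon b,\; b =_{\varepsilon'} c \vdash a =_{\varepsilon+\varepsilon'} c \\
\textsf{(Max)}    &\quad a =_\varepsilon b \vdash a =_{\varepsilon'} b \quad \text{for } \varepsilon' \ge \varepsilon \\
\textsf{(NExp)}   &\quad a_1 =_\varepsilon b_1,\,\dots,\,a_n =_\varepsilon b_n \vdash f(a_1,\dots,a_n) =_\varepsilon f(b_1,\dots,b_n)
\end{align*}
```

These rules make the derivable bounds behave like a pseudometric: d(a, b) ≤ ε whenever a =ε b is derivable, which is what lets free algebras carry the metrics named in the abstract.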

  9. Quantitative film radiography

    Energy Technology Data Exchange (ETDEWEB)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-02-26

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects.
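
The report does not give its digitization-to-density conversion, but the standard film relation is optical density OD = log10(I0/I). The sketch below assumes that relation plus a hypothetical linear calibration (slope and offset) for illustration.

```python
import math

def optical_density(transmitted, incident):
    """Optical density OD = log10(I0 / I) from digitized film intensities."""
    return math.log10(incident / transmitted)

def density_image(grey_levels, incident_level, slope=1.0, offset=0.0):
    """Map a digitized radiograph to density units via an assumed linear calibration."""
    return [[slope * optical_density(g, incident_level) + offset
             for g in row] for row in grey_levels]

# Toy 2x2 digitized film: lower transmitted intensity means higher density.
raw = [[1000.0, 100.0],
       [10.0, 1.0]]
dens = density_image(raw, incident_level=1000.0)
print(dens)
```

A real system would replace the linear calibration with the measured characteristic curve of the film and processor, which is exactly why the report stresses processor monitoring and verification.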

  10. On Quantitative Rorschach Scales.

    Science.gov (United States)

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  11. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  13. Quantitative autonomic testing.

    Science.gov (United States)

    Novak, Peter

    2011-07-19

    Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for the evaluation of these disorders, both in the clinic and in research. There are a number of autonomic tests; however, only a few have been validated clinically or are quantitative. Here, a fully quantitative and clinically validated protocol for testing of autonomic functions is presented. As a bare minimum, the clinical autonomic laboratory should have a tilt table, an ECG monitor, a continuous noninvasive blood pressure monitor, a respiratory monitor and a means for evaluation of the sudomotor domain. The software for recording and evaluation of autonomic tests is critical for correct evaluation of the data. The presented protocol evaluates three major autonomic domains: cardiovagal, adrenergic and sudomotor. The tests include deep breathing, the Valsalva maneuver, head-up tilt, and the quantitative sudomotor axon reflex test (QSART). The severity and distribution of dysautonomia are quantified using the Composite Autonomic Severity Score (CASS). A detailed protocol is provided highlighting essential aspects of testing, with emphasis on proper data acquisition, obtaining the relevant parameters and unbiased evaluation of autonomic signals. The normative data and the CASS algorithm for interpretation of results are provided as well.
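
As a sketch of the scoring step: in the commonly cited CASS scheme, the sudomotor and cardiovagal domains are scored 0-3 and the adrenergic domain 0-4, giving a 0-10 total. These ranges come from the general CASS literature rather than from this article, so treat them as an assumption.

```python
def cass_total(sudomotor, cardiovagal, adrenergic):
    """Composite Autonomic Severity Score: sum of three domain subscores.
    Assumed ranges: sudomotor 0-3, cardiovagal 0-3, adrenergic 0-4 (total 0-10)."""
    for value, hi, name in [(sudomotor, 3, "sudomotor"),
                            (cardiovagal, 3, "cardiovagal"),
                            (adrenergic, 4, "adrenergic")]:
        if not 0 <= value <= hi:
            raise ValueError(f"{name} subscore must be in 0..{hi}")
    return sudomotor + cardiovagal + adrenergic

# Example: mild sudomotor and cardiovagal involvement, moderate adrenergic failure.
print(cass_total(1, 1, 2))
```

The clinical interpretation of a given total (mild, moderate, severe dysautonomia) follows cutoffs defined in the protocol itself.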

  14. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  15. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communicating climate model output: language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output

  16. Quantitative Intracerebral Hemorrhage Localization

    Science.gov (United States)

    Muschelli, John; Ullman, Natalie L.; Sweeney, Elizabeth M.; Eloyan, Ani; Martin, Neil; Vespa, Paul; Hanley, Daniel F.; Crainiceanu, Ciprian M.

    2015-01-01

    Background and Purpose The location of intracerebral hemorrhage (ICH) is currently described in a qualitative way; we provide a quantitative framework for estimating ICH engagement and its relevance to stroke outcomes. Methods We analyzed 111 patients with ICH from the MISTIE II clinical trial. We estimated ICH engagement at a population level using image registration of CT scans to a template and a previously labeled atlas. Predictive regions of NIHSS and GCS stroke severity scores, collected at enrollment, were estimated. Results The percent coverage of the ICH by these regions strongly outperformed the reader-labeled locations. The adjusted R² almost doubled from 0.129 (reader-labeled model) to 0.254 (quantitative-location model) for NIHSS and more than tripled from 0.069 (reader-labeled model) to 0.214 (quantitative-location model) for GCS. A permutation test confirmed that the new predictive regions are more predictive than chance: p<.001 for NIHSS and p<.01 for GCS. Conclusions Objective measures of ICH location and engagement using advanced CT imaging processing provide finer, objective, and more quantitative anatomic information than that provided by human readers. PMID:26451031
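
A permutation test of the kind mentioned can be sketched in a few lines: shuffle the outcome scores, recompute the fit statistic, and report the fraction of shuffles that match or beat the observed value. The data and the simple one-covariate R² below are invented stand-ins for the paper's regression models.

```python
import random

def r_squared(x, y):
    """Squared Pearson correlation: variance in y explained by x in simple regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def permutation_p_value(x, y, n_perm=2000, seed=0):
    """Fraction of label permutations whose R^2 meets or beats the observed R^2."""
    rng = random.Random(seed)
    observed = r_squared(x, y)
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if r_squared(x, y_perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0

# Toy data: hemorrhage coverage (%) vs severity score; invented for illustration.
coverage = [5, 12, 20, 33, 41, 55, 60, 72, 80, 90]
severity = [2, 3, 5, 6, 9, 10, 12, 15, 16, 20]
print(permutation_p_value(coverage, severity))
```

Because the toy relationship is nearly perfectly monotone, almost no permutation beats the observed fit and the p-value comes out far below 0.01, mirroring the paper's p<.001 style of conclusion.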

  17. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  19. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  20. Energy & Climate: Getting Quantitative

    Science.gov (United States)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.
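
In the spirit of the poster, here is the kind of back-of-envelope arithmetic that checks such claims. Every input number below is an assumption chosen for illustration, not a measurement, and changing the assumptions changes the conclusion.

```python
# Back-of-envelope check of the SUV-vs-refrigerator claim.
# All numbers below are illustrative assumptions.
MJ = 1e6

def fuel_energy_joules(miles_per_year, years, mpg, energy_per_gallon_mj=130.0):
    """Total fuel energy burned, given mileage, vehicle lifetime and fuel economy."""
    gallons = miles_per_year * years / mpg
    return gallons * energy_per_gallon_mj * MJ

# Assumed: 12,000 miles/year for 10 years; car 30 mpg vs SUV 20 mpg.
extra_suv = (fuel_energy_joules(12_000, 10, 20.0)
             - fuel_energy_joules(12_000, 10, 30.0))

# Assumed: an open refrigerator door draws an extra 100 W, continuously, for 7 years.
seconds_per_year = 3.156e7
fridge = 100.0 * 7 * seconds_per_year

print(f"extra SUV fuel energy: {extra_suv:.2e} J")
print(f"open fridge door, 7 y: {fridge:.2e} J")
print(f"ratio: {extra_suv / fridge:.1f}")
```

Under these particular assumptions the SUV penalty is roughly an order of magnitude larger than the open refrigerator door, which is exactly the kind of quantitative scrutiny the poster advocates applying to slogans on all sides.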

  1. Quantitation of signal transduction.

    Science.gov (United States)

    Krauss, S; Brand, M D

    2000-12-01

    Conventional qualitative approaches to signal transduction provide powerful ways to explore the architecture and function of signaling pathways. However, at the level of the complete system, they do not fully depict the interactions between signaling and metabolic pathways and fail to give a manageable overview of the complexity that is often a feature of cellular signal transduction. Here, we introduce a quantitative experimental approach to signal transduction that helps to overcome these difficulties. We present a quantitative analysis of signal transduction during early mitogen stimulation of lymphocytes, with steady-state respiration rate as a convenient marker of metabolic stimulation. First, by inhibiting various key signaling pathways, we measure their relative importance in regulating respiration. About 80% of the input signal is conveyed via identifiable routes: 50% through pathways sensitive to inhibitors of protein kinase C and MAP kinase and 30% through pathways sensitive to an inhibitor of calcineurin. Second, we quantify how each of these pathways differentially stimulates functional units of reactions that produce and consume a key intermediate in respiration: the mitochondrial membrane potential. Both the PKC and calcineurin routes stimulate consumption more strongly than production, whereas the unidentified signaling routes stimulate production more than consumption, leading to no change in membrane potential despite increased respiration rate. The approach allows a quantitative description of the relative importance of signal transduction pathways and the routes by which they activate a specific cellular process. It should be widely applicable.
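
The pathway-share bookkeeping can be sketched as follows: if an inhibitor removes part of the stimulated response, the fraction removed estimates that pathway's contribution. The rates below are invented to reproduce the 50%/30% split described in the abstract.

```python
def pathway_share(basal, stimulated, inhibited):
    """Fraction of the stimulation abolished by an inhibitor:
    share = (stimulated - inhibited) / (stimulated - basal)."""
    return (stimulated - inhibited) / (stimulated - basal)

# Invented respiration rates (arbitrary units), not the paper's data.
basal, stimulated = 100.0, 140.0
rate_pkc_mapk_blocked = 120.0     # with PKC/MAPK inhibitors present
rate_calcineurin_blocked = 128.0  # with calcineurin inhibitor present

print(pathway_share(basal, stimulated, rate_pkc_mapk_blocked))    # PKC/MAPK share
print(pathway_share(basal, stimulated, rate_calcineurin_blocked)) # calcineurin share
```

The residual fraction (here 0.2, matching the abstract's unidentified 20%) is whatever stimulation survives all applied inhibitors.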

  2. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
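
A minimal forward simulation of a QuaSSE-like process (a trait evolving by Brownian motion, with speciation and extinction rates as functions of the trait) might look like the following. This is a discrete-time illustration of the model class, not the likelihood method of the paper, and all rate functions are arbitrary choices.

```python
import random

def simulate_tree_sizes(lam, mu, sigma, t_max, dt=0.01, x0=0.0, seed=1):
    """Discrete-time sketch of a QuaSSE-style process: each lineage carries a
    trait x evolving by Brownian motion (sd sigma*sqrt(dt) per step); it
    speciates with probability lam(x)*dt and goes extinct with mu(x)*dt."""
    rng = random.Random(seed)
    lineages = [x0]
    steps = int(t_max / dt)
    for _ in range(steps):
        nxt = []
        for x in lineages:
            u = rng.random()
            if u < lam(x) * dt:
                nxt.extend([x, x])          # speciation: two daughter lineages
            elif u < (lam(x) + mu(x)) * dt:
                pass                        # extinction: lineage dropped
            else:
                nxt.append(x)
        lineages = [x + rng.gauss(0.0, sigma * (dt ** 0.5)) for x in nxt]
        if not lineages or len(lineages) > 10_000:
            break
    return lineages  # trait values of surviving lineages

# Assumed forms: speciation rate rises with the trait; extinction is constant.
tips = simulate_tree_sizes(lam=lambda x: 0.5 + 0.3 * max(x, 0.0),
                           mu=lambda x: 0.1, sigma=0.5, t_max=10.0)
print(len(tips))
```

Simulations like this are how one generates the test phylogenies on which a method such as QuaSSE can be checked for its ability to recover the known speciation-trait relationship.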

  3. Directional and quantitative phosphorylation networks

    DEFF Research Database (Denmark)

    Jørgensen, Claus; Linding, Rune

    2008-01-01

    for unravelling phosphorylation-mediated cellular interaction networks. In particular, we will discuss how the combination of new quantitative mass-spectrometric technologies and computational algorithms together are enhancing mapping of these largely uncharted dynamic networks. By combining quantitative...

  4. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  5. Designing quantitative telemedicine research.

    Science.gov (United States)

    Wade, Victoria; Barnett, Adrian G; Martin-Khan, Melinda; Russell, Trevor

    2016-10-27

    When designing quantitative trials and evaluation of telehealth interventions, researchers should think ahead to the intended way that the intervention could be implemented in routine care and consider how trial participants with similar characteristics to the target population can be included. The telehealth intervention and the context in which it is placed should be clearly described, and consideration given to conducting pragmatic trials in order to show the effect of telehealth in complex environments with rapidly changing technology. Types of research designs, comparators and outcome measures are discussed and common statistical issues are introduced. © The Author(s) 2016.

  6. Quantitative immunoglobulins in adulthood.

    Science.gov (United States)

    Crisp, Howard C; Quinn, James M

    2009-01-01

    Although age-related changes in serum immunoglobulins are well described in childhood, alterations in immunoglobulins in the elderly are less well described and published. This study was designed to better define expected immunoglobulin ranges and differences in adults of differing decades of life. Sera from 404 patients, aged 20-89 years, were analyzed for quantitative immunoglobulin G (IgG), immunoglobulin M (IgM), and immunoglobulin A (IgA). Patients with diagnoses or medications known to affect immunoglobulin levels were identified while blinded to their immunoglobulin levels. A two-factor ANOVA was performed using decade of life and gender on both the entire sample population and the subset without any disease or medication expected to alter immunoglobulin levels. A literature review was also performed on all English-language articles evaluating quantitative immunoglobulin levels in adults over 60 years old. For the entire population, IgM was found to be higher in women than in men, and in the subset without conditions affecting immunoglobulin levels, the differences in IgM with gender and age were maintained. Older adults with normal immunoglobulin levels have higher serum IgA levels and lower serum IgM levels. Women have higher IgM levels than men throughout life. IgG levels are not significantly altered in an older population.

  7. Is quantitative electromyography reliable?

    Science.gov (United States)

    Cecere, F; Ruf, S; Pancherz, H

    1996-01-01

    The reliability of quantitative electromyography (EMG) of the masticatory muscles was investigated in 14 subjects without any signs or symptoms of temporomandibular disorders. Integrated EMG activity from the anterior temporalis and masseter muscles was recorded bilaterally by means of bipolar surface electrodes during chewing and biting activities. In the first experiment, the influence of electrode relocation was investigated. No influence of electrode relocation on the recorded EMG signal could be detected. In a second experiment, three sessions of EMG recordings during five different chewing and biting activities were performed: in the morning (I); 1 hour later without intermediate removal of the electrodes (II); and in the afternoon, using new electrodes (III). The method errors for different time intervals (I-II and I-III errors) for each muscle and each function were calculated. Depending on the time interval between the EMG recordings, the muscles considered, and the function performed, the individual errors ranged from 5% to 63%. The method error increased significantly with the time interval, and the error for the masseter (mean 27.2%) was higher than for the temporalis (mean 20.0%). The largest function error was found during maximal biting in intercuspal position (mean 23.1%). Based on the findings, quantitative electromyography of the masticatory muscles seems to have limited value in diagnostics and in the evaluation of individual treatment results.
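
The abstract does not state which method-error formula was used; Dahlberg's formula is the usual choice for duplicate measurements in this literature, so the sketch below assumes it.

```python
import math

def dahlberg_method_error(first, second):
    """Dahlberg's formula: ME = sqrt(sum(d_i^2) / (2n)) for duplicate measurements."""
    if len(first) != len(second) or not first:
        raise ValueError("need two equal-length, non-empty measurement series")
    sq = sum((a - b) ** 2 for a, b in zip(first, second))
    return math.sqrt(sq / (2 * len(first)))

# Invented duplicate EMG readings (arbitrary units), session I vs session II.
session1 = [100.0, 80.0, 120.0, 95.0]
session2 = [104.0, 78.0, 115.0, 99.0]
print(dahlberg_method_error(session1, session2))
```

Dividing such an error by the mean signal level gives the percentage errors (5% to 63%) the abstract reports.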

  8. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  9. Quantitative velocity modulation spectroscopy

    Science.gov (United States)

    Hodges, James N.; McCall, Benjamin J.

    2016-05-01

    Velocity Modulation Spectroscopy (VMS) is arguably the most important development in the 20th century for spectroscopic study of molecular ions. For decades, interpretation of VMS lineshapes has presented challenges due to the intrinsic covariance of fit parameters including velocity modulation amplitude, linewidth, and intensity. This limitation has stifled the growth of this technique into the quantitative realm. In this work, we show that subtle changes in the lineshape can be used to help address this complexity. This allows for determination of the linewidth, intensity relative to other transitions, velocity modulation amplitude, and electric field strength in the positive column of a glow discharge. Additionally, we explain the large homogeneous component of the linewidth that has been previously described. Using this component, the ion mobility can be determined.

  10. Quantitative metamaterial property extraction

    CERN Document Server

    Schurig, David

    2015-01-01

    We examine an extraction model for metamaterials, not previously reported, that gives precise, quantitative and causal representation of S parameter data over a broad frequency range, up to frequencies where the free space wavelength is only a modest factor larger than the unit cell dimension. The model is comprised of superposed, slab shaped response regions of finite thickness, one for each observed resonance. The resonance dispersion is Lorentzian and thus strictly causal. This new model is compared with previous models for correctness likelihood, including an appropriate Occam's factor for each fit parameter. We find that this new model is by far the most likely to be correct in a Bayesian analysis of model fits to S parameter simulation data for several classic metamaterial unit cells.
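
The "Lorentzian and thus strictly causal" dispersion can be written out explicitly; the following is a generic textbook form, with symbols chosen here rather than taken from the paper:

```latex
% Lorentzian (causal) resonance contribution to an effective material parameter:
\[
  \chi(\omega) \;=\; \frac{F\,\omega_0^{2}}{\omega_0^{2} - \omega^{2} - i\,\gamma\,\omega},
  \qquad
  \epsilon_{\mathrm{eff}}(\omega) \;=\; 1 + \sum_{k} \chi_k(\omega),
\]
```

where ω0 is the resonance frequency, γ the damping rate and F the oscillator strength; causality (Kramers-Kronig consistency) follows because all poles of χ lie in the lower half of the complex ω plane, which is why a sum of such terms can fit S-parameter data over a broad band without unphysical artifacts.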

  11. Quantitative Hyperspectral Reflectance Imaging

    Directory of Open Access Journals (Sweden)

    Ted A.G. Steemers

    2008-09-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can, for instance, be used to obtain information from cultural heritage objects that is unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and to study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength-tunable narrow-bandwidth light source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document where different types of ink have been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects on artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms.
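
Extracting a mean spectral reflectance curve from a region of interest is simple to sketch. The nested-list hypercube layout and function below are illustrative assumptions; a real system would use calibrated array data.

```python
def mean_reflectance(cube, roi):
    """Mean spectral reflectance over a region of interest.
    cube: reflectance[row][col][band]; roi: iterable of (row, col) pixels."""
    pixels = list(roi)
    if not pixels:
        raise ValueError("empty region of interest")
    n_bands = len(cube[0][0])
    curve = [0.0] * n_bands
    for r, c in pixels:
        for b in range(n_bands):
            curve[b] += cube[r][c][b]
    return [v / len(pixels) for v in curve]

# Toy 2x2 image with 3 spectral bands (values are calibrated reflectances, 0..1).
cube = [[[0.2, 0.4, 0.6], [0.3, 0.5, 0.7]],
        [[0.4, 0.6, 0.8], [0.1, 0.1, 0.1]]]
curve = mean_reflectance(cube, [(0, 0), (0, 1), (1, 0)])
print([round(v, 3) for v in curve])
```

Comparing such per-ROI curves is the basis for the ink-mapping and degradation-classification steps the abstract describes.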

  12. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon, with a finger-held weighing bottle, with a paper-strap-held bottle, and with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask, and from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the
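The arithmetic behind the acid-base titration demonstrated on the tape can be illustrated in a few lines (the numbers are hypothetical and a 1:1 stoichiometry is assumed):

```python
# At the endpoint of a 1:1 acid-base titration, moles of acid = moles of base,
# so M_acid = M_base * V_base / V_acid.
m_base = 0.1000          # mol/L NaOH titrant (hypothetical)
v_base = 25.00 / 1000    # L delivered from the buret
v_acid = 20.00 / 1000    # L of unknown acid transferred by volumetric pipet

m_acid = m_base * v_base / v_acid
assert abs(m_acid - 0.1250) < 1e-12   # 0.1250 mol/L
```

The careful volumetric techniques shown in the video (quantitative transfer, reading the buret, dilution to the mark) are precisely what keeps the measured volumes in this calculation trustworthy.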

  13. Quantitative goals for monetary policy

    OpenAIRE

    Fatás, Antonio; Mihov, Ilian; ROSE, Andrew K.

    2006-01-01

    We study empirically the macroeconomic effects of an explicit de jure quantitative goal for monetary policy. Quantitative goals take three forms: exchange rates, money growth rates, and inflation targets. We analyze the effects on inflation of both having a quantitative target, and of hitting a declared target; we also consider effects on output volatility. Our empirical work uses an annual data set covering 42 countries between 1960 and 2000, and takes account of other determinants of inflat...

  14. Quantitative Risk - Phases 1 & 2

    Science.gov (United States)

    2013-11-12

    Reference-list fragments recovered from this record include: "Risk characterization of microbiological hazards in food", Chapter 4, "quantitative risk characterization", 2009; Albert I, Grenier E, Denis JB, Rousseau J, "Quantitative Risk Assessment from Farm to Fork and Beyond: a..." (...State University, July 9, 2013); Melhem G, "Conduct Effective Quantitative Risk Assessment (QRA) Studies", ioMosaic Corporation, 2006; Anderson J, Brown R, "Risk

  15. Quantitative Electron Nanodiffraction.

    Energy Technology Data Exchange (ETDEWEB)

    Spence, John [Arizona State Univ., Mesa, AZ (United States)

    2015-01-30

    This final report summarizes progress under this award for the final reporting period 2002 - 2013 in our development of quantitative electron nanodiffraction applied to materials problems, especially atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid-state lighting, and to understand the effects of stacking-sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables for Crystallography (10), the observation of energy-loss spectra with millivolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the centenary of X-ray diffraction (17), and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  16. Programmable Quantitative DNA Nanothermometers.

    Science.gov (United States)

    Gareau, David; Desrosiers, Arnaud; Vallée-Bélisle, Alexis

    2016-07-13

    Developing molecules, switches, probes or nanomaterials that are able to respond to specific temperature changes should prove of utility for several applications in nanotechnology. Here, we describe bioinspired strategies to design DNA thermoswitches with programmable linear response ranges that can provide either a precise ultrasensitive response over a desired, small temperature interval (±0.05 °C) or an extended linear response over a wide temperature range (e.g., from 25 to 90 °C). Using structural modifications or inexpensive DNA stabilizers, we show that we can tune the transition midpoints of DNA thermometers from 30 to 85 °C. Using multimeric switch architectures, we are able to create ultrasensitive thermometers that display large quantitative fluorescence gains within a small temperature variation (e.g., > 700% over 10 °C). Using a combination of thermoswitches of different stabilities, or a mix of stabilizers of various strengths, we can create extended thermometers that respond linearly over a temperature range of up to 50 °C. Here, we demonstrate the reversibility, robustness, and efficiency of these programmable DNA thermometers by monitoring temperature change inside individual wells during polymerase chain reactions. We discuss the potential applications of these programmable DNA thermoswitches in various nanotechnology fields including cell imaging, nanofluidics, nanomedicine, nanoelectronics, nanomaterials, and synthetic biology.
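The tunable ultrasensitive-versus-extended behaviour described above follows from a two-state (closed/open) melting model; a minimal sketch (the midpoint and width values are illustrative, not taken from the paper):

```python
import numpy as np

# Two-state model: the open fraction follows a sigmoid in temperature with
# midpoint tm; fluorescence is assumed proportional to the open fraction.
def open_fraction(t_celsius, tm=50.0, width=2.0):
    return 1.0 / (1.0 + np.exp(-(t_celsius - tm) / width))

t = np.linspace(25, 90, 500)
f = open_fraction(t)

assert abs(open_fraction(50.0) - 0.5) < 1e-12   # half-open at the midpoint
assert np.all(np.diff(f) > 0)                   # monotonic thermometer response
```

Narrowing `width` sharpens the transition toward the ultrasensitive regime, while superposing switches with staggered `tm` values (as the abstract's mixed-stability designs do) extends the linear range.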

  17. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, and their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
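The distance-to-kilobase conversion stated in the abstract (about 2.3 kb per micrometre of homogeneously stretched fibre) is simple arithmetic:

```python
# Homogeneous stretching factor from the abstract: ~2.3 kb per micrometre.
STRETCH_KB_PER_UM = 2.3

def distance_to_kb(distance_um):
    """Convert a measured inter-probe distance (micrometres) to kilobase pairs."""
    return distance_um * STRETCH_KB_PER_UM

# E.g. two probe signals measured 10 um apart map ~23 kb apart.
assert abs(distance_to_kb(10.0) - 23.0) < 1e-12
```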

  18. Quantitative Literacy: Geosciences and Beyond

    Science.gov (United States)

    Richardson, R. M.; McCallum, W. G.

    2002-12-01

    Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.

  19. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
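The calibration idea, mapping semi-quantitative likelihood categories to failure frequencies estimated from peer pipeline systems and multiplying by consequence, can be sketched as follows (the categories and frequency values are invented for illustration, not the paper's calibration):

```python
# Hypothetical calibrated failure frequencies per likelihood category,
# in failures per km-year (illustrative numbers only).
FREQ_PER_KM_YR = {
    "low": 1e-5,
    "medium": 1e-4,
    "high": 1e-3,
}

def quantitative_risk(likelihood_category, consequence_cost):
    """Deterministic QRA step: expected loss per km-year = frequency x consequence."""
    return FREQ_PER_KM_YR[likelihood_category] * consequence_cost

# A "medium" segment with a $2M consequence carries ~$200/km-year expected loss.
assert abs(quantitative_risk("medium", 2e6) - 200.0) < 1e-9
```

As the abstract notes, the consequence term could equally be a qualitative category; the point of the calibration is that the likelihood side becomes a classical frequency.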

  20. Quantitative luminescence imaging system

    Science.gov (United States)

    Batishko, C. R.; Stahl, K. A.; Fecht, B. A.

    The goal of the Measurement of Chemiluminescence project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  1. Workshop on quantitative dynamic stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  2. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. To follow the chapters comfortably, you should be at an intermediate level in quantitative finance and have a reasonable working knowledge of R.

  3. Understanding quantitative research: part 2

    OpenAIRE

    Hoare, Z.; Hoe, J.

    2013-01-01

    This article, which is the second in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Understanding statistical analysis will ensure that nurses can assess the credibility and significance of the evidence reported. This article focuses on explaining common statistical terms and the presentation of statistical data in quantitative research.

  4. High temporal and high spatial resolution MR angiography (4D-MRA); Zeitlich und raeumlich hochaufgeloeste MR-Angiografie ('4D-MRA')

    Energy Technology Data Exchange (ETDEWEB)

    Hadizadeh, D.R.; Marx, C.; Gieseke, J.; Willinek, W.A. [Bonn Univ. (Germany). Radiology; Schild, H.H.

    2014-09-15

    In the first decade of the twenty-first century, whole-body magnetic resonance scanners with high field strengths (and thus potentially better signal-to-noise ratios) were developed. At the same time, parallel imaging and 'echo-sharing' techniques were refined to allow for increasingly high spatial and temporal resolution in dynamic magnetic resonance angiography ('time-resolved' = TR-MRA). This technological progress facilitated tracking the passage of intravenously administered contrast agent boluses as well as the acquisition of volume data sets at high image refresh rates ('4D-MRA'). This opened doors for many new applications in non-invasive vascular imaging, including simultaneous anatomic and functional analysis of many vascular pathologies including arterio-venous malformations. Different methods were established to acquire 4D-MRA using various strategies to acquire k-space trajectories over time in order to optimize imaging according to clinical needs. These include 'keyhole'-based techniques (e.g. 4D-TRAK), TRICKS - both with and without projection - and HYPR-reconstruction, TREAT, and TWIST. Some of these techniques were first introduced in the 1980s and 1990s, were later enhanced and modified, and finally implemented in the products of major vendors. In the last decade, a large number of studies on the clinical applications of TR-MRA were published. This manuscript provides an overview of the development of TR-MRA methods and the 4D-MRA techniques as they are currently used in the diagnosis, treatment and follow-up of vascular diseases in various parts of the body. (orig.)

  5. High spatial resolution radiation budget for Europe: derived from satellite data, validation of a regional model; Raeumlich hochaufgeloeste Strahlungsbilanz ueber Europa: Ableitung aus Satellitendaten, Validation eines regionalen Modells

    Energy Technology Data Exchange (ETDEWEB)

    Hollmann, R. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Atmosphaerenphysik

    2000-07-01

    For forty years, instruments onboard satellites have demonstrated their usefulness for many applications in the field of meteorology and oceanography. Several experiments, like ERBE, are dedicated to establishing a climatology of the global Earth radiation budget at the top of the atmosphere. Now the focus has shifted to the regional scale, e.g. GEWEX with its regional sub-experiments like BALTEX. To obtain a regional radiation budget for Europe, in the first part of the work the well-calibrated measurements from ScaRaB (scanner for radiation budget) are used to derive a narrow-to-broadband conversion which is applicable to the AVHRR (advanced very high resolution radiometer). It is shown that the accuracy of the method is of the order of that from ScaRaB itself. In the second part of the work, results of REMO have been compared with measurements of ScaRaB and AVHRR for March 1994. Overall, the model reproduces the measurements well, but it overestimates cold areas and underestimates warm areas in the longwave spectral domain. Similarly, it overestimates dark areas and underestimates bright areas in the solar spectral domain. (orig.)
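At its statistical core, a narrow-to-broadband conversion is a regression from narrowband channel radiances to broadband flux; a sketch on synthetic data (the channel count, coefficients, and units are invented, not the study's calibration against ScaRaB):

```python
import numpy as np

# Synthetic "collocated" observations: two AVHRR-like narrowband channels
# and a broadband flux that is a linear combination of them plus an offset.
rng = np.random.default_rng(1)
narrowband = rng.uniform(0, 100, size=(500, 2))        # channel radiances
true_coeff = np.array([1.8, 0.7])
broadband = narrowband @ true_coeff + 5.0              # "ScaRaB" broadband flux

# Least-squares fit of the conversion coefficients (with intercept).
design = np.column_stack([narrowband, np.ones(len(narrowband))])
coeff, *_ = np.linalg.lstsq(design, broadband, rcond=None)

assert np.allclose(coeff, [1.8, 0.7, 5.0])
```

In practice the fitted coefficients would then be applied to AVHRR scenes where no broadband instrument is available, which is exactly the role the conversion plays in the first part of the work.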

  6. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.
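Spin quantitation of the kind the book covers rests on double integration of the first-derivative EPR spectrum, ratioed against a reference standard of known concentration; a sketch with synthetic Lorentzian-derivative lineshapes (all values illustrative):

```python
import numpy as np

def double_integral(field, derivative_signal):
    """First integration recovers the absorption line; second gives its area."""
    db = field[1] - field[0]
    absorption = np.cumsum(derivative_signal) * db
    return absorption.sum() * db

b = np.linspace(-50, 50, 4001)                       # field axis, arbitrary units

def deriv_lorentzian(b, amplitude, width=3.0):
    """Derivative of a Lorentzian absorption line (synthetic EPR signal)."""
    return amplitude * -2 * b / (b**2 + width**2) ** 2

ref = deriv_lorentzian(b, 1.0)                       # reference standard spectrum
sample = deriv_lorentzian(b, 2.5)                    # sample with 2.5x the spins

c_ref = 1.0e15                                       # known spins in the reference
c_sample = c_ref * double_integral(b, sample) / double_integral(b, ref)
assert abs(c_sample / c_ref - 2.5) < 0.01
```

The instrument-side factors the book discusses (resonator Q, B1, modulation amplitude, power saturation, sample positioning) all enter as corrections to this basic ratio before the two double integrals can be compared fairly.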

  7. Quantitative approaches in developmental biology.

    Science.gov (United States)

    Oates, Andrew C; Gorfinkiel, Nicole; González-Gaitán, Marcos; Heisenberg, Carl-Philipp

    2009-08-01

    The tissues of a developing embryo are simultaneously patterned, moved and differentiated according to an exchange of information between their constituent cells. We argue that these complex self-organizing phenomena can only be fully understood with quantitative mathematical frameworks that allow specific hypotheses to be formulated and tested. The quantitative and dynamic imaging of growing embryos at the molecular, cellular and tissue level is the key experimental advance required to achieve this interaction between theory and experiment. Here we describe how mathematical modelling has become an invaluable method to integrate quantitative biological information across temporal and spatial scales, serving to connect the activity of regulatory molecules with the morphological development of organisms.

  8. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  9. Quantitative vs qualitative research methods.

    Science.gov (United States)

    Lakshman, M; Sinha, L; Biswas, M; Charles, M; Arora, N K

    2000-05-01

    Quantitative methods have been widely used because of the fact that things that can be measured or counted gain scientific credibility over the unmeasurable. But the extent of biological abnormality, severity, consequences and the impact of illness cannot be satisfactorily captured and answered by the quantitative research alone. In such situations qualitative methods take a holistic perspective preserving the complexities of human behavior by addressing the "why" and "how" questions. In this paper an attempt has been made to highlight the strengths and weaknesses of both the methods and also that a balanced mix of both qualitative as well as quantitative methods yield the most valid and reliable results.

  10. Developing Geoscience Students' Quantitative Skills

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these sophisticated skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty with expertise in math education, developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g. problem solving, mastery of an important skill, modeling, relating theory to observation); 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g. motivate and engage students; use multiple representations; incorporate reflection, discussion and synthesis); 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because they thought about new ways to teach, and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the

  11. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
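The DXA-derived BMD mentioned above is conventionally reported as a T-score, the number of standard deviations from a young-adult reference mean; a minimal sketch (the reference mean and SD here are illustrative, not population values):

```python
# T-score = (measured BMD - young-adult reference mean) / reference SD.
def t_score(bmd, young_adult_mean=1.0, young_adult_sd=0.12):
    """BMD in g/cm^2; reference values are illustrative placeholders."""
    return (bmd - young_adult_mean) / young_adult_sd

# WHO operational definition: osteoporosis at T-score <= -2.5.
assert round(t_score(0.70), 2) == -2.5
assert t_score(0.70) <= -2.5
```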

  12. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  14. Quantitative Proteome Mapping of Nitrotyrosines

    Energy Technology Data Exchange (ETDEWEB)

    Bigelow, Diana J.; Qian, Weijun

    2008-02-10

    An essential first step in understanding disease and environmental perturbations is the early and quantitative detection of increased levels of the inflammatory marker nitrotyrosine, as compared with its endogenous levels within the tissue or cellular proteome. Thus, methods that successfully address proteome-wide quantitation of nitrotyrosine and related oxidative modifications can provide early biomarkers of risk and progression of disease, as well as effective strategies for therapy. Multidimensional LC separations coupled with tandem mass spectrometry (LC-MS/MS) have, in recent years, significantly expanded our knowledge of human (and mammalian model system) proteomes, including some nascent work in identification of post-translational modifications. In the following review, we discuss the application of LC-MS/MS for quantitation and identification of nitrotyrosine-modified proteins within the context of the complex protein mixtures presented in mammalian proteomes.

  15. Semi-Quantitative Group Testing

    CERN Document Server

    Emad, Amin

    2012-01-01

    We consider a novel group testing procedure, termed semi-quantitative group testing (SQGT), motivated by a class of problems arising in genome sequence processing. SQGT is a non-binary pooling scheme that may be viewed as a combination of an adder model followed by a quantizer. For the new testing scheme we define the capacity and evaluate it for some special choices of parameters using information-theoretic methods. We also define a new class of disjunct codes suitable for SQGT, termed SQ-disjunct codes, and provide both explicit and probabilistic code construction methods for SQGT with simple decoding algorithms.
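The adder-plus-quantizer structure of SQGT can be sketched in a few lines (the pooling design, defective set, and quantization bins are invented for illustration):

```python
import numpy as np

def sqgt_outcomes(pooling_matrix, defectives, bin_edges):
    """Adder model followed by a quantizer.

    pooling_matrix: tests x items (0/1); defectives: 0/1 item vector;
    the test reports only which quantization bin each pooled sum falls in.
    """
    sums = pooling_matrix @ defectives          # adder model
    return np.digitize(sums, bin_edges)         # quantizer

pools = np.array([[1, 1, 0, 0],
                  [0, 1, 1, 1],
                  [1, 0, 0, 1]])
x = np.array([0, 1, 1, 1])      # items 1, 2 and 3 are defective
edges = np.array([1, 3])        # outcome 0: sum 0; outcome 1: sum 1-2; outcome 2: sum >= 3

out = sqgt_outcomes(pools, x, edges)
assert out.tolist() == [1, 2, 1]
```

Classical binary group testing is the special case with a single bin edge at 1 (outcome: "empty" vs "at least one defective"); the full adder model is the limit of one bin per possible sum.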

  16. Quantitative two-qutrit entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Eltschka, Christopher [Institut fuer Theoretische Physik, Universitaet Regensburg, D-93040 Regensburg (Germany); Siewert, Jens [Departamento de Quimica Fisica, Universidad del Pais Vasco UPV/EHU, 48080 Bilbao (Spain); IKERBASQUE, Basque Foundation for Science, 48011 Bilbao (Spain)

    2013-07-01

    We introduce the new concept of axisymmetric bipartite states. For d x d-dimensional systems these states form a two-parameter family of nontrivial mixed states that includes the isotropic states. We present exact quantitative results for class-specific entanglement as well as for the negativity and I-concurrence of two-qutrit axisymmetric states. These results have interesting applications, for example as quantitative witnesses of class-specific entanglement in arbitrary two-qutrit states and as device-independent witnesses for the number of entangled dimensions.
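The negativity of the two-qutrit isotropic states mentioned above can be checked numerically with the standard partial-transpose construction (this is a textbook sketch, not code from the paper):

```python
import numpy as np

d = 3
# Maximally entangled two-qutrit state |Phi+> = (|00> + |11> + |22>)/sqrt(3).
phi = np.identity(d).reshape(d * d) / np.sqrt(d)
proj = np.outer(phi, phi)

def isotropic(p):
    """Isotropic state: mixture of |Phi+><Phi+| and the maximally mixed state."""
    return p * proj + (1 - p) * np.identity(d * d) / d**2

def negativity(rho):
    """Sum of |negative eigenvalues| of the partial transpose on subsystem B."""
    rho_pt = rho.reshape(d, d, d, d).transpose(0, 3, 2, 1).reshape(d * d, d * d)
    evals = np.linalg.eigvalsh(rho_pt)
    return -evals[evals < 0].sum()

assert negativity(isotropic(0.0)) < 1e-12            # maximally mixed: PPT
assert abs(negativity(isotropic(1.0)) - 1.0) < 1e-9  # |Phi+>: negativity (d-1)/2
```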

  17. When is Quantitative Easing effective?

    OpenAIRE

    Hoermann, Markus; Schabert, Andreas

    2011-01-01

    We present a simple macroeconomic model with open market operations that allows us to examine the effects of quantitative and credit easing. The central bank controls the policy rate, i.e. the price of money in open market operations, as well as the amount and the type of assets accepted as collateral for money. When the policy rate is sufficiently low, this set-up gives rise to an (il)liquidity premium on non-eligible assets. Then, a quantitative easing policy, which increases the size...

  18. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model-checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  19. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to representing relationships among quantities and provide examples of problems and their solutions.

  20. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  1. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  2. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  3. Quantitative risk assessment of CO

    NARCIS (Netherlands)

    Koornneef, J.; Spruijt, M.; Molag, M.; Ramírez, A.; Turkenburg, W.; Faaij, A.

    2010-01-01

    A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO2 pipelines is presented. Sources of uncertainties that have been assessed are: failure rates, pipeline pressure, temperat

  4. Can we quantitatively assess security?

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.

    2006-01-01

    This short note describes a number of methods for assessing security in a quantitative way. Next to describing five existing approaches (with no claim to completeness), a new assessment technique is proposed, which finds its roots in methods known from performability evaluation and stochastic mo

  5. La quantité en islandais moderne (Quantity in Modern Icelandic)

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied topics in the phonology of modern Icelandic. From the phonological point of view, it seems that little new can be expected, the theoretical possibilities having been practically exhausted, as we noted in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of recent research is undoubtedly the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is, however, still premature to speak of genuine quantitative zones, since neither their limits nor their geographical extent are yet known.

  6. Quantitative disease resistance and quantitative resistance Loci in breeding.

    Science.gov (United States)

    St Clair, Dina A

    2010-01-01

    Quantitative disease resistance (QDR) has been observed within many crop plants but is not as well understood as qualitative (monogenic) disease resistance and has not been used as extensively in breeding. Mapping quantitative trait loci (QTLs) is a powerful tool for genetic dissection of QDR. DNA markers tightly linked to quantitative resistance loci (QRLs) controlling QDR can be used for marker-assisted selection (MAS) to incorporate these valuable traits. QDR confers a reduction, rather than lack, of disease and has diverse biological and molecular bases as revealed by cloning of QRLs and identification of the candidate gene(s) underlying QRLs. Increasing our biological knowledge of QDR and QRLs will enhance understanding of how QDR differs from qualitative resistance and provide the necessary information to better deploy these resources in breeding. Application of MAS for QRLs in breeding for QDR to diverse pathogens is illustrated by examples from wheat, barley, common bean, tomato, and pepper. Strategies for optimum deployment of QRLs require research to understand effects of QDR on pathogen populations over time.

  7. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Abstract. Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole springtail (Ceratophysella denticulata) mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  8. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as for quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced, and their applicability to quantitative imaging of biomolecular (co-)localization and trafficking in live cells is described. The advantage of two-photon over one-photon excitation in relation to these techniques is discussed.

  9. Quantitative wave-particle duality

    Science.gov (United States)

    Qureshi, Tabish

    2016-07-01

    The complementary wave and particle character of quantum objects (or quantons) was pointed out by Niels Bohr. This wave-particle duality, in the context of the two-slit experiment, is here described not just as two extreme cases of wave and particle characteristics, but in terms of quantitative measures of these characteristics, known to follow a duality relation. A very simple and intuitive derivation of a closely related duality relation is presented, which should be understandable to the introductory student.
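
    A widely used quantitative form of this duality (our own numerical illustration; the article derives a closely related relation) is P^2 + V^2 <= 1, with equality for pure states, where P is the path predictability and V the interference fringe visibility.

```python
# For a pure two-path state with real amplitudes a and b, the path
# predictability P = | |a|^2 - |b|^2 | and the fringe visibility
# V = 2|a||b| saturate the duality relation: P^2 + V^2 = 1.
import math

def predictability_and_visibility(a, b):
    norm = math.hypot(a, b)
    a, b = a / norm, b / norm          # normalize the state
    P = abs(a * a - b * b)             # which-path information
    V = 2 * abs(a) * abs(b)            # fringe visibility
    return P, V

for a, b in [(1.0, 0.0), (1.0, 1.0), (0.8, 0.6)]:
    P, V = predictability_and_visibility(a, b)
    print(f"P={P:.3f} V={V:.3f} P^2+V^2={P*P + V*V:.3f}")
```

    The two extreme cases correspond to full which-path knowledge (P=1, V=0) and perfect fringes (P=0, V=1); intermediate amplitudes give partial amounts of both.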

  10. Quantitative spectroscopy of hot stars

    Science.gov (United States)

    Kudritzki, R. P.; Hummer, D. G.

    1990-01-01

    A review of the quantitative spectroscopy (QS) of hot stars is presented, with particular attention given to the study of photospheres, optically thin winds, unified model atmospheres, and stars with optically thick winds. It is concluded that the results presented here demonstrate the reliability of QS as a unique source of accurate values of the global parameters (effective temperature, surface gravity, and elemental abundances) of hot stars.

  11. Quantitative measures for redox signaling.

    Science.gov (United States)

    Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M

    2016-07-01

    Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes, including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we outline some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges in implementing these methods.
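
    The role of kinetics in conferring specificity can be illustrated with a toy competition calculation (our own sketch, not from the review; the rate constants and concentrations are made-up round numbers): the fraction of an H2O2 signal captured by each cellular sink is its rate constant times its concentration, relative to all competing sinks.

```python
# Kinetic competition for a diffusible H2O2 signal: each sink's share of
# the flux is k*[sink] normalized by the sum over all sinks.

def signal_fractions(sinks):
    """sinks: {name: (rate_constant_per_M_per_s, concentration_M)}"""
    fluxes = {name: k * c for name, (k, c) in sinks.items()}
    total = sum(fluxes.values())
    return {name: f / total for name, f in fluxes.items()}

sinks = {
    "peroxiredoxin": (1e7, 20e-6),   # fast, abundant sink (illustrative)
    "GAPDH":         (1e2, 10e-6),   # slow direct oxidation (illustrative)
}
for name, frac in signal_fractions(sinks).items():
    print(f"{name}: {frac:.2e}")
```

    Even with these rough numbers, the fast sink intercepts essentially all of the signal, which is why direct oxidation of slow-reacting targets requires relay or localization mechanisms.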

  12. Quantitative characterisation of sedimentary grains

    Science.gov (United States)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin-section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin-section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this approach, theoretical examples were analysed, producing high-quality results that support the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments in the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till in County Cork, Ireland and aeolian sediments from Rajasthan, India were also collected and analysed. A graphical summary of the data is presented, allowing quantitative distinction between samples extracted from different sedimentary environments.
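
    One of the shape parameters mentioned above can be sketched directly from a traced boundary. This is our own minimal illustration, not the authors' Mathematica package: circularity as 4*pi*area / perimeter^2, which equals 1 for a perfect circle and decreases for irregular grains.

```python
# Circularity of a grain from its traced boundary polygon:
# area via the shoelace formula, perimeter via edge lengths.
import math

def circularity(points):
    """points: closed polygon [(x, y), ...] tracing the grain boundary."""
    n = len(points)
    area = 0.0
    perimeter = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1                  # shoelace term
        perimeter += math.hypot(x2 - x1, y2 - y1)  # edge length
    area = abs(area) / 2.0
    return 4 * math.pi * area / perimeter ** 2

# A regular 64-gon approximates a circle, so circularity is close to 1
poly = [(math.cos(2 * math.pi * k / 64), math.sin(2 * math.pi * k / 64))
        for k in range(64)]
print(round(circularity(poly), 3))
```

    The same boundary trace also supports the other listed parameters (roundness, angularity, fractal dimension), each of which is a different functional of the boundary coordinates.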

  13. Quantitative analysis of glycated proteins.

    Science.gov (United States)

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
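
    The doublet spacing described above follows from simple arithmetic (our own back-of-envelope sketch, not the published protocol): each in vitro [13C6]-glucose label adds 6 Da relative to the in vivo [12C6] form, and the observed m/z spacing is that mass difference scaled by the number of glycation sites and divided by the charge.

```python
# m/z spacing between the light ([12C6]) and heavy ([13C6]) members of a
# glycated-peptide doublet: 6 Da per labeled site, divided by the charge.

LABEL_MASS_SHIFT_DA = 6.0   # 13C6- vs 12C6-glucose per glycation site

def doublet_spacing_mz(n_sites, charge):
    return LABEL_MASS_SHIFT_DA * n_sites / charge

for n_sites, charge in [(1, 1), (1, 2), (1, 3)]:
    print(f"{n_sites} site(s), {charge}+ : "
          f"{doublet_spacing_mz(n_sites, charge):g} Da")
```

    This reproduces the +6, +3 and +2 Da spacings quoted in the abstract for singly glycated peptides at charge states 1+ to 3+.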

  14. Non-manipulation quantitative designs.

    Science.gov (United States)

    Rumrill, Phillip D

    2004-01-01

    The article describes non-manipulation quantitative designs of two types: correlational and causal-comparative studies. Both designs are characterized by the absence of random assignment of research participants to conditions or groups and by non-manipulation of the independent variable. Without random selection or manipulation of the independent variable, no attempt is made to draw causal inferences about relationships between independent and dependent variables. Nonetheless, non-manipulation studies play an important role in rehabilitation research, as described in this article. Examples from the contemporary rehabilitation literature are included.

  15. Quantitative relationships in delphinid neocortex

    DEFF Research Database (Denmark)

    Mortensen, Heidi S.; Pakkenberg, Bente; Dam, Maria

    2014-01-01

    total number of brain cells in cetaceans, and even fewer have used unbiased counting methods. In this study, using stereological methods, we estimated the total number of cells in the neocortex of the long-finned pilot whale (Globicephala melas) brain. For the first time, we show that a species...... density in long-finned pilot whales is lower than that in humans, their higher cell number appears to be due to their larger brain. Accordingly, our findings make an important contribution to the ongoing debate over quantitative relationships in the mammalian brain....

  16. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  17. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...... will be on different quantitation strategies. Methods for metabolic labeling, chemical modification and label-free quantitation and their applicability or inapplicability in phosphoproteomic studies are discussed....

  18. Quantitative imaging with fluorescent biosensors.

    Science.gov (United States)

    Okumoto, Sakiko; Jones, Alexander; Frommer, Wolf B

    2012-01-01

    Molecular activities are highly dynamic and can occur locally in subcellular domains or compartments. Neighboring cells in the same tissue can exist in different states. Therefore, quantitative information on the cellular and subcellular dynamics of ions, signaling molecules, and metabolites is critical for functional understanding of organisms. Mass spectrometry is generally used for monitoring ions and metabolites; however, its temporal and spatial resolution are limited. Fluorescent proteins have revolutionized many areas of biology-e.g., fluorescent proteins can report on gene expression or protein localization in real time-yet promoter-based reporters are often slow to report physiologically relevant changes such as calcium oscillations. Therefore, novel tools are required that can be deployed in specific cells and targeted to subcellular compartments in order to quantify target molecule dynamics directly. We require tools that can measure enzyme activities, protein dynamics, and biophysical processes (e.g., membrane potential or molecular tension) with subcellular resolution. Today, we have an extensive suite of tools at our disposal to address these challenges, including translocation sensors, fluorescence-intensity sensors, and Förster resonance energy transfer sensors. This review summarizes sensor design principles, provides a database of sensors for more than 70 different analytes/processes, and gives examples of applications in quantitative live cell imaging.
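
    As a concrete example of how such sensors yield quantitative data, the emission ratio of a ratiometric FRET sensor can be converted into an analyte concentration with the standard single-site binding calibration. This is our own sketch; the Kd and ratio limits below are made-up illustrative numbers, not values for any real sensor.

```python
# Single-site calibration for a ratiometric FRET biosensor:
#   [analyte] = Kd * (R - Rmin) / (Rmax - R)
# where R is the measured emission ratio and Rmin/Rmax are the ratios of
# the fully unbound and fully bound sensor.

def ratio_to_concentration(R, Rmin, Rmax, Kd):
    return Kd * (R - Rmin) / (Rmax - R)

# hypothetical calcium sensor: Kd = 250 nM, ratio limits 0.5 and 2.5
print(ratio_to_concentration(R=1.5, Rmin=0.5, Rmax=2.5, Kd=250e-9))  # → 2.5e-07
```

    A midpoint ratio thus reports a concentration equal to the Kd, which is why sensors are engineered so that their Kd matches the physiological range of the target analyte.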

  19. Digital radiography: a quantitative approach

    Energy Technology Data Exchange (ETDEWEB)

    Retraint, F. [Universite de Technologie de Troyes, Troyes (France)

    2004-07-01

    In a radiograph, the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, a quantitative thickness map of the object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process introduces biases that prevent obtaining this quantitative information: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In the first section, we propose a complete model of the radiographic image formation process that accounts for these biases. In the second section, we present an inversion scheme of this model for a single-material object, which yields the thickness map of the object crossed by the x-rays. (author)
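
    The offset-and-gain idea can be illustrated with a per-pixel toy calculation (our own sketch, not the paper's full inversion scheme): assuming a locally linear detector response pixel = gain * thickness + offset, two radiographs of reference steps of known thickness calibrate each pixel, after which any measured value maps back to material thickness.

```python
# Per-pixel thickness calibration from two reference exposures of
# known material thickness (locally linear detector assumption).

def calibrate(pixel_ref1, pixel_ref2, t1, t2):
    """Gain and offset from reference thicknesses t1 and t2."""
    gain = (pixel_ref2 - pixel_ref1) / (t2 - t1)
    offset = pixel_ref1 - gain * t1
    return gain, offset

def thickness(pixel, gain, offset):
    return (pixel - offset) / gain

# illustrative numbers: thicker material -> darker pixel (negative gain)
gain, offset = calibrate(pixel_ref1=200.0, pixel_ref2=120.0, t1=1.0, t2=5.0)
print(thickness(160.0, gain, offset))  # halfway between references -> 3.0
```

    In the real system the biases listed above (beam hardening, scatter, screen response) make the response nonlinear, which is precisely why the paper builds and inverts a complete image-formation model instead.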

  20. Quantitative ultrasonic phased array imaging

    Science.gov (United States)

    Engle, Brady J.; Schmerr, Lester W., Jr.; Sedov, Alexander

    2014-02-01

    When imaging with ultrasonic phased arrays, what do we actually image? What quantitative information is contained in the image? Ad-hoc delay-and-sum methods such as the synthetic aperture focusing technique (SAFT) and the total focusing method (TFM) fail to answer these questions. We have shown that a new quantitative approach allows the formation of flaw images by explicitly inverting the Thompson-Gray measurement model. To examine the above questions, we have set up a software simulation test bed that considers a 2-D scalar scattering problem of a cylindrical inclusion with the method of separation of variables. It is shown that in SAFT types of imaging the only part of the flaw properly imaged is the front surface specular response of the flaw. Other responses (back surface reflections, creeping waves, etc.) are improperly imaged and form artifacts in the image. In the case of TFM-like imaging the quantity being properly imaged is an angular integration of the front surface reflectivity. The other, improperly imaged responses are also averaged, leading to a reduction in some of the artifacts present. Our results have strong implications for flaw sizing and flaw characterization with delay-and-sum images.

  1. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to quantify the degree of human face symmetry for images taken from the Internet. From the original image of a person, which appears in the center of each triplet, 2 symmetric combinations were constructed: one based on the left part of the image and its mirror image (left-left) and one based on the right part of the image and its mirror image (right-right). Using computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; and the area and perimeter of the pupils. For each of these measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the right-right and left-left combinations, was calculated; C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
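
    The abstract does not give the exact formula for C, but one plausible asymmetry index of this kind (our own hypothetical sketch) compares a measurement taken on the left-left composite with the same measurement on the right-right composite; the larger the relative difference, the lower the facial symmetry.

```python
# Hypothetical asymmetry index: relative difference between the same
# measurement (e.g. mouth area) on the left-left and right-right
# composite images. 0 means perfectly symmetric.

def asymmetry_index(measure_left_left, measure_right_right):
    mean = (measure_left_left + measure_right_right) / 2.0
    return abs(measure_left_left - measure_right_right) / mean

# e.g. mouth area in arbitrary units measured on the two composites
print(round(asymmetry_index(10.4, 9.6), 2))  # → 0.08
```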

  2. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  3. Applications of microfluidics in quantitative biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2017-10-04

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and to collect data in high-throughput, quantitative manners. In this review, we present the relevant applications of microfluidics to quantitative biology in two major categories (channel-based microfluidics and droplet-based microfluidics) and describe their typical features. We also envision other microfluidic techniques that are not yet employed in quantitative biology but have great potential in the near future.

  4. Towards Quantitative Ocean Precipitation Validation

    Science.gov (United States)

    Klepp, C.; Bakan, S.; Andersson, A.

    2009-04-01

    A thorough knowledge of global ocean precipitation is an indispensable prerequisite for understanding and successfully modelling the global climate system, as precipitation is an important component of the water cycle. However, reliable quantitative detection of precipitation over the global oceans, especially at high latitudes during the cold season, remains a challenging task for remote sensing and model-based estimates. Quantitative ship-based validation data from reliable instruments for measuring rain and snowfall are scarce but in high demand for ground validation of such products. The satellite-based HOAPS (Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite Data) climatology contains fields of precipitation, evaporation and the resulting freshwater flux, along with 12 additional atmospheric parameters, over the global ice-free ocean between 1987 and 2005. Except for the NOAA Pathfinder SST, all basic state variables are calculated from SSM/I passive microwave radiometer measurements. HOAPS contains three main data subsets that originate from one common pixel-level data source. Gridded 0.5-degree monthly, pentad and twice-daily data products are freely available from www.hoaps.org. Especially for North Atlantic mid-latitude mixed-phase precipitation, the HOAPS precipitation retrieval has been investigated in some depth. This analysis revealed that the HOAPS retrieval qualitatively well represents cyclonic and intense mesoscale precipitation, in agreement with ship observations and CloudSat data, while GPCP, ECMWF forecast, ERA-40 and regional model data substantially miss mesoscale precipitation. As the differences between the investigated data sets are already large under mixed-phase precipitation conditions, further work is being carried out on snowfall validation during the cold season at high latitudes. A Norwegian Sea field campaign in winter 2005 was carried out using an optical disdrometer capable of measuring quantitative amounts of snowfall over the ocean

  5. Quantitative patterns in drone wars

    Science.gov (United States)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  6. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use, in order to make that memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and in particular analyzed Boehm's C code, a real-time mark-sweep GC running under Linux and conforming to the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. The reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory management techniques, even for low-level languages. It is more a trade-off for a given system than an all-or-nothing proposition.
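
    Since Boehm's collector is a mark-sweep GC, the core algorithm can be sketched in a few lines. This is a toy illustration of mark-sweep in general, far simpler than the real conservative collector (which scans the stack and heap for anything that looks like a pointer).

```python
# Toy mark-sweep collection: objects form a reference graph, and anything
# unreachable from the root set is swept.

class Obj:
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references to other objects
        self.marked = False

def mark(obj):
    """Mark phase: depth-first traversal from a root."""
    if obj.marked:
        return
    obj.marked = True
    for child in obj.refs:
        mark(child)

def collect(heap, roots):
    for obj in heap:
        obj.marked = False
    for root in roots:                       # mark phase
        mark(root)
    return [o for o in heap if o.marked]     # sweep phase keeps survivors

a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)                             # a -> b; c is unreachable
live = collect([a, b, c], roots=[a])
print(sorted(o.name for o in live))          # → ['a', 'b']
```

    The configurations measured in the paper (incremental marking, thread support, blacklisting) all vary how and when these two phases run, not the underlying reachability principle.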

  7. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics.

  8. Qualitative and Quantitative Sentiment Proxies

    DEFF Research Database (Denmark)

    Zhao, Zeyan; Ahmad, Khurshid

    2015-01-01

    Sentiment analysis is a content-analytic investigative framework for researchers, traders and the general public involved in financial markets. This analysis is based on carefully sourced and elaborately constructed proxies for market sentiment and has emerged as a basis for analysing movements...... in stock prices and the associated traded volume. This approach is particularly helpful just before and after the onset of market volatility. We use an autoregressive framework for predicting the overall changes in stock prices, using investor sentiment together with lagged variables of prices...... and trading volumes. Our case study is a small market index (the Danish Stock Exchange index OMXC 20), together with prevailing sentiment in Denmark, used to evaluate the impact of sentiment on OMXC 20. Furthermore, we introduce a rather novel quantitative sentiment proxy, that is the use of the index...

  9. Quantitative photoacoustic elastography in humans

    Science.gov (United States)

    Hai, Pengfei; Zhou, Yong; Gong, Lei; Wang, Lihong V.

    2016-06-01

    We report quantitative photoacoustic elastography (QPAE) capable of measuring Young's modulus of biological tissue in vivo in humans. By combining conventional PAE with a stress sensor having known stress-strain behavior, QPAE can simultaneously measure strain and stress, from which Young's modulus is calculated. We first demonstrate the feasibility of QPAE in agar phantoms with different concentrations. The measured Young's modulus values fit well with both the empirical expectation based on the agar concentrations and those measured in an independent standard compression test. Next, QPAE was applied to quantify the Young's modulus of skeletal muscle in vivo in humans, showing a linear relationship between muscle stiffness and loading. The results demonstrated the capability of QPAE to assess the absolute elasticity of biological tissue noninvasively in vivo in humans, indicating its potential for tissue biomechanics studies and clinical applications.
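
    The quantity QPAE recovers reduces to a simple ratio once both stress and strain are measured (our own illustration of the underlying relation, with made-up numbers): in the linear-elastic regime, Young's modulus is applied stress divided by the resulting strain.

```python
# Young's modulus from simultaneous stress and strain measurements,
# as in quantitative elastography: E = stress / strain.

def youngs_modulus(stress_pa, strain):
    """stress in Pa; strain is the dimensionless compression dL/L."""
    return stress_pa / strain

# e.g. 2 kPa of applied stress producing 1% compressive strain
print(youngs_modulus(2_000.0, 0.01))  # ≈ 2e5 Pa, soft-tissue range
```

    Conventional PAE measures only the strain side of this ratio, which is why adding a stress sensor with known stress-strain behavior is what makes the method quantitative.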

  10. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  11. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training must be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  12. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.;

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...... characterisation methods, such as fractals, wavelets, change trees and others, including for each method a short review, the parameters that the new methods calculate, and applications of the methods to solve surface problems. The paper contains a discussion on the relevance of the different parameters...... and quantification methods in terms of functional correlations, and it addresses the need for reducing the large number of existing parameters. The review considers the present situation and gives suggestions for future activities....

  13. Quantitative evaluation of dermatological antiseptics.

    Science.gov (United States)

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus.
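
    The EN 1276 pass criterion used above is arithmetic on colony counts: a reduction of at least 5 log10 between the initial and surviving bacterial counts within the contact time. A small sketch with illustrative counts (not the study's data):

```python
import math

# Log10 reduction between initial and surviving bacterial counts (CFU/mL),
# and the EN 1276 pass/fail criterion of >= 5 log10 within the contact time.
# Counts below are illustrative, not the study's measurements.

def log10_reduction(initial_cfu, surviving_cfu):
    return math.log10(initial_cfu / surviving_cfu)

def passes_en1276(initial_cfu, surviving_cfu, required=5.0):
    return log10_reduction(initial_cfu, surviving_cfu) >= required

print(log10_reduction(1e8, 1e3))  # 5.0 -> exactly meets the standard
print(passes_en1276(1e8, 1e3))    # True
print(passes_en1276(1e8, 1e5))    # False: only a 3 log10 reduction
```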

  14. Quantitative measurements of inventory control.

    Science.gov (United States)

    Noel, M W

    1984-11-01

    The use of quantitative measurements for improving inventory management efficiency in hospital pharmacy is reviewed. Proper management of the pharmacy inventory affects the financial operation of the entire hospital. Problems associated with maintaining inadequate or excessive inventory investment are discussed, and the use of inventory valuation and turnover rate for assessing inventory control efficiency is described. Frequency of order placement has an important effect on inventory turnover, carrying costs, and ordering costs. Use of the ABC system of inventory classification for identifying products constituting the majority of inventory dollar investment is outlined, and the economic order value concept is explained. With increasing regulations aimed at controlling hospital costs, pharmacy managers must seek every possible means to improve efficiency. Reducing the amount of money obligated to inventory can substantially improve the financial position of the hospital without requiring a reduction in personnel or quality of service.
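
    Two of the measures named above, inventory turnover and the economic order quantity, are simple formulas; a sketch with made-up pharmacy figures (not from the article):

```python
import math

# Standard inventory-control metrics; all figures are illustrative.

def turnover_rate(annual_cogs, avg_inventory_value):
    """Inventory turnover: how many times per year the stock is sold through."""
    return annual_cogs / avg_inventory_value

def economic_order_quantity(annual_demand, order_cost, holding_cost_per_unit):
    """Classic EOQ: the order size that minimizes ordering plus carrying costs."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

print(turnover_rate(1_200_000, 150_000))       # 8.0 turns per year
print(economic_order_quantity(10_000, 50, 4))  # 500.0 units per order
```

    A low turnover with a high inventory value flags exactly the excessive investment the abstract warns about.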

  15. GPC and quantitative phase imaging

    Science.gov (United States)

    Palima, Darwin; Bañas, Andrew Rafael; Villangca, Mark Jayson; Glückstad, Jesper

    2016-03-01

    Generalized Phase Contrast (GPC) is a light efficient method for generating speckle-free contiguous optical distributions using binary-only or analog phase levels. It has been used in applications such as optical trapping and manipulation, active microscopy, structured illumination, optical security, parallel laser marking and labelling and recently in contemporary biophotonics applications such as for adaptive and parallel two-photon optogenetics and neurophotonics. We will present our most recent GPC developments geared towards these applications. We first show a very compact static light shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging (QPI).

  16. Quantitative computation of RHEED patterns

    Science.gov (United States)

    Lordi, Scott Andrew

    This thesis is concerned with the general problem of performing quantitative RHEED computations for both flat and rough surfaces using the multislice method. Modifications to the RHEED multislice method which make it into a practical technique for performing RHEED computations are described. Computation of convergent-beam RHEED patterns using the RHEED multislice method is demonstrated by application to the case of MgO (001). Computed patterns are compared (based on the overall pattern geometry) to energy-filtered Tanaka and convergent-beam patterns recorded in a transmission electron microscope. The validity of the RHEED multislice method for convergent-beam computations is demonstrated by the level of agreement achieved. The application of the RHEED multislice method combined with the edge patching algorithm to the computation of RHEED streaks from rough surfaces is demonstrated by applying it to the case of rough Fe (001) surfaces. The computations are done using the column approximation and by neglecting the scattering from steps parallel to the incident beam. The computations are set up using STM images of the surfaces from which the experimental RHEED patterns were recorded. The shapes of the diffuse parts of the computed and experimental streaks agree well. There is a discrepancy between experiment and theory in the magnitudes of the flat surface spot position peaks relative to the diffuse parts of the streaks. The shapes of the diffuse parts of the computed streaks are shown to be insensitive to the computational and potential parameters. The magnitudes of the flat surface spot position peaks are at least weakly dependent on the potential parameters and the long range height variations of the surface. This agreement conclusively demonstrates that the RHEED multislice method can be used to perform quantitative computations of RHEED streaks from real rough surfaces. An approximation method (the patchwork approximation) for doing RHEED computations, exact in

  17. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from "Alexandru Ioan Cuza" University of Iaşi (Romania), "G. d'Annunzio" University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections. The first deals with recent trends in social decisions, aiming to identify their driving forces. The second focuses on the social and public sphere, covering recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  18. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  19. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear, possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel...... with the foot ultrasound scanner reduced precision errors by half (p < 0.05)...... quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology....

  20. Advances in quantitative Kerr microscopy

    Science.gov (United States)

    Soldatov, I. V.; Schäfer, R.

    2017-01-01

    An advanced wide-field Kerr microscopy approach to the vector imaging of magnetic domains is demonstrated. Utilizing light from eight monochrome light-emitting diodes, guided to the microscope by glass fibers and switched in synchronization with the camera exposure, domain images with orthogonal in-plane sensitivity are obtained simultaneously in real time. After calibrating the Kerr contrast under the same orthogonal sensitivity conditions, the magnetization vector field over complete magnetization cycles along the hysteresis loop can be calculated and plotted as a color-coded or vector image. In pulsed mode, parasitic magnetic-field-dependent Faraday rotations in the microscope optics are also eliminated, increasing the accuracy of the measured magnetization angles to better than 5°. The method is applied to the investigation of the magnetization process in a patterned Permalloy film element. Furthermore, it is shown that the effective magnetic anisotropy axes in a GaMnAs semiconducting film can be quantitatively measured by vectorial analysis of the domain structure.

  1. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos eDiuk

    2012-09-01

    Full Text Available The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.
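
    The core ingredient, a statistical measure of semantic similarity between texts, can be illustrated with a bag-of-words cosine similarity. This is a generic stand-in, not necessarily the specific metric the study used, and the snippets are illustrative:

```python
import math
from collections import Counter

# Bag-of-words cosine similarity between two texts: 1.0 for identical word
# distributions, 0.0 for no shared vocabulary.

def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

print(cosine_similarity("know thyself", "know thy heart"))  # partial overlap
print(cosine_similarity("abc", "xyz"))                      # 0.0, disjoint
```

    Tracking such a score between introspection-related seed texts and a dated corpus is one way to obtain the kind of time series the abstract describes.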

  2. Quantitative relationships in delphinid neocortex

    Directory of Open Access Journals (Sweden)

    Heidi S Mortensen

    2014-11-01

    Full Text Available Possessing large brains and complex behavioural patterns, cetaceans are believed to be highly intelligent. Their brains, which are the largest in the Animal Kingdom and have enormous gyrification compared with terrestrial mammals, have long been of scientific interest. Few studies, however, report total number of brain cells in cetaceans, and even fewer have used unbiased counting methods. In this study, using stereological methods, we estimated the total number of cells in the long-finned pilot whale (Globicephala melas) brain. For the first time, we show that a species of dolphin has more neocortical neurons than any mammal studied to date, including humans. These cell numbers are compared across various mammals with different brain sizes, and the function of possessing many neurons is discussed. We found that the long-finned pilot whale neocortex has approximately 37.2 × 10⁹ neurons, which is almost twice as many as humans, and 127 × 10⁹ glial cells. Thus, the absolute number of neurons in the human neocortex is not correlated with the superior cognitive abilities of humans (at least compared to cetaceans) as has previously been hypothesized. However, as neuron density in long-finned pilot whales is lower than that in humans, their higher cell number appears to be due to their larger brain. Accordingly, our findings make an important contribution to the ongoing debate over quantitative relationships in the mammalian brain.
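
    The "almost twice as many as humans" claim is a quick ratio check. The human figure below is a commonly cited stereological estimate of roughly 19-21 billion neocortical neurons, assumed here for illustration rather than taken from this paper:

```python
# Ratio of the quoted pilot-whale neocortical neuron count to an assumed
# human estimate (~19e9, a commonly cited stereological figure).

pilot_whale_neurons = 37.2e9   # from the abstract
human_neurons = 19.0e9         # assumed human estimate, for illustration

ratio = pilot_whale_neurons / human_neurons
print(round(ratio, 2))  # 1.96, i.e. "almost twice as many"
```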

  3. Quantitative ultrasound in cancer imaging.

    Science.gov (United States)

    Feleppa, Ernest J; Mamou, Jonathan; Porter, Christopher R; Machi, Junji

    2011-02-01

    Ultrasound is a relatively inexpensive, portable, and versatile imaging modality that has a broad range of clinical uses. It incorporates many imaging modes, such as conventional gray-scale "B-mode" imaging to display echo amplitude in a scanned plane; M-mode imaging to track motion at a given fixed location over time; duplex, color, and power Doppler imaging to display motion in a scanned plane; harmonic imaging to display nonlinear responses to incident ultrasound; elastographic imaging to display relative tissue stiffness; and contrast-agent imaging with simple contrast agents to display blood-filled spaces or with targeted agents to display specific agent-binding tissue types. These imaging modes have been well described in the scientific, engineering, and clinical literature. A less well-known ultrasonic imaging technology is based on quantitative ultrasound (QUS), which analyzes the distribution of power as a function of frequency in the original received echo signals from tissue and exploits the resulting spectral parameters to characterize and distinguish among tissues. This article discusses the attributes of QUS-based methods for imaging cancers and providing improved means of detecting and assessing tumors. The discussion will include applications to imaging primary prostate cancer and metastatic cancer in lymph nodes to illustrate the methods. Copyright © 2011 Elsevier Inc. All rights reserved.
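
    The spectral parameters QUS extracts are classically the slope and midband value of a straight-line fit to the calibrated echo power spectrum (in dB) versus frequency. A toy least-squares fit on synthetic, perfectly linear data (real spectra are noisy and must first be calibrated against a reference):

```python
# Least-squares line fit to a synthetic calibrated power spectrum, yielding
# the spectral slope (dB/MHz) and the midband value (dB). Data is fabricated.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # slope, intercept

freqs_mhz = [4.0, 5.0, 6.0, 7.0, 8.0]
spectrum_db = [-40.0, -42.0, -44.0, -46.0, -48.0]  # synthetic spectrum

slope, intercept = linear_fit(freqs_mhz, spectrum_db)
midband = slope * 6.0 + intercept  # fitted value at the 6 MHz band centre
print(slope, midband)  # -2.0 dB/MHz, -44.0 dB
```

    Maps of such parameters over a scanned region are what allows QUS to distinguish tissue types, as the abstract describes for prostate and lymph-node imaging.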

  4. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative risk analysis of submarine pipeline routing is provided, developing the study from qualitative to quantitative analysis. The characteristics of the potential risk of the submarine pipeline system were considered, and grey-mode identification theory was used. The study process was composed of three parts: establishing the index system for routing risk quantitative analysis, establishing the grey-mode identification model for routing risk quantitative analysis, and establishing the standard for mode identification results. A computational example shows that the model directly and concisely reflects the hazard level of a routing and supports future route selection.

  5. Pitfalls of Quantitative Surveys Online

    Directory of Open Access Journals (Sweden)

    Iva Pecáková

    2016-12-01

    Full Text Available With the development of the Internet in the last two decades, its use in all phases of field surveys has grown very quickly. Indeed, it reduces costs while allowing exploration of relatively large samples and enables effective use of a variety of research tools. Academic research is more reserved towards online surveys, mainly because of demands on data quality that Internet surveys often fail to meet, preventing objective conclusions about the populations surveyed. Unqualified use of the Internet may significantly influence data and the information obtained from their analysis. A problematic definition of the population under investigation may result in coverage error. Its existence can be shown, for example, by comparing the total and Internet populations of the Czech Republic, or the total and Internet populations of Czech households. Representation of the population through an online panel may cause bias, depending on how the panel is created. A relatively new source of error in online surveys is the existence of "professional" respondents. The sampling method from a population or an online panel can lead to a sample that is not representative and does not allow inference to the population at all, or only in a very limited way. Even probability sampling, however, can be problematic if it is affected by a higher rate of non-response. The aim of this paper is to summarise the possible sources of bias associated with any sample survey, but also to draw attention to those that are relatively new and are associated with the implementation of quantitative surveys online.

  6. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  7. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    Science.gov (United States)

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
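
    The echo-intensity (EI) measure described, computer-assisted grayscale analysis, is at its core the mean pixel value (0-255) inside a region of interest. A toy version on a fabricated image patch (not study data):

```python
# Mean grayscale value over a 2-D region of interest, the basic quantity
# behind echo-intensity measurements. Pixel values are fabricated.

def echo_intensity(roi):
    """Mean grayscale over a region of interest given as a list of pixel rows."""
    pixels = [p for row in roi for p in row]
    return sum(pixels) / len(pixels)

roi = [
    [100, 110, 120],
    [130, 140, 150],
]
print(echo_intensity(roi))  # 125.0
```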

  8. Quantitative Relationships Involving Additive Differences: Numerical Resilience

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2014-01-01

    This case study describes the ways in which problems involving additive differences with unknown starting quantities, constrain the problem solver in articulating the inherent quantitative relationship. It gives empirical evidence to show how numerical reasoning takes over as a Grade 6 student instantiates the quantitative relation by resorting to…

  9. Development and Measurement of Preschoolers' Quantitative Knowledge

    Science.gov (United States)

    Geary, David C.

    2015-01-01

    The collection of studies in this special issue make an important contribution to our understanding and measurement of the core cognitive and noncognitive factors that influence children's emerging quantitative competencies. The studies also illustrate how the field has matured, from a time when the quantitative competencies of infants and young…

  10. Quantitative analysis of saccadic search strategy

    NARCIS (Netherlands)

    Over, E.A.B.

    2007-01-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movements.

  11. Evaluating quantitative research designs: Part 1.

    Science.gov (United States)

    Haughey, B P

    1994-10-01

    This article has provided an overview of the three major types of quantitative designs commonly used in nursing research, as well as some criteria for evaluating the designs of published research. The next column will include additional criteria for critiquing quantitative research designs.

  12. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  14. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  15. Quantitative Robust Control Engineering: Theory and Applications

    Science.gov (United States)

    2006-09-01

    1992). Discrete quantitative feedback technique, Chapter 16 in the book: Digital Control Systems: Theory, Hardware, Software, 2nd edition. McGraw...Rasmussen S.J., Garcia-Sanz, M. (2001, 2005), design software accompanying the book Quantitative Feedback Theory: Fundamentals and Applications, 2nd edition. CRC Press

  16. Development of Quantitative electron nano-diffraction

    NARCIS (Netherlands)

    Kumar, V.

    2009-01-01

    This thesis is a step towards development of quantitative parallel beam electron nano-diffraction (PBED). It is focused on the superstructure determination of zig-zag and zig-zig NaxCoO2 and analysis of charge distribution in the two polymorphs Nb12O29 using PBED. It has been shown that quantitative

  17. Challenges and perspectives in quantitative NMR.

    Science.gov (United States)

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative...... geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships......) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...

  19. Quantitative and qualitative research: beyond the debate.

    Science.gov (United States)

    Gelo, Omar; Braakmann, Diana; Benetka, Gerhard

    2008-09-01

    Psychology has been a highly quantitative field since its conception as a science. However, a qualitative approach to psychological research has gained increasing importance in the last decades, and an enduring debate between quantitative and qualitative approaches has arisen. The recently developed Mixed Methods Research (MMR) addresses this debate by aiming to integrate quantitative and qualitative approaches. This article outlines and discusses quantitative, qualitative and mixed methods research approaches with specific reference to their (1) philosophical foundations (i.e. basic sets of beliefs that ground inquiry), (2) methodological assumptions (i.e. principles and formal conditions which guide scientific investigation), and (3) research methods (i.e. concrete procedures for data collection, analysis and interpretation). We conclude that MMR may reasonably overcome the limitation of purely quantitative and purely qualitative approaches at each of these levels, providing a fruitful context for a more comprehensive psychological research.

  20. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome

    Science.gov (United States)

    2016-01-01

    Objective To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Methods Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Results Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). Conclusion These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US. PMID:28119835

  1. Interpretation of Quantitative Shotgun Proteomic Data.

    Science.gov (United States)

    Aasebø, Elise; Berven, Frode S; Selheim, Frode; Barsnes, Harald; Vaudel, Marc

    2016-01-01

    In quantitative proteomics, large lists of identified and quantified proteins are used to answer biological questions in a systemic approach. However, working with such extensive datasets can be challenging, especially when complex experimental designs are involved. Here, we demonstrate how to post-process large quantitative datasets, detect proteins of interest, and annotate the data with biological knowledge. The protocol presented can be achieved without advanced computational knowledge thanks to the user-friendly Perseus interface (available from the MaxQuant website, www.maxquant.org). Various visualization techniques facilitating the interpretation of quantitative results in complex biological systems are also highlighted.
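    The kind of post-processing described here (filtering and flagging quantified proteins) can be illustrated outside Perseus as well. A hedged sketch in Python of one typical step, flagging proteins beyond a log2 fold-change cutoff; the protein table and values are invented, and this is only the arithmetic behind the step, not the protocol's actual workflow:

```python
import math

# Hypothetical quantitative proteomics table: protein -> (control, treated)
# intensities. All values are invented for illustration only.
proteins = {
    "P1": (1000.0, 4000.0),
    "P2": (500.0, 520.0),
    "P3": (2000.0, 400.0),
}

def log2_fold_change(control, treated):
    return math.log2(treated / control)

# Typical post-processing step: flag proteins beyond a |log2 FC| cutoff.
cutoff = 1.0  # i.e. at least two-fold up- or down-regulated
regulated = {p: round(log2_fold_change(c, t), 2)
             for p, (c, t) in proteins.items()
             if abs(log2_fold_change(c, t)) >= cutoff}
print(regulated)  # P1 is 4x up, P3 is 5x down; P2 is unchanged
```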

  2. The mathematics of cancer: integrating quantitative models.

    Science.gov (United States)

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  3. Quantitative regularities in floodplain formation

    Science.gov (United States)

    Nevidimova, O.

    2009-04-01

    Quantitative regularities in floodplain formation. Modern methods from the theory of complex systems allow us to build mathematical models of complex systems in which self-organizing processes are largely determined by nonlinear effects and feedback. However, some factors exert significant influence on the dynamics of geomorphosystems yet can hardly be expressed adequately in the language of mathematical models. Conceptual modeling allows us to overcome this difficulty. It is based on the methods of synergetics, which, together with the theory of dynamic systems and classical geomorphology, make it possible to describe the dynamics of geomorphological systems. The concept most suitable for mathematical modeling of complex systems is that of model dynamics based on equilibrium. This concept rests on dynamic equilibrium, the tendency towards which is observed in the evolution of all geomorphosystems. As an objective law, it is revealed in the evolution of fluvial relief in general, and in river channel processes in particular, demonstrating the ability of these systems to self-organize. The channel process is expressed in the formation of river reaches, riffles, meanders and floodplain. As the floodplain is a surface periodically flooded during high waters, it naturally connects the river channel with the slopes, being one of the boundary expressions of the water stream's activity. Floodplain dynamics is inseparable from channel dynamics. The floodplain is formed by simultaneous horizontal and vertical displacement of the river channel, that is, Y = Y(x, y), where x and y are the horizontal and vertical coordinates and Y is the floodplain height. When dy/dt = 0 (for a river channel that is not incising), the river, being displaced in the horizontal plane, leaves behind a low surface whose flooding during high waters (total duration of flooding) decreases from its maximum at the initial moment t0 to zero at the moment tn. The total amount of material accumulated on the floodplain surface changes in a similar manner

  4. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  5. Electronic Noses Using Quantitative Artificial Neural Network

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The present paper covers a new type of electronic nose (e-nose) with a four-sensor array, which has been applied to detecting gases quantitatively in the presence of interference. This e-nose adapts fundamental aspects of relative error (RE) in handing quantitative analysis over to an artificial neural network (ANN). Thus, both the quantitative and the qualitative requirements for an ANN implementing an e-nose can be satisfied. In addition, the e-nose uses only four sensors in its array, and can be designed for different usages simply by changing one or two sensors. Various gases were tested by this kind of e-nose, including alcohol vapor, CO, liquefied petroleum gas and CO2. Satisfactory quantitative results were obtained, and no qualitative mistake in prediction was observed for samples mixed with interference gases.
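    The quantitative criterion at the heart of this design, relative error between predicted and reference concentrations, is simple to state. A sketch in Python with an invented linear calibration standing in for the ANN; the weights, sensor readings, and reference value are all hypothetical:

```python
# Sketch of the quantitative criterion used with such a sensor array:
# relative error (RE) between predicted and reference gas concentrations.
# The linear calibration below is a stand-in; in the paper this mapping
# is learned by an ANN.

def predict_concentration(sensor_readings, weights, bias):
    return sum(s * w for s, w in zip(sensor_readings, weights)) + bias

def relative_error(predicted, reference):
    return abs(predicted - reference) / reference

weights = [0.5, 0.2, 0.1, 0.05]   # one weight per sensor (hypothetical)
readings = [10.0, 5.0, 2.0, 1.0]  # raw responses of the 4 sensors
predicted = predict_concentration(readings, weights, bias=0.3)
print(predicted, relative_error(predicted, reference=6.0))
```

    A trained ANN replaces the fixed weights with a nonlinear mapping, but the RE figure of merit is computed the same way.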

  6. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  7. Quantitative genetic studies of antisocial behaviour.

    Science.gov (United States)

    Viding, Essi; Larsson, Henrik; Jones, Alice P

    2008-08-12

    This paper broadly reviews the currently available twin and adoption data on antisocial behaviour (AB). It is argued that quantitative genetic research can make a significant contribution to furthering the understanding of how AB develops. Genetically informative study designs are particularly useful for investigating several important questions, such as whether heritability estimates vary as a function of assessment method or gender; whether the relative importance of genetic and environmental influences varies for different types of AB; whether the environmental risk factors are truly environmental; and whether genetic vulnerability influences susceptibility to environmental risk. While the current data are not yet directly translatable into prevention and treatment programmes, quantitative genetic research has concrete translational potential. It can supplement neuroscience research in informing about different subtypes of AB, such as AB coupled with callous-unemotional traits, and it is also important in advancing the understanding of the mechanisms by which environmental risk operates.

  8. Report on Solar Water Heating Quantitative Survey

    Energy Technology Data Exchange (ETDEWEB)

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  9. Quantitative roughness measurements with iTIRM

    NARCIS (Netherlands)

    Bijl, R.J.M. van der; Fähnle, O.W.; Brug, H. van; Braat, J.J.M.

    2000-01-01

    A new method, iTIRM, is used for quantitative surface roughness measurements of ground and polished surfaces and it is shown to be a useful tool for measuring total surface quality instead of individual roughness parameters.

  10. Curriculum, quantitative concepts and methodology of teaching ...

    African Journals Online (AJOL)

    Curriculum, quantitative concepts and methodology of teaching children with learning difficulties. African Journal of Educational Studies in Mathematics and Sciences.

  11. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates two diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measuring nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method for monitoring patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these two fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors
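    The sensitivity and specificity figures quoted in this record follow from elementary counts of true and false classifications. A small illustration in Python; the raw counts are invented, chosen only so that the ratios match the 76% and 94% reported:

```python
# Sensitivity and specificity as used in the screening comparison above.
# The counts are hypothetical; the abstract reports only the ratios.

def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    return true_neg / (true_neg + false_pos)

# e.g. 76% sensitivity would correspond to 19 of 25 cancers detected,
# and 94% specificity to 94 of 100 controls correctly called negative.
print(sensitivity(19, 6))   # 0.76
print(specificity(94, 6))   # 0.94
```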

  12. The origins and structure of quantitative concepts.

    Science.gov (United States)

    Bonn, Cory D; Cantlon, Jessica F

    2012-01-01

    "Number" is the single most influential quantitative dimension in modern human society. It is our preferred dimension for keeping track of almost everything, including distance, weight, time, temperature, and value. How did "number" become psychologically affiliated with all of these different quantitative dimensions? Humans and other animals process a broad range of quantitative information across many psychophysical dimensions and sensory modalities. The fact that adults can rapidly translate one dimension (e.g., loudness) into any other (e.g., handgrip pressure) has been long established by psychophysics research (Stevens, 1975). Recent literature has attempted to account for the development of the computational and neural mechanisms that underlie interactions between quantitative dimensions. We review evidence that there are fundamental cognitive and neural relations among different quantitative dimensions (number, size, time, pitch, loudness, and brightness). Then, drawing on theoretical frameworks that explain phenomena from cross-modal perception, we outline some possible conceptualizations for how different quantitative dimensions could come to be related over both ontogenetic and phylogenetic time scales.

  13. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and an appropriate methodology is chosen for answering it. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has instead accepted method as primary, and research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for the identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  14. Quantitative Information on Oncology Prescription Drug Websites.

    Science.gov (United States)

    Sullivan, Helen W; Aikin, Kathryn J; Squiers, Linda B

    2016-09-02

    Our objective was to determine whether and how quantitative information about drug benefits and risks is presented to consumers and healthcare professionals on cancer-related prescription drug websites. We analyzed the content of 65 active cancer-related prescription drug websites. We assessed the inclusion and presentation of quantitative information for two audiences (consumers and healthcare professionals) and two types of information (drug benefits and risks). Websites were equally likely to present quantitative information for benefits (96.9 %) and risks (95.4 %). However, the amount of the information differed significantly: Both consumer-directed and healthcare-professional-directed webpages were more likely to have quantitative information for every benefit (consumer 38.5 %; healthcare professional 86.1 %) compared with every risk (consumer 3.1 %; healthcare professional 6.2 %). The numeric and graphic presentations also differed by audience and information type. Consumers have access to quantitative information about oncology drugs and, in particular, about the benefits of these drugs. Research has shown that using quantitative information to communicate treatment benefits and risks can increase patients' and physicians' understanding and can aid in treatment decision-making, although some numeric and graphic formats are more useful than others.

  15. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; qualitative methods, however, provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be confirmed statistically by quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  16. Quantitative Imaging in Cancer Clinical Trials.

    Science.gov (United States)

    Yankeelov, Thomas E; Mankoff, David A; Schwartz, Lawrence H; Lieberman, Frank S; Buatti, John M; Mountz, James M; Erickson, Bradley J; Fennessy, Fiona M M; Huang, Wei; Kalpathy-Cramer, Jayashree; Wahl, Richard L; Linden, Hannah M; Kinahan, Paul E; Zhao, Binsheng; Hylton, Nola M; Gillies, Robert J; Clarke, Laurence; Nordstrom, Robert; Rubin, Daniel L

    2016-01-15

    As anticancer therapies designed to target specific molecular pathways have been developed, it has become critical to develop methods to assess the response induced by such agents. Although traditional, anatomic CT, and MRI examinations are useful in many settings, increasing evidence suggests that these methods cannot answer the fundamental biologic and physiologic questions essential for assessment and, eventually, prediction of treatment response in the clinical trial setting, especially in the critical period soon after treatment is initiated. To optimally apply advances in quantitative imaging methods to trials of targeted cancer therapy, new infrastructure improvements are needed that incorporate these emerging techniques into the settings where they are most likely to have impact. In this review, we first elucidate the needs for therapeutic response assessment in the era of molecularly targeted therapy and describe how quantitative imaging can most effectively provide scientifically and clinically relevant data. We then describe the tools and methods required to apply quantitative imaging and provide concrete examples of work making these advances practically available for routine application in clinical trials. We conclude by proposing strategies to surmount barriers to wider incorporation of these quantitative imaging methods into clinical trials and, eventually, clinical practice. Our goal is to encourage and guide the oncology community to deploy standardized quantitative imaging techniques in clinical trials to further personalize care for cancer patients and to provide a more efficient path for the development of improved targeted therapies.

  17. Theory and practice in quantitative genetics.

    Science.gov (United States)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C; van Baal, G Caroline M; von Hjelmborg, Jacob B; Iachine, Ivan; Boomsma, Dorret I

    2003-10-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each, we show how the theoretical biometrical model can be translated into algebraic equations that may be used to generate scripts for statistical genetic software packages, such as Mx, Lisrel, SOLAR, or MERLIN. For using the former program a web-library (available from http://www.psy.vu.nl/mxbib) has been developed of freely available scripts that can be used to conduct all genetic analyses described in this paper.
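    The simplest quantitative-genetic estimates behind the classical twin design described in this record are Falconer's formulas. The full models in the paper are fitted with SEM packages such as Mx; this is only the back-of-the-envelope version, with invented twin correlations:

```python
# Falconer's formulas: decompose trait variance into additive genetic (A),
# shared environment (C) and unique environment (E) components from
# monozygotic (MZ) and dizygotic (DZ) twin correlations.
# The correlations below are invented for illustration.

def ace_estimates(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # heritability
    c2 = 2 * r_dz - r_mz     # shared-environment component
    e2 = 1 - r_mz            # unique environment + measurement error
    return a2, c2, e2

print(ace_estimates(0.8, 0.5))  # roughly (0.6, 0.2, 0.2)
```

    Note the three components sum to 1 by construction, mirroring the variance decomposition the structural models estimate more rigorously.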

  18. Propagating Qualitative Values Through Quantitative Equations

    Science.gov (United States)

    Kulkarni, Deepak

    1992-01-01

    In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values, represented as interval values, through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem. These may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly, and through arbitrary algebraic expressions approximately. The algorithm was found applicable to the Space Shuttle Reaction Control System model.
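    Propagating an interval value through arithmetic operations, as the algorithm above does for a class of quantitative equations, can be sketched with a minimal interval type. This is a generic illustration of interval arithmetic, not the paper's linear-time algorithm:

```python
# Qualitative values represented as closed intervals, propagated exactly
# through +, - and * (division omitted for brevity).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Subtracting an interval swaps its endpoints.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # With signed intervals, any endpoint pair can give the extremum.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Propagate qualitative ranges through a quantitative expression z = x*y + x:
x, y = Interval(-1, 2), Interval(3, 4)
print(x * y + x)  # → [-5, 10]
```

    Interval results can overestimate the true range when a variable appears more than once (the dependency problem), which is one reason exact propagation through arbitrary expressions is hard.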

  19. The interpretation of quantitative microbial data

    DEFF Research Database (Denmark)

    Ribeiro Duarte, Ana Sofia

    . Hence, these models need to be validated with independent data for conditions of real food before use in QMRA. The overall goal of the work presented in this thesis is to study different factors related to quantitative microbial data that may have an impact on the outcome ofQMRA, in order to find...... prevalence and high concentration). Also, a zeroinflation tends to improve the accuracy of the risk estimates. In manuscript III (“Variability and uncertainty in the evaluation of predictive models with literature data – consequences to quantitative microbiological risk assessment”) it is assessed how......Foodborne diseases carry important social, health, political and economic consequences. Quantitative microbiological risk assessment (QMRA) is a science based tool used to estimate the risk that foodborne pathogens pose to human health, i.e. it estimates the number of cases of human foodborne...

  20. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled “hard sciences”. Here, we examine the question whether psychology is ready to enter the “hard science club” like biology did in the forties. The facts that a over half of the statistical techniques used in psychology are less than 40 years old and that b the number of simulations in empirical papers has followed an exponential growth since the eighties, both suggests that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide a concise and easy access to the currents methods.

  1. Quantitative Appearance Inspection for Film Coated Tablets.

    Science.gov (United States)

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  2. Mapping Quantitative Trait Loci in Yeast.

    Science.gov (United States)

    Liti, Gianni; Warringer, Jonas; Blomberg, Anders

    2017-08-01

    Natural Saccharomyces strains isolated from the wild differ quantitatively in molecular and organismal phenotypes. Quantitative trait loci (QTL) mapping is a powerful approach for identifying sequence variants that alter gene function. In yeast, QTL mapping has been used in designed crosses to map functional polymorphisms. This approach, outlined here, is often the first step in understanding the molecular basis of quantitative traits. New large-scale sequencing surveys have the potential to directly associate genotypes with organismal phenotypes, providing a broader catalog of causative genetic variants. Additional analysis of intermediate phenotypes (e.g., RNA, protein, or metabolite levels) can produce a multilayered and integrated view of individual variation, producing a high-resolution view of the genotype-phenotype map. © 2017 Cold Spring Harbor Laboratory Press.

  3. Quantitative flaw characterization with ultrasonic phased arrays

    Science.gov (United States)

    Engle, Brady John

    Ultrasonic nondestructive evaluation (NDE) is a critical diagnostic tool in many industries. It is used to characterize potentially dangerous flaws in critical components for aerospace, automotive, and energy applications. The use of phased array transducers allows for the extension of traditional techniques and the introduction of new methods for quantitative flaw characterization. An equivalent flaw sizing technique for use in time-of-flight diffraction setups is presented that provides an estimate of the size and orientation of isolated cracks, surface-breaking cracks, and volumetric flaws such as voids and inclusions. Experimental validation is provided for the isolated crack case. A quantitative imaging algorithm is developed that corrects for system effects and wave propagation, making the images formed directly related to the properties of the scatterer present. Simulated data is used to form images of cylindrical and spherical inclusions. The contributions of different signals to the image formation process are discussed and examples of the quantitative nature of the images are shown.

  4. Practical aspects of quantitative confocal microscopy.

    Science.gov (United States)

    Murray, John M

    2013-01-01

    Confocal microscopes are in principle well suited for quantitative imaging. The 3D fluorophore distribution in a specimen is transformed by the microscope optics and detector into the 2D intensity distribution of a digital image by a linear operation, a convolution. If multiple 2D images of the specimen at different focal planes are obtained, then the original 3D distribution in the specimen can be reconstructed. This reconstruction is a low-pass spatially filtered representation of the original, but quantitatively preserves relative fluorophore concentrations, with of course some limitations on accuracy and precision due to aberrations and noise. Given appropriate calibration, absolute fluorophore concentrations are accessible. A few simple guidelines are given for setting up confocal microscopes and checking their performance. With a little care, the images collected should be suitable for most types of quantitative analysis.
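    The linear image-formation model described in this record (the measured image is the fluorophore distribution convolved with the optics' point-spread function) is easy to demonstrate in one dimension. A toy sketch with an invented specimen and a normalized PSF, showing that relative and total fluorophore amounts survive the convolution:

```python
# Toy 1D image formation: image = specimen (*) PSF.
# With a PSF that sums to 1, total and relative fluorophore amounts are
# preserved, which is what makes the imaging quantitative.

def convolve(signal, kernel):
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += signal[idx] * k
        out.append(acc)
    return out

specimen = [0, 0, 4, 0, 0, 2, 0, 0]  # two point sources, ratio 2:1
psf = [0.25, 0.5, 0.25]              # normalized blurring kernel
image = convolve(specimen, psf)
print(sum(image), sum(specimen))     # totals preserved: 6.0 vs 6
```

    The blurred peaks still stand in the same 2:1 ratio as the sources, illustrating why relative concentrations remain quantitatively meaningful after the optics' low-pass filtering.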

  5. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  6. White-light Quantitative Phase Imaging Unit

    CERN Document Server

    Baek, YoonSeok; Yoon, Jonghee; Kim, Kyoohyun; Park, YongKeun

    2016-01-01

    We introduce the white-light quantitative phase imaging unit (WQPIU) as a practical realization of quantitative phase imaging (QPI) on standard microscope platforms. The WQPIU is a compact stand-alone unit which measures sample-induced phase delay under white-light illumination. It does not require any modification of the microscope or additional accessories for its use. The principle of the WQPIU, based on lateral shearing interferometry and phase-shifting interferometry, provides a cost-effective and user-friendly realization of QPI. The validity and capacity of the presented method are demonstrated by measuring quantitative phase images of polystyrene beads, human red blood cells, HeLa cells and mouse white blood cells. With speckle-free imaging capability due to the use of white-light illumination, the WQPIU is expected to expand the scope of QPI in the biological sciences as a powerful but simple imaging tool.

  7. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  9. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    Duplex Vector Flow Imaging (VFI) is introduced as a replacement for spectral Doppler, as it can automatically yield fully quantitative flow estimates without angle correction. Continuous VFI data over 9 s for 10 pulse cycles were acquired by a 3 MHz convex probe connected to the SARUS......L/stroke (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently...
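The bias and relative-standard-deviation figures quoted in the record are standard accuracy metrics for repeated flow estimates; a minimal sketch follows (the function name and sample values are illustrative, not data from the study):

```python
import numpy as np

def bias_and_rsd(estimates, true_value):
    """Percent bias and relative standard deviation (relative to the true
    value) of a set of repeated flow estimates."""
    est = np.asarray(estimates, dtype=float)
    bias_pct = 100.0 * (est.mean() - true_value) / true_value
    rsd_pct = 100.0 * est.std(ddof=1) / true_value
    return bias_pct, rsd_pct

# Hypothetical stroke-volume estimates (mL/stroke) against a true 1.15 mL/stroke.
bias, rsd = bias_and_rsd([1.30, 1.28, 1.29, 1.30], true_value=1.15)
```

With these made-up numbers the bias comes out near the 12% reported in the abstract, while the spread stays well under the 10% threshold.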

  10. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
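The computation described is simple enough to sketch directly: each voltage difference between neighbouring antennas, divided by their known separation, estimates the field along that axis. Positions, voltages, and the helper name below are illustrative:

```python
def field_from_antennas(voltages, positions):
    """Estimate the electric field (V/m) between successive antennas along
    one axis: each voltage difference divided by the known separation."""
    pairs = zip(zip(voltages, positions), zip(voltages[1:], positions[1:]))
    return [(v2 - v1) / (p2 - p1) for (v1, p1), (v2, p2) in pairs]

# A uniform 50 V/m field sampled by four antennas at 0.1 m spacing.
e = field_from_antennas([0.0, 5.0, 10.0, 15.0], [0.0, 0.1, 0.2, 0.3])
```

The list of per-pair quantities is exactly the region-wise field description the patent abstract refers to.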

  11. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding of soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to misinterpretation of soil quality and soil functioning when contrasting sites are compared. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated against standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant.
For the reproducibility study, a group of 9 soil scientists and 7
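The validation step described above boils down to correlating visual scores against laboratory measurements and applying the 0.3 threshold; a minimal sketch, with entirely hypothetical field data, might look like:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between visual scores and lab measurements."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pairs: visual root counts vs. root dry weight (g).
visual = [2, 5, 3, 8, 6, 4]
lab = [0.4, 1.1, 0.7, 1.9, 1.3, 0.8]
r = pearson_r(visual, lab)
validated = r > 0.3   # the threshold the study uses for a usable observation
```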

  12. A quantitative approach to weighted Carleson condition

    Directory of Open Access Journals (Sweden)

    Rivera-Ríos Israel P.

    2017-01-01

    Full Text Available Quantitative versions of weighted estimates obtained by F. Ruiz and J.L. Torrea [30, 31] for the operator are obtained. As a consequence, some sufficient conditions for the boundedness of Min the two weight setting in the spirit of the results obtained by C. Pérez and E. Rela [26] and very recently by M. Lacey and S. Spencer [17] for the Hardy-Littlewood maximal operator are derived. As a byproduct some new quantitative estimates for the Poisson integral are obtained.

  13. Absolute quantitation of protein posttranslational modification isoform.

    Science.gov (United States)

    Yang, Zhu; Li, Ning

    2015-01-01

    Mass spectrometry has been widely applied in the characterization and quantification of proteins from complex biological samples. Because absolute amounts of proteins are needed to construct mathematical models for the molecular systems of various biological phenotypes and phenomena, a number of quantitative proteomic methods have been adopted to measure absolute quantities of proteins using mass spectrometry. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) coupled with internal peptide standards, i.e., stable isotope-coded peptide dilution series, which originated from the field of analytical chemistry, has become a widely applied method in absolute quantitative proteomics research. This approach provides more and more absolute protein quantitation results of high confidence. As quantitative study of posttranslational modification (PTM), which modulates the biological activity of proteins, is crucial for biological science, and each isoform may contribute a unique biological function, degradation, and/or subcellular location, the absolute quantitation of protein PTM isoforms has become more relevant to its biological significance. In order to obtain the absolute cellular amount of a PTM isoform of a protein accurately, the impacts of protein fractionation, protein enrichment, and proteolytic digestion yield should be taken into consideration, and those effects arising before differentially stable isotope-coded PTM peptide standards are spiked into sample peptides have to be corrected. Assisted with stable isotope-labeled peptide standards, the absolute quantitation of isoforms of posttranslationally modified protein (AQUIP) method takes all these factors into account and determines the absolute amount of a protein PTM isoform from the absolute amount of the protein of interest and the PTM occupancy at the site of the protein. The absolute amount of the protein of interest is inferred by quantifying both the absolute amounts of a few PTM
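The final bookkeeping step the abstract describes, deriving the absolute isoform amount from the protein's absolute amount and the site occupancy, is a single multiplication; a hedged sketch with made-up numbers (the function name is ours, not AQUIP's):

```python
def ptm_isoform_amount(protein_fmol, occupancy):
    """Absolute amount of a PTM isoform from the protein's absolute amount
    and the fractional PTM occupancy at the site."""
    if not 0.0 <= occupancy <= 1.0:
        raise ValueError("occupancy is a fraction between 0 and 1")
    return protein_fmol * occupancy

# Hypothetical numbers: 200 fmol of protein, 35% phosphorylated at the site.
phospho = ptm_isoform_amount(200.0, 0.35)      # phospho-isoform amount
unmodified = ptm_isoform_amount(200.0, 0.65)   # remaining unmodified pool
```

The two isoform amounts necessarily sum back to the total protein amount, which is the consistency check the method relies on.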

  14. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  15. Flipping Quantitative Classes: A Triple Win

    Science.gov (United States)

    Swart, William; Wuensch, Karl L.

    2016-01-01

    In the "flipped" class, students use online materials to learn what is traditionally learned by attending lectures, and class time is used for interactive group learning. A required quantitative business class was taught as a flipped classroom in an attempt to improve student satisfaction in the course and reduce the "transactional…

  16. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  17. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36 Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatographic techniques (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading should be enough to use it properly.

  18. Unifying Quantitative Methodology in Social Research.

    Science.gov (United States)

    Willson, Victor L.

    A case is made for representing quantitative methods in use in the social sciences within a unified framework based on structural equation methodology (SEM). Most of the methods now in use are shown in their SEM representation. It is suggested that the visual and verbal representations of SEM are of most use, while specific estimation and…

  19. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
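Lesion frequencies in such experiments are commonly estimated from the Poisson zero class (the fraction of strands with no cleavage); a minimal sketch of that standard calculation, with hypothetical gel numbers, not values from this work:

```python
import math

def lesions_per_mb(fraction_undamaged, fragment_length_kb):
    """Mean lesion frequency from the Poisson zero class: if a fraction f0
    of strands shows no cleavage, the mean number of lesions per strand is
    -ln(f0); divide by strand length to express it per Mb."""
    per_strand = -math.log(fraction_undamaged)
    return per_strand / (fragment_length_kb / 1000.0)

# Hypothetical gel result: 80% of 500 kb strands migrate at full length.
freq = lesions_per_mb(0.80, 500.0)
```

For these numbers the estimate is about 0.45 lesions/Mb, i.e. comfortably within the "few lesions per several Mb" sensitivity range the abstract quotes.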

  20. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x...

  1. Quantitative real-time imaging of glutathione

    Science.gov (United States)

    Glutathione plays many important roles in biological processes; however, the dynamic changes of glutathione concentrations in living cells remain largely unknown. Here, we report a reversible reaction-based fluorescent probe—designated as RealThiol (RT)—that can quantitatively monitor the real-time ...

  2. Quantitative phosphoproteomics to characterize signaling networks

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2012-01-01

    Reversible protein phosphorylation is involved in the regulation of most, if not all, major cellular processes via dynamic signal transduction pathways. During the last decade quantitative phosphoproteomics have evolved from a highly specialized area to a powerful and versatile platform for analy... and quantify thousands of phosphorylations, thus providing extensive overviews of the cellular signaling networks. As a result of these developments quantitative phosphoproteomics have been applied to study processes as diverse as immunology, stem cell biology and DNA damage. Here we review the developments... in phosphoproteomics technology that have facilitated the application of phosphoproteomics to signaling networks and introduce examples of recent system-wide applications of quantitative phosphoproteomics. Despite the great advances in phosphoproteomics technology there are still several outstanding issues and we...

  3. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  4. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…

  5. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very highdimensional...

  6. Exploring Academic Voice in Multimodal Quantitative Texts

    Directory of Open Access Journals (Sweden)

    Robert Prince

    2014-10-01

    Full Text Available Research on students' academic literacies practices has tended to focus on the written mode in order to understand the academic conventions necessary to access Higher Education. However, the representation of quantitative information can be a challenge to many students. Quantitative information can be represented through a range of modes (such as writing, visuals and numbers) and different information graphics (such as tables, charts and graphs). This paper focuses on the semiotic aspects of graphic representation in academic work, using student and published data from the Health Sciences, and an information graphic from the social domain as a counterpoint, to explore aspects of agency and choice in academic voice in multimodal texts. It explores voice in terms of three aspects which work across modes, namely authorial engagement, citation and modality. The work of different modes and their inter-relations in quantitative texts is established, as is the use of sources in citation. We also look at the ways in which credibility and validity are established through modality. This exploration reveals a complex interplay of modes in the construction of academic voice, which is largely tacit. This has implications for the way we think about and teach writing and text-making in quantitative disciplines in Higher Education.

  7. Quantitative surface characterization using a Nomarski microscope

    NARCIS (Netherlands)

    Brug, H. van; Booij, S.M.; Fähnle, O.W.; Bijl, R.J.M. van der

    2000-01-01

    The use of a Nomarski microscope for the characterization of surface features will be presented. Since a Nomarski microscope measures slope values, the shape of a surface can be followed quantitatively. Besides, a Nomarski microscope can be used to analyze surface roughness in terms of rms value and

  8. Cold Spring Harbor symposia on quantitative biology

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    Volume 55 of the Cold Spring Harbor Symposium on Quantitative Biology is dedicated to the study of the brain. The symposium was subdivided into four major sections. Papers were presented in Molecular Mechanisms for Signalling; Neural Development; Sensory and Motor Systems; and Cognitive Neuroscience. Individual papers from the symposium are abstracted separately. (MHB)

  9. Cold Spring Harbor symposia on quantitative biology

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    This volume contains the first part of the proceedings of the 53rd Cold Spring Harbor Symposium on Quantitative Biology. This year's topic was Immune Recognition. Part 1, this volume, contains papers prepared by presenters of the sessions entitled Introduction, Lymphocyte Development and Receptor Selection, Recognition by Antibodies, and Antigen Recognition by T Cells. (DT)

  10. Cold Spring Harbor symposia on quantitative biology

    Energy Technology Data Exchange (ETDEWEB)

    1989-01-01

    This volume contains the second part of the proceedings of the 53rd Cold Spring Harbor Symposium on Quantitative Biology. This year's topic was Immune Recognition. This volume, part 2, contains papers prepared by presenters for two sessions, entitled Signals for Lymphocyte Activation, Proliferation, and Adhesion, and Tolerance and Self Recognition. (DT)

  11. Assessing Quantitative Reasoning in Young Children

    Science.gov (United States)

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  13. Quantitative analysis of arm movement smoothness

    Science.gov (United States)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of motion data quantitative smoothness analysis. We investigated values of movement unit, fluidity and jerk for healthy and paralyzed arm of patients with hemiparesis after stroke. Patients were performing drinking task. To validate the approach, movement of 24 patients were captured using optical motion capture system.
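A common jerk-based smoothness measure is the mean squared third derivative of a sampled trajectory; the sketch below shows the general technique, not the authors' exact metric, and uses synthetic trajectories:

```python
import numpy as np

def mean_squared_jerk(position, dt):
    """Mean squared jerk from the third finite difference of a sampled
    trajectory; lower values indicate smoother movement."""
    jerk = np.diff(position, n=3) / dt ** 3
    return float(np.mean(jerk ** 2))

# Synthetic 1 s reach sampled at 100 Hz: one smooth, one with simulated tremor.
t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
smooth = np.sin(np.pi * t)
shaky = smooth + 0.01 * np.sin(40 * np.pi * t)
```

Even a small-amplitude 20 Hz tremor dominates the jerk score, which is why jerk-type metrics separate paretic from healthy arm movements well.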

  14. Quantitative Mapping of Large Area Graphene Conductance

    DEFF Research Database (Denmark)

    Buron, Jonas Christian Due; Petersen, Dirch Hjorth; Bøggild, Peter

    2012-01-01

    We present quantitative mapping of large area graphene conductance by terahertz time-domain spectroscopy and micro four point probe. We observe a clear correlation between the techniques and identify the observed systematic differences to be directly related to imperfections of the graphene sheet...

  15. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  16. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior

  17. Uncertainty in Quantitative Electron Probe Microanalysis

    Science.gov (United States)

    Heinrich, Kurt F. J.

    2002-01-01

    Quantitative electron probe analysis is based on models of the physics of x-ray generation, empirically adjusted to the analyses of specimens of known composition. Their accuracy can be estimated by applying them to a set of specimens of presumably well-known composition. PMID:27446746

  18. Combining Qualitative and Quantitative Data: An Example.

    Science.gov (United States)

    Sikka, Anjoo; And Others

    Methodology from an ongoing research study to validate teaching techniques for deaf and blind students provides an example of the ways that several types of quantitative and qualitative data can be combined in analysis. Four teacher and student pairs were selected. The students were between 14 and 21 years old, had both auditory and visual…

  19. Quantitative multiplex detection of pathogen biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel, structure with multi-sensor elements per channel.

  20. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior di

  1. On Bounding Problems of Quantitative Information Flow

    CERN Document Server

    Yasuoka, Hirotoshi

    2011-01-01

    Researchers have proposed formal definitions of quantitative information flow based on information theoretic notions such as the Shannon entropy, the min entropy, the guessing entropy, belief, and channel capacity. This paper investigates the hardness of precisely checking the quantitative information flow of a program according to such definitions. More precisely, we study the "bounding problem" of quantitative information flow, defined as follows: Given a program M and a positive real number q, decide if the quantitative information flow of M is less than or equal to q. We prove that the bounding problem is not a k-safety property for any k (even when q is fixed, for the Shannon-entropy-based definition with the uniform distribution), and therefore is not amenable to the self-composition technique that has been successfully applied to checking non-interference. We also prove complexity theoretic hardness results for the case when the program is restricted to loop-free boolean programs. Specifically, we show...
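For a deterministic program under a uniform input distribution, the Shannon-entropy leakage equals the entropy of the output distribution, so the quantity the bounding problem compares against q can be computed directly for small programs; a sketch (function names are ours):

```python
import math
from collections import Counter

def shannon_qif(program, inputs):
    """Shannon-entropy quantitative information flow of a deterministic
    program under uniform inputs: the entropy of the output distribution."""
    outs = Counter(program(x) for x in inputs)
    n = len(inputs)
    return -sum((c / n) * math.log2(c / n) for c in outs.values())

inputs = range(16)                                    # 4-bit secret, uniform
leak_parity = shannon_qif(lambda x: x % 2, inputs)    # reveals only the parity
leak_identity = shannon_qif(lambda x: x, inputs)      # reveals the whole secret
```

The bounding problem then asks whether such a value is at most q; the paper's point is that deciding this in general is hard, even though evaluating it for a fixed small channel, as here, is easy.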

  2. Some Epistemological Considerations Concerning Quantitative Analysis

    Science.gov (United States)

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  3. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  4. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Pieters, W.; Arnold, F.; Stoelinga, M.I.A.

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Therefore, penetration testing has thus far been used as a qualitative research method. To enable quantitative approaches to security risk management, in

  5. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  6. Quantitative multiplex detection of pathogen biomarkers

    Science.gov (United States)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel, structure with multi-sensor elements per channel.

  7. Values in Qualitative and Quantitative Research

    Science.gov (United States)

    Duffy, Maureen; Chenail, Ronald J.

    2008-01-01

    The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…

  8. 78 FR 52166 - Quantitative Messaging Research

    Science.gov (United States)

    2013-08-22

    ... COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures Trading Commission. ACTION: Notice... qualitative message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify, with...

  9. Research essentials. How to critique quantitative research.

    Science.gov (United States)

    Clarke, Sharon; Collier, Sue

    2015-11-01

    QUANTITATIVE RESEARCH is a systematic approach to investigating numerical data and involves measuring or counting attributes, that is, quantities. Through a process of transforming information that is collected or observed, the researcher can often describe a situation or event, answering the 'what' and 'how many' questions about a situation ( Parahoo 2014 ).

  10. Quantitative DNA Methylation Profiling in Cancer.

    Science.gov (United States)

    Ammerpohl, Ole; Haake, Andrea; Kolarova, Julia; Siebert, Reiner

    2016-01-01

    Epigenetic mechanisms including DNA methylation are fundamental for the regulation of gene expression. Epigenetic alterations can lead to the development and the evolution of malignant tumors as well as the emergence of phenotypically different cancer cells or metastasis from one single tumor cell. Here we describe bisulfite pyrosequencing, a technology to perform quantitative DNA methylation analyses, to detect aberrant DNA methylation in malignant tumors.
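The quantitative readout of bisulfite pyrosequencing reduces, at each CpG, to the ratio of C signal (methylated, protected from conversion) to total C+T signal (unmethylated C reads as T); a minimal sketch with hypothetical peak intensities:

```python
def methylation_percent(c_signal, t_signal):
    """Percent methylation at a CpG after bisulfite conversion:
    methylated C stays C, unmethylated C reads as T, so the
    methylation level is C / (C + T)."""
    return 100.0 * c_signal / (c_signal + t_signal)

# Hypothetical peak intensities at one CpG site in tumor vs. normal tissue.
tumor = methylation_percent(72.0, 28.0)
normal = methylation_percent(8.0, 92.0)
hypermethylated = tumor - normal > 25.0   # illustrative aberrant-methylation call
```

The threshold used for calling a site aberrantly methylated is assay- and study-specific; the 25-point difference here is purely illustrative.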

  11. New Quantitative Study for Dissertations Repository System

    CERN Document Server

    Alshammari, Fahad H; Zaidan, M A; Hmood, Ali K; Zaidan, B B; Zaidan, A A

    2010-01-01

    In the age of technology, information and communication technology becomes very important, especially in the field of education. Students must be allowed to learn anytime, anywhere and at their own pace. The library facilities of the university should be developed. In this paper we present a new quantitative study for a dissertations repository system and also recommend future applications of the approach.

  12. Interval Mapping of Multiple Quantitative Trait Loci

    NARCIS (Netherlands)

    Jansen, Ritsert C.

    1993-01-01

    The interval mapping method is widely used for the mapping of quantitative trait loci (QTLs) in segregating generations derived from crosses between inbred lines. The efficiency of detecting and the accuracy of mapping multiple QTLs by using genetic markers are much increased by employing multiple Q

  13. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  14. Quantitative Techniques in PET-CT Imaging

    NARCIS (Netherlands)

    Basu, Sandip; Zaidi, Habib; Holm, Soren; Alavi, Abass

    2011-01-01

    The appearance of hybrid PET/CT scanners has made quantitative whole body scanning of radioactive tracers feasible. This paper deals with the novel concepts for assessing global organ function and disease activity based on combined functional (PET) and structural (CT or MR) imaging techniques, their

  15. Seniors' Online Communities: A Quantitative Content Analysis

    Science.gov (United States)

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  16. Evolutionary quantitative genetics of nonlinear developmental systems.

    Science.gov (United States)

    Morrissey, Michael B

    2015-08-01

    In quantitative genetics, the effects of developmental relationships among traits on microevolution are generally represented by the contribution of pleiotropy to additive genetic covariances. Pleiotropic additive genetic covariances arise only from the average effects of alleles on multiple traits, and therefore the evolutionary importance of nonlinearities in development is generally neglected in quantitative genetic views on evolution. However, nonlinearities in relationships among traits at the level of whole organisms are undeniably important to biology in general, and therefore critical to understanding evolution. I outline a system for characterizing key quantitative parameters in nonlinear developmental systems, which yields expressions for quantities such as trait means and phenotypic and genetic covariance matrices. I then develop a system for quantitative prediction of evolution in nonlinear developmental systems. I apply the system to generating a new hypothesis for why direct stabilizing selection is rarely observed. Other uses will include separation of purely correlative from direct and indirect causal effects in studying mechanisms of selection, generation of predictions of medium-term evolutionary trajectories rather than immediate predictions of evolutionary change over single generation time-steps, and the development of efficient and biologically motivated models for separating additive from epistatic genetic variances and covariances.
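As a linear point of comparison for the nonlinear framework described, the multivariate breeder's (Lande) equation predicts the per-generation response to selection from the additive genetic covariance matrix G and the directional selection gradients β; the numbers below are illustrative, not from the paper:

```python
import numpy as np

# Multivariate breeder's (Lande) equation: response = G @ beta.
# Nonlinear developmental effects, as in the abstract, would make the
# realized G and beta functions of the trait means themselves.
G = np.array([[0.50, 0.20],
              [0.20, 0.30]])    # additive genetic (co)variances for two traits
beta = np.array([0.10, -0.05])  # directional selection gradients
response = G @ beta             # predicted per-generation change in trait means
```

Note how the pleiotropic covariance (the 0.20 off-diagonal) drags trait 2 upward even though selection on it is negative; this coupling is exactly what the abstract's nonlinear treatment generalizes.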

  17. On Measuring Quantitative Interpretations of Reasonable Doubt

    Science.gov (United States)

    Dhami, Mandeep K.

    2008-01-01

    Beyond reasonable doubt represents a probability value that acts as the criterion for conviction in criminal trials. I introduce the membership function (MF) method as a new tool for measuring quantitative interpretations of reasonable doubt. Experiment 1 demonstrated that three different methods (i.e., direct rating, decision theory based, and…

  18. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2017-06-28

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m(2)) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging (T1w, T2w, FIESTA) and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion-weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) showed statistically significant differences in quantitative parenchymal measures: lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2, p quantitative measurements was obtained in all cases. Significantly different quantitative MR parenchymal measurements between ADPKD patients and normal controls were obtained by MTI, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than conventional qualitative imaging techniques.

  19. Quantitative and Econometric Methodologies in the Study of Civil War

    OpenAIRE

    2014-01-01

    This chapter provides an overview of the quantitative study of civil war, focusing on the development of quantitative conflict studies, the basics of the quantitative method, the prominent sources of civil conflict data, and the strengths and weaknesses of using quantitative methods to analyse civil war.

  20. Quantitative and Econometric Methodologies in the Study of Civil War

    OpenAIRE

    Clayton, Govinda

    2014-01-01

    This chapter provides an overview of the quantitative study of civil war, focusing on the development of quantitative conflict studies, the basics of the quantitative method, the prominent sources of civil conflict data, and the strengths and weaknesses of using quantitative methods to analyse civil war.

  1. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…


  3. High-resolution flow field measurements in the rotor passage of a low-mach number turbine for different tip geometries; Hochaufgeloeste Stroemungsfeldvermessungen in der Rotorpassage einer Niedermachzahlturbine fuer verschiedene Schaufelspitzengeometrien

    Energy Technology Data Exchange (ETDEWEB)

    Kegalj, Martin

    2013-11-01

    In axial turbines, tip leakage accounts for a large portion of the overall losses. A shroud is aerodynamically advantageous, but the higher mechanical loads on the rotating rotor blading, which is already exposed to high thermal loads, together with the higher costs, favor a shroudless configuration. The main parameter governing the tip leakage loss is the tip gap height, which cannot be reduced arbitrarily, as a running gap is necessary to accommodate thermal expansion and vibration of the jet engine. The pressure difference between the pressure and suction sides of the rotor blade forces fluid over the blade tip and leads to the formation of the tip leakage vortex. Reduced turning, together with losses caused by the vortices and their subsequent mixing, is responsible for the reduced efficiency. A squealer cavity on the flat blade tip is a feasible way to reduce these aerodynamic losses: a portion of the kinetic energy of the tip leakage flow is dissipated as the flow enters the cavity, so the flow exiting the cavity enters the passage with reduced momentum, and the tip gap mass flow is reduced. A 1½-stage low-Mach-number turbine was used to investigate the influence of tip geometry. Aerodynamic measurements, performed with five-hole probes, a two-component hot-wire anemometer, unsteady wall pressure sensors, stereo and borescopic particle image velocimetry (PIV) setups, and oil-and-dye flow visualization, found only small differences in flow velocities and angles between the flat-tip and squealer-tip configurations in the measurement planes downstream of the rotor. The measurement uncertainty shows how difficult it is to determine the influence of the squealer cavity on the blade-row outflow from global measurement data alone. Information on the flow close to the casing inside the rotor passage can only be gathered with non-intrusive laser measurement techniques. Comparison of the different tip geometries remains difficult because of the small differences in the absolute flow data. The use of the λ₂ vortex criterion enables an objective identification and localization of vortex structures, allowing the calculation of integral vortex quantities such as the rotational kinetic energy and the cross-sectional area. A comparison of the flat-tip and squealer-tip configurations shows decreased vortex area and kinetic energy for the cavity tip, indicating an influence on the tip gap flow and the tip leakage vortex. Combining these results with the oil-and-dye flow visualization on the rotor tip and the unsteady wall pressure measurements above the rotor yields a complete and detailed picture of the tip leakage flow and the vortices close to the rotor tip. The reduction of vortex area and kinetic energy relocates the passage vortex, leading to a lower disturbance of the passage flow. Unsteady numerical flow simulations of the turbine were processed with the same algorithms for the calculation of vortex area and kinetic energy. The changes in the vortex quantities due to the squealer tip are of the same order as in the measurements (approximately 10% reduction), validating the numerical simulations in a region of highly complex flow phenomena at the rotor tip. With good agreement between measurement and simulation, the simulations showed a reduction of the tip leakage mass flow by 9.1% and an increase of the isentropic stage efficiency by 0.24%, while the efficiency of the 1½ stages improved by 0.34%.
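The λ₂ criterion used above identifies vortex cores from the velocity-gradient tensor: a point lies inside a vortex where the middle eigenvalue of S² + Ω² is negative (Jeong & Hussain). A minimal sketch, with hypothetical gradient tensors for solid-body rotation and pure shear:

```python
import numpy as np

def lambda2(J):
    """lambda_2 vortex criterion for a 3x3 velocity-gradient tensor
    J[i, j] = du_i/dx_j. Returns the middle eigenvalue of S^2 + Omega^2;
    a point is inside a vortex core when this value is negative."""
    S = 0.5 * (J + J.T)        # strain-rate tensor (symmetric part)
    O = 0.5 * (J - J.T)        # rotation-rate tensor (antisymmetric part)
    M = S @ S + O @ O          # symmetric matrix with real eigenvalues
    return np.sort(np.linalg.eigvalsh(M))[1]

# hypothetical solid-body rotation about z: u = (-y, x, 0) -> vortex
J_rot = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])
# hypothetical pure shear: u = (y, 0, 0) -> no vortex core
J_shear = np.array([[0., 1., 0.], [0., 0., 0.], [0., 0., 0.]])
```

Applied pointwise to a measured or simulated velocity field, thresholding λ₂ < 0 yields the vortex regions over which integral quantities such as cross-sectional area and rotational kinetic energy can be computed.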

  4. Bringing quality and meaning to quantitative data - Bringing quantitative evidence to qualitative observation

    DEFF Research Database (Denmark)

    Karpatschof, Benny

    2007-01-01

    Based on the author's methodological theory defining the distinctive properties of quantitative and qualitative methods, the article demonstrates the possibilities and advantages of combining the two types of investigation in the same research project, the project being an effect study...

  5. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The mission of state authorities in supervising credit institutions is mostly identified with the prevention of systemic risk. At present, this mission is oriented toward analyzing the risk profile of credit institutions and the existing mechanisms and systems that serve as management tools, providing banks with the proper instruments to avoid and control specific banking risks. Rating systems are sophisticated measurement instruments capable of meeting these objectives, ensuring success in banking risk management. Management quality is one of the most important variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can introduce subjectivity and heterogeneity into the rating. The problem can be addressed by the complementary use of quantitative techniques such as DEA (Data Envelopment Analysis).

  6. Quantitative Adaptive RED in Differentiated Service Networks

    Institute of Scientific and Technical Information of China (English)

    LONG KePing(隆克平); WANG Qian(王茜); CHENG ShiDuan(程时端); CHEN JunLiang(陈俊亮)

    2003-01-01

    This paper derives a quantitative model relating the RED (Random Early Detection) max_p parameter to the committed traffic rate for token-based marking schemes in DiffServ IP networks. A DiffServ Quantitative RED (DQRED) scheme is then presented, which can adapt its dropping probability to the marking probability of the edge router so as to reflect not only the shared bandwidth but also the performance requirements of the services. Hence, DQRED can cooperate with marking schemes to guarantee fairness between different DiffServ AF class services. A new marking-probability metering algorithm is also proposed to cooperate with DQRED. Simulation results verify that the DQRED mechanism not only controls congestion in a DiffServ network very well, but also satisfies the different quality requirements of AF class services. The performance of DQRED is better than that of WRED.
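The classic RED drop-probability ramp, plus a hypothetical sketch of the DQRED idea of coupling the dropping probability to the edge router's marking probability. The exact coupling is not given in the abstract, so the scaling below is an illustrative assumption:

```python
def red_drop_prob(avg_q, min_th, max_th, max_p):
    """Classic RED: dropping probability ramps linearly from 0 at min_th
    to max_p at max_th of the (EWMA-averaged) queue length."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

def dqred_drop_prob(avg_q, min_th, max_th, base_max_p, marking_prob):
    """Hypothetical DQRED-style adaptation (illustrative only): scale
    max_p by the edge router's marking probability so that heavily
    marked (out-of-profile) traffic sees a steeper drop ramp."""
    return red_drop_prob(avg_q, min_th, max_th,
                         base_max_p * (1.0 + marking_prob))
```

With a marking probability of 1.0, the sketch doubles the ramp slope relative to plain RED; the paper's actual adaptation law may differ.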

  7. Proteome-Wide Quantitation by SILAC

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2010-01-01

    Stable isotope labeling by amino acids in cell culture (SILAC) has emerged as a powerful and versatile approach for proteome-wide quantitation by mass spectrometry. SILAC utilizes the cells' own metabolism to incorporate isotopically labeled amino acids into their proteome, which can be mixed with the proteome of unlabeled cells; differences in protein expression can easily be read out by comparing the abundance of the labeled versus unlabeled proteins. SILAC has been applied to numerous different cell lines, and the technique has been adapted for a wide range of experimental procedures. In this chapter we provide a detailed procedure for performing a SILAC-based experiment for proteome-wide quantitation, including a protocol for optimizing SILAC labeling. We also provide an update on the most recent developments of this technique.
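The SILAC read-out described above, comparing labeled ("heavy") to unlabeled ("light") peak intensities, reduces to simple ratios. Aggregating peptide-level ratios into a protein ratio via the median is a common convention assumed here, not something stated in the abstract:

```python
import statistics

def silac_ratio(heavy_intensity, light_intensity):
    """Relative abundance for one peptide pair: labeled ('heavy') over
    unlabeled ('light') extracted-ion intensity."""
    return heavy_intensity / light_intensity

def protein_ratio(peptide_pairs):
    """Aggregate peptide-level SILAC ratios into a single protein-level
    ratio (median is robust to outlier peptides)."""
    return statistics.median(silac_ratio(h, l) for h, l in peptide_pairs)

# hypothetical (heavy, light) intensities for three peptides of one protein
ratio = protein_ratio([(3.0e6, 1.5e6), (4.0e6, 1.0e6), (6.0e6, 2.0e6)])
```

A protein ratio above 1 indicates higher expression in the labeled condition.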

  8. Next generation quantitative genetics in plants.

    Science.gov (United States)

    Jiménez-Gómez, José M

    2011-01-01

    Most characteristics in living organisms show continuous variation, which suggests that they are controlled by multiple genes. Quantitative trait loci (QTL) analysis can identify the genes underlying continuous traits by establishing associations between genetic markers and observed phenotypic variation in a segregating population. The new high-throughput sequencing (HTS) technologies greatly facilitate QTL analysis by providing genetic markers at genome-wide resolution in any species without previous knowledge of its genome. In addition, HTS serves to quantify molecular phenotypes, which helps to identify the loci responsible for QTLs and to understand the mechanisms underlying diversity. The constant improvements in price, experimental protocols, computational pipelines, and statistical frameworks are making the use of HTS feasible for any research group interested in quantitative genetics. In this review I discuss the application of HTS for molecular marker discovery, population genotyping, and expression profiling in QTL analysis.

  9. Quantitative image analysis of celiac disease.

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-07

    We outline the quantitative techniques that are currently used for the analysis of celiac disease. Image processing techniques can be useful for statistically analyzing the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions of focus in the development of methodology for the diagnosis and treatment of this disease are suggested. It is evident that there are still broad areas with potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  10. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  11. Quantitative Information Flow - Verification Hardness and Possibilities

    CERN Document Server

    Yasuoka, Hirotoshi

    2010-01-01

    Researchers have proposed formal definitions of quantitative information flow based on information-theoretic notions such as the Shannon entropy, the min-entropy, the guessing entropy, and channel capacity. This paper investigates the hardness and possibilities of precisely checking and inferring quantitative information flow according to such definitions. We prove that, even for just comparing two programs to determine which has the larger flow, none of the definitions is a k-safety property for any k, and therefore none is amenable to the self-composition technique that has been successfully applied to precisely checking non-interference. We also show a complexity-theoretic gap with non-interference by proving that, for loop-free boolean programs whose non-interference is coNP-complete, the comparison problem is #P-hard for all of the definitions. For positive results, we show that universally quantifying the distribution in the comparison problem, that is, comparing two programs according to the entropy-based definit...

  12. Quantitative image analysis of celiac disease

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the quantitative techniques that are currently used for the analysis of celiac disease. Image processing techniques can be useful for statistically analyzing the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions of focus in the development of methodology for the diagnosis and treatment of this disease are suggested. It is evident that there are still broad areas with potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  13. Physiologic basis for understanding quantitative dehydration assessment.

    Science.gov (United States)

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.
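As a toy illustration of "practical measures with empirical thresholds", a screening rule on plasma osmolality might look like the sketch below. The 290 mmol/kg cutoff is purely illustrative and is not the article's value:

```python
def dehydrated(plasma_osmolality_mmol_kg, threshold=290.0):
    """Flag probable dehydration when plasma osmolality exceeds an
    empirical cutoff. The 290 mmol/kg default is illustrative only;
    real decision thresholds depend on the type (dehydration vs. volume
    depletion) and magnitude (moderate vs. severe) of the deficit."""
    return plasma_osmolality_mmol_kg > threshold
```

In practice, as the abstract notes, biological variation in osmotic responses means any single fixed threshold trades sensitivity against specificity.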

  14. A quantitative description for efficient financial markets

    Science.gov (United States)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  15. Quantitative morphological studies of the parathyroid gland

    OpenAIRE

    Larsson, Hans-Olov

    1983-01-01

    This work is based upon a series of quantitative morphological studies of the parathyroid glands of Mongolian gerbils and rats. Standard stereological methods were used on light and electron microscopical levels. Subclassification of the chief cells based on the staining affinity and electron density of the cytoplasm was not correlated with contents (volume and surface densities) of organelles. Compared to fixation by immersion, fixation by perfusion caused a remarkable reduction in the numbe...

  16. Quantitative determination of atmospheric hydroperoxyl radical

    Science.gov (United States)

    Springston, Stephen R.; Lloyd, Judith; Zheng, Jun

    2007-10-23

    A method for the quantitative determination of atmospheric hydroperoxyl radical comprising: (a) contacting a liquid phase atmospheric sample with a chemiluminescent compound which luminesces on contact with hydroperoxyl radical; (b) determining luminescence intensity from the liquid phase atmospheric sample; and (c) comparing said luminescence intensity from the liquid phase atmospheric sample to a standard luminescence intensity for hydroperoxyl radical. An apparatus for automating the method is also included.
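Step (c) of the method, comparing sample luminescence to a standard, amounts to a calibration. Assuming a linear chemiluminescent response through zero, a one-point version can be sketched as:

```python
def ho2_concentration(sample_intensity, standard_intensity, standard_conc):
    """One-point calibration: scale the standard's known HO2 concentration
    by the sample-to-standard luminescence intensity ratio. Assumes the
    chemiluminescent response is linear and passes through zero; units of
    the result follow standard_conc."""
    return standard_conc * sample_intensity / standard_intensity

# hypothetical: sample luminescence is half that of an 8-unit HO2 standard
conc = ho2_concentration(50.0, 100.0, 8.0)
```

A multi-point calibration curve (intensity vs. a series of standards) would relax the linearity-through-zero assumption.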

  17. Physiologic Basis for Understanding Quantitative Dehydration Assessment

    Science.gov (United States)

    2012-01-01

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha

    Semantic descriptors provide a scale that aligns with statistical probability to communicate the…

  18. A Note on Quantitative Definition of Risk

    Institute of Scientific and Technical Information of China (English)

    李力; 赵联文; 杨宁

    2004-01-01

    Risk analysis has become more and more important in practical applications, but there has not been a widely accepted framework and model for this research. Motivated by the factor analysis method and Kaplan-Garrick's quantitative definition of risk, a general risk model is established based on analyzing risk situations and employing information systems and evidence theory, and Kaplan-Garrick's result is improved to introduce a framework for analyzing and managing risk.

  19. Quantitative Cerebral Blood Flow Measurements Using MRI

    OpenAIRE

    Muir, Eric R; Watts, Lora Talley; Tiwari, Yash Vardhan; Bresnen, Andrew; Duong, Timothy Q

    2014-01-01

    Magnetic resonance imaging can be utilized as a quantitative and noninvasive method to image cerebral blood flow. The two most common techniques used to measure cerebral blood flow are dynamic susceptibility contrast (DSC) perfusion MRI and arterial spin labeling perfusion MRI. Herein we describe the use of these two techniques to measure cerebral blood flow in rodents, including methods, analysis, and important considerations when utilizing these techniques.

  20. Quantitative genetic studies of antisocial behaviour

    OpenAIRE

    Viding, Essi; Larsson, Henrik; Jones, Alice P.

    2008-01-01

    This paper broadly reviews the currently available twin and adoption data on antisocial behaviour (AB). It is argued that quantitative genetic research can make a significant contribution to furthering the understanding of how AB develops. Genetically informative study designs are particularly useful for investigating several important questions, such as whether heritability estimates vary as a function of assessment method or gender, and the relative importance of genetic and environmental ...

  1. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data on ingredients identified as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients, cinnamic aldehyde, citral, and isoeugenol, to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that, had the QRA approach been available at the time standards...

  2. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both with and without blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  3. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have only qualitatively mapped whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task, and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession-averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  4. Quantitative Chemical Indices of Weathered Igneous Rocks

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A study was conducted to compare the effectiveness of different weathering indices for characterising weathered igneous rocks of Hong Kong. Among eight chemical indices evaluated in this study, the Parker index has been found most suitable for a quantitative description of state of weathering. Based on geochemical results of 174 samples, the index decreases almost linearly with an increasing extent of weathering. The results enable a better understanding of the modification of geotechnical properties of igneous rocks associated with weathering processes.
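For reference, the Parker index is conventionally computed from the molecular proportions of the mobile-cation oxides, weighted by cation-oxygen bond strengths. The sketch below reflects the commonly cited form of Parker (1970) and should be checked against the original reference before quantitative use:

```python
def parker_index(na2o_mol, mgo_mol, k2o_mol, cao_mol):
    """Parker weathering index from molecular proportions of the mobile-
    cation oxides, each divided by the cation-oxygen bond strength Parker
    tabulated (Na: 0.35, Mg: 0.9, K: 0.25, Ca: 0.7). The factor of 2 on
    Na2O and K2O counts two cations per formula unit. Lower values
    indicate a greater extent of weathering."""
    return (2.0 * na2o_mol / 0.35
            + mgo_mol / 0.9
            + 2.0 * k2o_mol / 0.25
            + cao_mol / 0.7) * 100.0
```

Because the index is a positive linear combination of the mobile-cation oxide proportions, progressive leaching of those cations drives it down, consistent with the near-linear decrease with weathering extent reported in the abstract.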

  5. A Quantitative Appraisal of Adjustment Lending

    OpenAIRE

    Bela Balassa

    1989-01-01

    This paper presents a quantitative analysis of adjustment programmes. This is done by charting changes in various performance indicators following the receipt of the first adjustment loan, and by further contrasting the results with those for the comparator group of countries that did not receive adjustment loans. It is found that the average decline in the GDP growth rate in the loan-recipient countries was less than in the comparator groups. Similar results were obtained in regard to per ca...

  6. L'histoire quantitative: reflexions epistemologiques

    OpenAIRE

    Robert, Jean-Louis

    1983-01-01

    This article outlines the present state of the debate on quantitative history in France. J.-L. Robert observes a transformation of "classic" approaches to social and economic history as a result of new statistical tools, now available to historians. He advocates a careful methodological assessment of these techniques of research and discusses in detail problems like the stability of indicators over time, the application of sampling-techniques, and the limits of quantification in history.

  7. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  8. ONLINE INSTRUMENTS IN QUANTITATIVE MARKETING RESEARCH

    OpenAIRE

    STOICA, Ivona

    2011-01-01

    The Internet has brought great benefits, revolutionizing the world of marketing instruments used for advertising and opening new areas such as search engine marketing and online marketing research; it mediates among diverse individuals gathered in communities where borders no longer matter and information is immeasurable. The impact of Internet use in marketing research lies in collecting data through online quantitative surveys. This paper aims to reveal the importance of online quantitat...

  9. Using Local Data To Advance Quantitative Literacy

    OpenAIRE

    Stephen Sweet; Susanne Morgan; Danette Ifert Johnson

    2008-01-01

    In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in...

  10. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  11. Quantitative microbial ecology through stable isotope probing.

    Science.gov (United States)

    Hungate, Bruce A; Mau, Rebecca L; Schwartz, Egbert; Caporaso, J Gregory; Dijkstra, Paul; van Gestel, Natasja; Koch, Benjamin J; Liu, Cindy M; McHugh, Theresa A; Marks, Jane C; Morrissey, Ember M; Price, Lance B

    2015-11-01

    Bacteria grow and transform elements at different rates, and as yet, quantifying this variation in the environment is difficult. Determining isotope enrichment with fine taxonomic resolution after exposure to isotope tracers could help, but there are few suitable techniques. We propose a modification to stable isotope probing (SIP) that enables the isotopic composition of DNA from individual bacterial taxa after exposure to isotope tracers to be determined. In our modification, after isopycnic centrifugation, DNA is collected in multiple density fractions, and each fraction is sequenced separately. Taxon-specific density curves are produced for labeled and nonlabeled treatments, from which the shift in density for each individual taxon in response to isotope labeling is calculated. Expressing each taxon's density shift relative to that taxon's density measured without isotope enrichment accounts for the influence of nucleic acid composition on density and isolates the influence of isotope tracer assimilation. The shift in density translates quantitatively to isotopic enrichment. Because this revision to SIP allows quantitative measurements of isotope enrichment, we propose to call it quantitative stable isotope probing (qSIP). We demonstrated qSIP using soil incubations, in which soil bacteria exhibited strong taxonomic variations in (18)O and (13)C composition after exposure to [(18)O]water or [(13)C]glucose. The addition of glucose increased the assimilation of (18)O into DNA from [(18)O]water. However, the increase in (18)O assimilation was greater than expected based on utilization of glucose-derived carbon alone, because the addition of glucose indirectly stimulated bacteria to utilize other substrates for growth. This example illustrates the benefit of a quantitative approach to stable isotope probing.
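The per-taxon density-shift computation described above can be sketched as follows. This is a minimal illustration only: the fraction densities, abundances, and function name are made up for the example, not taken from the qSIP study.

```python
def weighted_mean_density(densities, abundances):
    # Mean buoyant density of one taxon, weighted by its relative
    # abundance in each density fraction of the gradient.
    total = sum(abundances)
    return sum(d * a for d, a in zip(densities, abundances)) / total

# Hypothetical per-fraction data for a single taxon: gradient
# densities in g/mL and the taxon's sequence abundance per fraction.
fractions = [1.690, 1.700, 1.710, 1.720]
unlabeled_abund = [10, 40, 40, 10]
labeled_abund = [5, 25, 45, 25]   # abundance shifts toward heavier fractions

d_unlabeled = weighted_mean_density(fractions, unlabeled_abund)
d_labeled = weighted_mean_density(fractions, labeled_abund)

# Expressing the shift relative to the taxon's own unlabeled density
# isolates tracer assimilation from nucleic-acid-composition effects.
shift = d_labeled - d_unlabeled
relative_shift = shift / d_unlabeled
```

The relative shift would then be translated into isotopic enrichment via the calibration the authors describe.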

  12. Quantitatively Probing the Al Distribution in Zeolites

    Energy Technology Data Exchange (ETDEWEB)

    Vjunov, Aleksei; Fulton, John L.; Huthwelker, Thomas; Pin, Sonia; Mei, Donghai; Schenter, Gregory K.; Govind, Niranjan; Camaioni, Donald M.; Hu, Jian Z.; Lercher, Johannes A.

    2014-06-11

    The degree of substitution of Si4+ by Al3+ in the oxygen-terminated tetrahedra (Al T-sites) of zeolites determines the concentration of ion-exchange and Brønsted acid sites. As the location of the tetrahedra and the associated subtle variations in bond angles influence the acid strength, quantitative information about Al T-sites in the framework is critical to rationalize catalytic properties and to design new catalysts. A quantitative analysis is reported that uses a combination of extended X-ray absorption fine structure (EXAFS) analysis and 27Al MAS NMR spectroscopy supported by DFT-based molecular dynamics simulations. To discriminate individual Al atoms, sets of ab initio EXAFS spectra for various T-sites are generated from DFT-based molecular dynamics simulations allowing quantitative treatment of the EXAFS single- and multiple-photoelectron scattering processes out to 3-4 atom shells surrounding the Al absorption center. It is observed that identical zeolite types show dramatically different Al-distributions. A preference of Al for T-sites that are part of one or more 4-member rings in the framework over those T-sites that are part of only 5- and 6-member rings in the HBEA150 sample has been determined from a combination of these methods. This work was supported by the U. S. Department of Energy (DOE), Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences.

  13. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  14. Quantitative Prediction for Deep Mineral Exploration

    Institute of Scientific and Technical Information of China (English)

    Zhao Pengda; Cheng Qiuming; Xia Qinglin

    2008-01-01

    On reviewing the characteristics of deep mineral exploration, this article elaborates on the necessity of employing quantitative prediction to reduce uncertainty. This uncertainty arises from the complexity of mineral deposit formation environments and mineralization systems, which grows with exploration depth, and from the incompleteness of geo-information derived from limited direct observation. The authors wish to share the idea of the "seeking difference" principle in addition to the "similar analogy" principle in deep mineral exploration; the focus is especially on new ores at depth, either in areas with discovered shallow mineral deposits or in new areas where there are no sufficient mineral deposit models for comparison. An on-going research project, involving Sn and Cu mineral deposit quantitative prediction in the Gejiu (个旧) area of Yunnan (云南) Province, China, is briefly introduced to demonstrate how the "three-component" (geoanomaly-mineralization diversity-mineral deposit spectrum) theory and non-linear method series, in conjunction with advanced GIS technology, can be applied in multi-scale and multi-task deep mineral prospecting and quantitative mineral resource assessment.

  15. The use of quantitative medical sociology.

    Science.gov (United States)

    Blane, David

    2003-01-01

    The present article reviews, in relation to quantitative work on the social structure, papers published in Sociology of Health and Illness during its first 25 years. Each issue published during the years 1979-2002 has been examined; and quantitative papers, relating to various aspects of the social structure, have been identified. Such papers are found to have formed a minor but substantively significant theme within the Journal. These contributions situate the journal between sociology and social epidemiology. Articles in the Journal, for example, have been part of sociological debates about the measurement of social class, and of social epidemiological debates about the relationship between income distribution and population health. The contribution of Sociology of Health and Illness to a number of such debates is reviewed. The article concludes that the present situation, in particular the intellectual crisis in social epidemiology and social science investment in large data sets, gives the Journal the chance to build on this distinguished tradition by encouraging, through its publication policy, the further development of quantitative medical sociology.

  16. Quantitative filter forensics for indoor particle sampling.

    Science.gov (United States)

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Assessment of central venous catheter-associated infections using semi-quantitative or quantitative culture methods

    Directory of Open Access Journals (Sweden)

    E. L. Pizzolitto

    2009-01-01

    Full Text Available

    Semiquantitative (Maki) and quantitative (Brun-Buisson) culture techniques were employed in the diagnosis of catheter-related bloodstream infections (CRBSI) in patients with a short-term central venous catheter (inserted for <30 days). The diagnosis of CRBSI was based on the results of semiquantitative and quantitative culture of material from the removed catheters. Catheter tips (118 from 100 patients) were evaluated by both methods. Semiquantitative analysis revealed 34 catheters (28.8%) colonized by ≥15 colony-forming units (cfu), while quantitative cultures (34 catheters, 28.8%) showed the growth of ≥10³ cfu/mL. Bacteremia was confirmed in four patients by isolating microorganisms of identical species from both catheters and blood samples. Using the semiquantitative culture technique on short-term central venous catheter tips, we have shown that with a cut-off level of ≥15 cfu, the technique had 100.0% sensitivity, 68.4% specificity, 25.0% positive predictive value (PPV) and 100.0% negative predictive value (NPV), 71.4% efficiency and a prevalence of 9.5%. The quantitative method, with a cut-off limit of ≥10³ cfu/mL, gave identical values: sensitivity 100.0%, specificity 68.4%, positive predictive value (PPV) 25.0%, negative predictive value (NPV) 100.0%, efficiency 71.4% and prevalence 9.5%. We concluded that the semiquantitative and quantitative culture methods, evaluated in parallel for the first time in Brazil, have similar sensitivity and specificity. Keywords: central venous catheter; semi-quantitative culture; quantitative culture; catheter-related bacteremia.
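The performance figures reported above follow from the standard 2×2 contingency-table definitions. A minimal sketch, using hypothetical counts chosen only to be consistent with the reported percentages (the abstract does not give the per-catheter breakdown):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard 2x2 diagnostic-test metrics, returned as fractions."""
    n = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "efficiency": (tp + tn) / n,     # overall accuracy
        "prevalence": (tp + fn) / n,
    }

# Hypothetical counts consistent with the figures quoted above:
# 4 true positives, 12 false positives, 26 true negatives, 0 false negatives.
m = diagnostic_metrics(tp=4, fp=12, tn=26, fn=0)
```

With these counts, sensitivity and NPV are 100%, specificity ≈68.4%, PPV 25%, efficiency ≈71.4% and prevalence ≈9.5%, matching the reported values for both culture methods.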

  18. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  19. Developing quantitative tools for measuring aspects of prisonization

    DEFF Research Database (Denmark)

    Kjær Minke, Linda

    2013-01-01

    The article describes and discusses the preparation and completion of a quantitative study among prison officers and prisoners.

  20. Impact of reconstruction parameters on quantitative I-131 SPECT

    NARCIS (Netherlands)

    van Gils, C A J; Beijst, C; van Rooij, R; de Jong, H W A M

    2016-01-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons however render quantitative SPECT reconstruction challenging, potentially requiring accurate cor

  2. Quantitative wearable sensors for objective assessment of Parkinson's disease

    NARCIS (Netherlands)

    Maetzler, W.; Domingos, J.; Srulijes, K.; Ferreira, J.J.; Bloem, B.R.

    2013-01-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology

  3. Quantitative imaging of turbulent and reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Paul, P.H. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    Quantitative digital imaging, using planar laser light scattering techniques, is being developed for the analysis of turbulent and reacting flows. Quantitative image data, implying both a direct relation to flowfield variables as well as sufficient signal and spatial dynamic range, can be readily processed to yield two-dimensional distributions of flowfield scalars and in turn two-dimensional images of gradients and turbulence scales. Much of the development of imaging techniques to date has concentrated on understanding the requisite molecular spectroscopy and collision dynamics to be able to determine how flowfield variable information is encoded into the measured signal. From this standpoint the image is seen as a collection of single point measurements. The present effort aims at realizing necessary improvements in signal and spatial dynamic range, signal-to-noise ratio and spatial resolution in the imaging system, as well as developing excitation/detection strategies which provide for a quantitative measure of particular flowfield scalars. The standard camera used for the study is an intensified CCD array operated in a conventional video format. The design of the system was based on detailed modeling of signal and image transfer properties of fast UV imaging lenses, image intensifiers and CCD detector arrays. While this system is suitable for direct scalar imaging, derived quantities (e.g. temperature or velocity images) require an exceptionally wide dynamic range imaging detector. To apply these diagnostics to reacting flows also requires a very fast shuttered camera. The authors have developed and successfully tested a new type of gated low-light level detector. This system relies on fast switching of a proximity-focused image diode that is directly fiber-optic coupled to a cooled CCD array. Tests on this new detector show significant improvements in detection limit, dynamic range and spatial resolution as compared to microchannel plate intensified arrays.

  4. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  5. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  6. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Full Text Available

    Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  7. Classical particle exchange: a quantitative treatment

    CERN Document Server

    Lancaster, Jarrett L; Titus, Aaron P

    2015-01-01

    The "classic" analogy of classical repulsive interactions via exchange of particles is revisited with a quantitative model and analyzed. This simple model based solely upon the principle of momentum conservation yields a nontrivial, conservative approximation at low energies while also including a type of "relativistic" regime in which the conservative formulation breaks down. Simulations are presented which are accessible to undergraduate students at any level in the physics curriculum as well as analytic treatments of the various regimes which should be accessible to advanced undergraduate physics majors.

  8. Qualitative and quantitative reasoning about thermodynamics

    Science.gov (United States)

    Skorstad, Gordon; Forbus, Ken

    1989-01-01

    One goal of qualitative physics is to capture the tacit knowledge of engineers and scientists. It is shown how Qualitative Process theory can be used to express concepts of engineering thermodynamics. In particular, it is shown how to integrate qualitative and quantitative knowledge to solve textbook problems involving thermodynamic cycles, such as gas turbine plants and steam power plants. These ideas were implemented in a program called SCHISM. Its analysis of a sample textbook problem is described and plans for future work are discussed.

  9. Quantitative methods in classical perturbation theory.

    Science.gov (United States)

    Giorgilli, A.

    Poincaré proved that the series commonly used in celestial mechanics are typically non-convergent, although their usefulness is generally evident. Recent work in perturbation theory has shed light on this conjecture of Poincaré, bringing into evidence that the series of perturbation theory, although non-convergent in general, nevertheless furnish valuable approximations to the true orbits for a very long time, which in some practical cases can be comparable with the age of the universe. The aim of the author's paper is to introduce the quantitative methods of perturbation theory which allow one to obtain such powerful results.

  10. Japanese Onomatopoeic Expressions with Quantitative Meaning

    Directory of Open Access Journals (Sweden)

    Nataliia Vitalievna KUTAFEVA

    2015-06-01

    Full Text Available

    The grammatical category of quantity is absent in the Japanese language, but there are many different grammatical, lexical, derivational and morphological modes of expressing quantity. This paper provides an analysis of the lexical mode of expression of quantitative meanings and their semantics with the help of onomatopoeic (giongo) and mimetic (gitaigo) words in the Japanese language. Based on the analysis, we have distinguished the following semantic groups: mimetic words, (A) existence of some (large or small) quantity (things, phenomena and people), (B) degree of change of quantity; and onomatopoeic words, (A) a single sound, (B) repetitive sounds.

  11. Quantitative Outgassing studies in DC Electrical breakdown

    CERN Document Server

    Levinsen, Yngve Inntjore; Calatroni, Sergio; Taborelli, Mauro; Wünsch, Walter

    2010-01-01

    Breakdown in the accelerating structures sets an important limit to the performance of the CLIC linear collider. Vacuum degradation and subsequent beam instability are possible outcomes of a breakdown if too much gas is released from the cavity surface. Quantitative data of gas released by breakdowns are provided for copper (milled Cu-OFE, as-received and heat-treated), and molybdenum. These data are produced in a DC spark system based on a capacitance charged at fixed energy, and will serve as a reference for the vacuum design of the CLIC accelerating structures.

  12. Unraveling pancreatic islet biology by quantitative proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Jianying; Dann, Geoffrey P.; Liew, Chong W.; Smith, Richard D.; Kulkarni, Rohit N.; Qian, Weijun

    2011-08-01

    The pancreatic islets of Langerhans play a critical role in maintaining blood glucose homeostasis by secreting insulin and several other important peptide hormones. Impaired insulin secretion due to islet dysfunction is linked to the pathogenesis underlying both Type 1 and Type 2 diabetes. Over the past 5 years, emerging proteomic technologies have been applied to dissect the signaling pathways that regulate islet functions and gain an understanding of the mechanisms of islet dysfunction relevant to diabetes. Herein, we briefly review some of the recent quantitative proteomic studies involving pancreatic islets geared towards gaining a better understanding of islet biology relevant to metabolic diseases.

  13. Quantitative Radiological Diagnosis Of The Temporomandibular Joint

    Science.gov (United States)

    Jordan, Steven L.; Heffez, Leslie B.

    1989-05-01

    Recent impressive technological advances in imaging techniques for the human temporomandibular (tm) joint, and in enabling geometric algorithms have outpaced diagnostic analyses. The authors present a basis for systematic quantitative diagnoses that exploit the imaging advancements. A reference line, coordinate system, and transformations are described that are appropriate for tomography of the tm joint. These yield radiographic measurements (disk displacement) and observations (beaking of radiopaque dye and disk shape) that refine diagnostic classifications of anterior displacement of the condylar disk. The relevance of these techniques has been clinically confirmed. Additional geometric invariants and procedures are proposed for future clinical verification.

  14. Whole genome approaches to quantitative genetics.

    Science.gov (United States)

    Visscher, Peter M

    2009-06-01

    Apart from parent-offspring pairs and clones, relative pairs vary in the proportion of the genome that they share identical by descent. In the past, quantitative geneticists have used the expected value of sharing genes by descent to estimate genetic parameters and predict breeding values. With the possibility to genotype individuals for many markers across the genome it is now possible to empirically estimate the actual relationship between relatives. We review some of the theory underlying the variation in genetic identity, show applications to estimating genetic variance for height in humans and discuss other applications.

  15. Experimental Studies of quantitative evaluation using HPLC

    Directory of Open Access Journals (Sweden)

    Ki Rok Kwon

    2005-06-01

    Full Text Available

    Methods: This study was conducted to carry out quantitative evaluation using HPLC. Content analysis was done using HPLC. Results: According to HPLC analysis, each BVA-1 contained approximately 0.36 ㎍ of melittin, and BVA-2 contained approximately 0.54 ㎍ of melittin. But the volume of coating was so minute that slight differences exist between individual needles. Conclusion: The above results indicate that bee venom acupuncture can complement the shortcomings of syringe usage as a part of Oriental medicine treatment, but extensive research should be done for further verification.

  16. LC-MS systems for quantitative bioanalysis.

    Science.gov (United States)

    van Dongen, William D; Niessen, Wilfried M A

    2012-10-01

    LC-MS has become the method of choice in small-molecule drug bioanalysis (molecular mass …). Triple quadrupole MS is the established bioanalytical technique due to its unprecedented selectivity and sensitivity, but high-resolution accurate-mass MS is recently gaining ground due to its ability to provide simultaneous quantitative and qualitative analysis of drugs and their metabolites. This article discusses current trends in the field of bioanalytical LC-MS (until September 2012), and provides an overview of currently available commercial triple quadrupole MS and high-resolution LC-MS instruments as applied for the bioanalysis of small-molecule and biopharmaceutical drugs.

  17. Quantitative Assessment of the IT Agile Transformation

    Directory of Open Access Journals (Sweden)

    Orłowski Cezary

    2017-03-01

    Full Text Available

    The aim of this paper is to present the quantitative perspective of the agile transformation processes in IT organisations. The phenomenon of agile transformation has become a complex challenge for IT organisations, since it has not been analysed in detail so far. There is no research on the readiness of IT organisations to realise agile transformation processes, and such processes have also proved to be uncontrolled. Therefore, to minimise the risk of failure in realising transformation processes, it is necessary to monitor them. It is also necessary to identify and analyse such processes to ensure their continuous character.

  18. Towards quantitative measures in applied ontology

    CERN Document Server

    Hoehndorf, Robert; Gkoutos, Georgios V

    2012-01-01

    Applied ontology is a relatively new field which aims to apply theories and methods from diverse disciplines such as philosophy, cognitive science, linguistics and formal logic to perform or improve domain-specific tasks. To support the development of effective research methodologies for applied ontology, we critically discuss the question of how its research results should be evaluated. We propose that results in applied ontology must be evaluated within their domain of application, based on some ontology-based task within the domain, and we discuss quantitative measures that would facilitate the objective evaluation and comparison of research results in applied ontology.

  19. I. Unbound serum gold: procedure for quantitation.

    Science.gov (United States)

    Lorber, A; Vibert, G J; Harralson, A F; Simon, T M

    1983-08-01

    The unbound fraction of many drugs appears to be the therapeutically active component. However, the major problem encountered in following unbound serum gold (UBSG) concentration during chrysotherapy has been the ability to quantitate such a small quantity of gold reliably without matrix interference. The methodology detailed here overcomes these difficulties and provides an effective means of monitoring the UBSG fraction during chrysotherapy. We have observed that the unbound fraction of gold dissipates quickly after gold sodium thiomalate administration and constitutes less than 2% of the total serum gold concentration.

  20. Statistical Uncertainty in Quantitative Neutron Radiography

    CERN Document Server

    Piegsa, Florian M

    2016-01-01

    We demonstrate a novel procedure to calibrate neutron detection systems commonly used in standard neutron radiography. This calibration allows determining the uncertainties due to Poisson-like neutron counting statistics for each individual pixel of a radiographic image. The obtained statistical errors are necessary in order to perform correct quantitative analysis. This fast and convenient method is applied to real data measured at the cold neutron radiography facility ICON at the Paul Scherrer Institute. Moreover, from the results the effective neutron flux at the beam line is determined.
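The Poisson counting statistics underlying this calibration give each pixel a standard deviation of √N for N detected neutrons, i.e. a relative error of 1/√N. A minimal sketch with an illustrative count (not data from the ICON facility):

```python
import math

def pixel_uncertainty(counts):
    # Poisson statistics: the standard deviation of a count N is sqrt(N),
    # so the relative statistical error of one pixel is 1/sqrt(N).
    sigma = math.sqrt(counts)
    return sigma, sigma / counts

# A hypothetical pixel registering 10,000 neutrons.
sigma, rel = pixel_uncertainty(10_000)
# sigma = 100.0 counts, i.e. a 1% relative statistical error
```

These per-pixel errors are what make error propagation possible in subsequent quantitative analysis of the radiographic image.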

  1. Milankovitch radiation variations: a quantitative evaluation.

    Science.gov (United States)

    Shaw, D M; Donn, W L

    1968-12-13

    A quantitative determination of changes in the surface temperature caused by variations in insolation calculated by Milankovitch has been made through the use of the thermodynamic model of Adem. Under extreme conditions, mean coolings of 3.1 degrees and 2.7 degrees C are obtained at latitudes 25 degrees and 65 degrees N, respectively, for the Milankovitch radiation cycles. At the sensitive latitude 65 degrees N, the mean cooling below the present temperature for each of the times of radiation minimum is only 1.4 degrees C. This result indicates that the Milankovitch effect is too small to have triggered glacial climates.

  2. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability.

  3. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    "Quantitative Methods for Management and Economics" is specially prepared for MBA students in India and all over the world. It starts from the basics, so that even a beginner without much mathematical sophistication can grasp the ideas, and then moves on to more complex and professional problems. Thus, ordinary students as well as "above average", i.e., bright and sincere, students will benefit equally from this book. Since most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  4. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer-reviewed articles that examined relationships between climate change and marine ecological variables. Of the articles with time series data (n = 186), 75% used statistics to test for a dependency of ecological variables on climate variables. We identified several common weaknesses in statistical approaches, including marginalizing other important non...

  5. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    OpenAIRE

    Robert L. Mayes; Franziska Peterson; Rachel Bonilla

    2013-01-01

    Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quanti...

  6. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    Science.gov (United States)

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  7. Quantitative Courses in a Liberal Education Program: A Case Study

    Science.gov (United States)

    Wismath, Shelly L.; Mackay, D. Bruce

    2012-01-01

    This essay argues for the importance of quantitative reasoning skills as part of a liberal education and describes the successful introduction of a mathematics-based quantitative skills course at a small Canadian university. Today's students need quantitative problem-solving skills, to function as adults, professionals, consumers, and citizens in…

  8. 1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology

    Science.gov (United States)

    Bray Speth, Elena; Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret…

  9. Quantitative Morphological and Biochemical Studies on Human Downy Hairs using 3-D Quantitative Phase Imaging

    CERN Document Server

    Lee, SangYun; Lee, Yuhyun; Park, Sungjin; Shin, Heejae; Yang, Jongwon; Ko, Kwanhong; Park, HyunJoo; Park, YongKeun

    2015-01-01

    This study presents morphological and biochemical findings on human downy arm hairs obtained using 3-D quantitative phase imaging techniques. 3-D refractive index tomograms and high-resolution 2-D synthetic aperture images of individual downy arm hairs were measured using a Mach-Zehnder laser interferometric microscope equipped with a two-axis galvanometer mirror. From the measured quantitative images, the biochemical and morphological parameters of downy hairs were non-invasively quantified, including the mean refractive index, volume, cylinder, and effective radius of individual hairs. In addition, the effects of hydrogen peroxide on individual downy hairs were investigated.

  10. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. Copyright © 2011 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
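The fractal dimension used in this record is conventionally estimated by box counting: cover the binarized image with grids of decreasing box size, count the boxes containing any foreground, and fit the slope of log N(s) versus log(1/s). The sketch below is a generic NumPy illustration of that estimator, not the authors' confocal pipeline; the box sizes are assumptions, and the companion lacunarity (gliding-box) statistic is omitted for brevity:

```python
import numpy as np

def box_count(img, size):
    """Number of size x size boxes containing any foreground pixel."""
    h, w = img.shape
    hb, wb = h // size, w // size
    trimmed = img[:hb * size, :wb * size]
    blocks = trimmed.reshape(hb, size, wb, size)
    return np.count_nonzero(blocks.any(axis=(1, 3)))

def fractal_dimension(img, sizes=(2, 4, 8, 16)):
    """Box-counting dimension: slope of log N(s) vs log(1/s)."""
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                          np.log(counts), 1)
    return slope

# Sanity check: a filled 64x64 square is a 2-D object,
# so the estimated dimension should be ~2.
square = np.ones((64, 64), dtype=bool)
d = fractal_dimension(square)
```

Scar collagen lies between these extremes: a more space-filling, disorganized network yields a higher box-counting dimension than aligned parallel fibers.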

  11. Toward a quantitative approach to migrants integration

    Science.gov (United States)

    Barra, A.; Contucci, P.

    2010-03-01

    Migration phenomena and all the related issues, like the integration of different social groups, are intrinsically complex problems, since they depend strongly on several competing mechanisms such as economic factors and cultural differences. By identifying a few essential assumptions, and using the statistical mechanics of complex systems, we propose a novel quantitative approach that provides a minimal theory for these phenomena. We show that the competitive interactions in decision making between a population of N host citizens and P immigrants, a bi-partite spin glass, give rise to a social consciousness inside the host community in the sense of the associative memory of neural networks. The theory leads to a natural quantitative definition of migrants' "integration" inside the community. From the technical point of view, this minimal picture assumes as control parameters only general notions like the strength of the random interactions, the ratio between the sizes of the two parties, and the cultural influence. A few steps toward more refined models, which include a digression on the kinds of experiences felt, some structure on the random interaction topology (such as dilution, to avoid the plain mean-field approach), and correlations between the experiences felt by the two parties (biasing the distribution of the couplings), are discussed at the end, where we show the robustness of our approach.

  12. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O' HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
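The conversion from conductivity data to phase volume fraction mentioned in this record is commonly based on Maxwell's relation for a non-conducting dispersed phase in a conducting liquid; whether the authors' (modified) formula takes exactly this form is an assumption here. A minimal sketch:

```python
def maxwell_gas_fraction(sigma_mix, sigma_liquid):
    """Dispersed-phase volume fraction alpha from mixture conductivity via
    Maxwell's relation for a non-conducting dispersed phase:
        sigma_mix / sigma_liquid = 2 * (1 - alpha) / (2 + alpha)
    solved here for alpha."""
    s = sigma_mix / sigma_liquid
    return 2.0 * (1.0 - s) / (2.0 + s)

# Limiting cases: pure liquid (s = 1) gives alpha = 0;
# a fully insulating mixture (s -> 0) gives alpha -> 1.
alpha_pure = maxwell_gas_fraction(1.0, 1.0)
alpha_insulating = maxwell_gas_fraction(0.0, 1.0)
```

In an EIT reconstruction this mapping is applied pixel-wise to the recovered conductivity field to produce the radial volume-fraction profiles compared against GDT.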

  13. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan

  14. Quantitative fMRI and oxidative neuroenergetics.

    Science.gov (United States)

    Hyder, Fahmeed; Rothman, Douglas L

    2012-08-15

    The discovery of functional magnetic resonance imaging (fMRI) has greatly impacted neuroscience. The blood oxygenation level-dependent (BOLD) signal, using deoxyhemoglobin as an endogenous paramagnetic contrast agent, exposes regions of interest in task-based and resting-state paradigms. However the BOLD contrast is at best a partial measure of neuronal activity, because the functional maps obtained by differencing or correlations ignore the total neuronal activity in the baseline state. Here we describe how studies of brain energy metabolism at Yale, especially with (13)C magnetic resonance spectroscopy and related techniques, contributed to development of quantitative functional brain imaging with fMRI by providing a reliable measurement of baseline energy. This narrative takes us on a journey, from molecules to mind, with illuminating insights about neuronal-glial activities in relation to energy demand of synaptic activity. These results, along with key contributions from laboratories worldwide, comprise the energetic basis for quantitative interpretation of fMRI data. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Structural and quantitative analysis of Equisetum alkaloids.

    Science.gov (United States)

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by this HPLC-ESI-MS/MS method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and the alkaloid palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  16. Integrated and Quantitative Proteomics of Human Tumors.

    Science.gov (United States)

    Yakkioui, Y; Temel, Y; Chevet, E; Negroni, L

    2017-01-01

    Quantitative proteomics represents a powerful approach for the comprehensive analysis of proteins expressed under defined conditions. These properties have been used to investigate the proteome of disease states, including cancer. Applying proteomics to biomarker and therapeutic-target identification has become a major subject of study. In recent decades, technical advances in mass spectrometry have increased the capacity for protein identification and quantification. Moreover, the analysis of posttranslational modifications (PTMs), especially phosphorylation, has allowed large-scale identification of biological mechanisms. Even so, increasing evidence indicates that global protein quantification is often insufficient to explain the biology and has been shown to pose challenges in identifying new and robust biomarkers. As a consequence, to improve the accuracy of discoveries made using proteomics in human tumors, it is necessary to combine (i) robust and reproducible methods for sample preparation allowing statistical comparison, (ii) PTM analyses in addition to global proteomics for additional levels of knowledge, and (iii) the use of bioinformatics for interpreting protein lists. Herein, we present technical specifics for sample preparation involving isobaric tag labeling, TiO2-based phosphopeptide enrichment, and hydrazide-based glycopeptide purification, as well as the key points for the quantitative analysis and interpretation of the protein lists. The method is based on our experience with the analysis of tumors derived from hepatocellular carcinoma, chondrosarcoma, human embryonic intervertebral disk, and chordoma experiments.

  17. Transmission-disequilibrium tests for quantitative traits

    Energy Technology Data Exchange (ETDEWEB)

    Allison, D.B. [Columbia Univ. College of Physicians and Surgeons, New York, NY (United States)

    1997-03-01

    The transmission-disequilibrium test (TDT) of Spielman et al. is a family-based linkage-disequilibrium test that offers a powerful way to test for linkage between alleles and phenotypes that is either causal (i.e., the marker locus is the disease/trait allele) or due to linkage disequilibrium. The TDT is equivalent to a randomized experiment and, therefore, is resistant to confounding. When the marker is extremely close to the disease locus or is the disease locus itself, tests such as the TDT can be far more powerful than conventional linkage tests. To date, the TDT and most other family-based association tests have been applied only to dichotomous traits. This paper develops five TDT-type tests for use with quantitative traits. These tests accommodate either unselected sampling or sampling based on selection of phenotypically extreme offspring. Power calculations are provided and show that, when a candidate gene is available, (1) these TDT-type tests are at least an order of magnitude more efficient than two common sib-pair tests of linkage; (2) extreme sampling results in substantial increases in power; and (3) if the most extreme 20% of the phenotypic distribution is selectively sampled, across a wide variety of plausible genetic models, quantitative-trait loci explaining as little as 5% of the phenotypic variation can be detected at the .0001 α level with <300 observations. 57 refs., 2 figs., 5 tabs.

  18. A quantitative approach to painting styles

    Science.gov (United States)

    Vieira, Vilson; Fabbri, Renato; Sbrissa, David; da Fontoura Costa, Luciano; Travieso, Gonzalo

    2015-01-01

    This research extends a method previously applied to music and philosophy (Vilson Vieira et al., 2012), representing the evolution of art as a time series in which relations like dialectics are measured quantitatively. For that, a corpus of paintings by 12 well-known artists from baroque and modern art is analyzed. A set of 99 features is extracted and the features which most contributed to the classification of painters are selected. The projection space obtained provides the basis for the analysis of measurements. These quantitative measures underlie revealing observations about the evolution of painting styles, especially when compared with other humanities fields already analyzed: while music evolved along a master-apprentice tradition (high dialectics) and philosophy by opposition, painting presents another pattern: constantly increasing skewness, low opposition between members of the same movement, and opposition peaks in the transitions between movements. Differences between the baroque and modern movements are also observed in the projected "painting space": while the baroque paintings form an overlapping cluster, the modern paintings overlap less and are spread more widely in the projection than their baroque counterparts. This finding suggests that baroque painters shared an aesthetics while modern painters tend to "break rules" and develop their own styles.

  19. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does nothing to link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and the database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
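The linking scheme this record describes, a unique metadata key shared by each Basic Event and its data-source record, can be sketched with plain dictionaries. All names, keys, and rates below are invented for illustration and are not the authors' schema:

```python
# Hypothetical "database" of failure-rate sources keyed by a unique
# metadata field (here "DS-0042"), mirrored in each Basic Event.
data_sources = {
    "DS-0042": {
        "source": "vendor reliability report (hypothetical)",
        "base_rate": 1.2e-6,       # failures per hour
        "stress_factor": 2.0,      # e.g. duty-cycle stressing
    },
}

basic_events = [
    {"name": "valve_fails_to_open", "data_key": "DS-0042"},
]

def resolve_rate(event):
    """Look up the event's source record via the shared metadata key and
    apply the manipulations stored with the data (traceable end to end)."""
    rec = data_sources[event["data_key"]]
    return rec["base_rate"] * rec["stress_factor"]

rate = resolve_rate(basic_events[0])
```

The same idea works identically in a spreadsheet: one column holds the metadata key in the model sheet, and a lookup pulls the rate and its manipulations from the source sheet, so every Basic Event stays traceable to its data.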

  20. Review paper. Quantitative methods in neuropathology.

    Science.gov (United States)

    Armstrong, Richard A

    2010-01-01

    The last decade has seen a considerable increase in the application of quantitative methods in the study of histological sections of brain tissue and especially in the study of neurodegenerative disease. These disorders are characterised by the deposition and aggregation of abnormal or misfolded proteins in the form of extracellular protein deposits such as senile plaques (SP) and intracellular inclusions such as neurofibrillary tangles (NFT). Quantification of brain lesions and studying the relationships between lesions and normal anatomical features of the brain, including neurons, glial cells, and blood vessels, has become an important method of elucidating disease pathogenesis. This review describes methods for quantifying the abundance of a histological feature such as density, frequency, and 'load' and the sampling methods by which quantitative measures can be obtained including plot/quadrant sampling, transect sampling, and the point-quarter method. In addition, methods for determining the spatial pattern of a histological feature, i.e., whether the feature is distributed at random, regularly, or is aggregated into clusters, are described. These methods include the use of the Poisson and binomial distributions, pattern analysis by regression, Fourier analysis, and methods based on mapped point patterns. Finally, the statistical methods available for studying the degree of spatial correlation between pathological lesions and neurons, glial cells, and blood vessels are described.
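One of the simplest spatial-pattern statistics alluded to in this record is the variance-to-mean ratio (index of dispersion) of quadrat counts: approximately 1 for a random (Poisson) pattern, above 1 for clustering, below 1 for regularity. The sketch below is a generic illustration of that test, not a specific published neuropathology protocol, and the lesion counts are invented:

```python
import numpy as np

def index_of_dispersion(quadrat_counts):
    """Variance-to-mean ratio of lesion counts per quadrat (plot).
    ~1: random (Poisson); >1: clustered; <1: regular."""
    c = np.asarray(quadrat_counts, dtype=float)
    return c.var(ddof=1) / c.mean()

# Same total number of lesions, two spatial patterns:
regular = [3, 3, 3, 3, 3, 3]      # evenly spread across quadrats
clustered = [18, 0, 0, 0, 0, 0]   # concentrated in one hotspot
vmr_regular = index_of_dispersion(regular)
vmr_clustered = index_of_dispersion(clustered)
```

For formal inference, (n - 1) times the ratio can be compared against a chi-squared distribution with n - 1 degrees of freedom, which is the usual Poisson-randomness test for quadrat data.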

  1. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

    The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to yield low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small "stressed" plants are likely in fact well suited to local conditions. We thus offer a quantitative perspective on the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  2. Head-to-head comparison of quantitative and semi-quantitative ultrasound scoring systems for rheumatoid arthritis

    DEFF Research Database (Denmark)

    Terslev, Lene; Ellegaard, Karen; Christensen, Robin;

    2012-01-01

    To evaluate the reliability and agreement of semi-quantitative scoring (SQS) and quantitative scoring (QS) systems. To compare the two types of scoring system and investigate the construct validity for both scoring systems.

  3. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support rapid advances in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.
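A typical relative-quantification step with stable isotope labels is a per-peptide heavy/light intensity ratio followed by median centering, on the common assumption that most peptides are unchanged between conditions. The sketch below is a generic illustration of that step, not any specific published workflow, and the intensity pairs are invented:

```python
import math

def relative_abundance(pairs):
    """log2(heavy/light) per peptide, median-centered so the bulk of
    unchanged peptides sits at 0 (a common normalization assumption)."""
    logratios = [math.log2(heavy / light) for heavy, light in pairs]
    s = sorted(logratios)
    n = len(s)
    median = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    return [r - median for r in logratios]

# Hypothetical (heavy, light) intensity pairs for three peptides:
# 2-fold up, unchanged, 2-fold down.
ratios = relative_abundance([(200, 100), (100, 100), (50, 100)])
```

Absolute quantification differs only in the reference: the "heavy" channel is a spiked-in labeled standard of known concentration, so the ratio converts directly to an amount.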

  4. Is Quantitative HBsAg Measurement a Reliable Substitute for HBV DNA Quantitation?

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Mahdavi

    2015-07-01

    Conclusion: There are many factors affecting the correlation between serum HBV DNA copy number and HBsAg level, such as the HBV genotype, phase of infection, measurement methods, HBeAg status, and the drugs and types of treatment procedures used. Therefore, these factors should be considered in further studies dealing with the correlation between quantitative HBV DNA and HBsAg tests.

  5. Real time quantitative amplification detection on a microarray: towards high multiplex quantitative PCR.

    NARCIS (Netherlands)

    Pierik, A.; Moamfa, M; van Zelst, M.; Clout, D.; Stapert, H.; Dijksman, Johan Frederik; Broer, D.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that
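Quantification in qPCR, whether in tubes or on an array surface as in this record, commonly relies on threshold cycles (Ct) and the 2^(−ΔΔCt) method: expression of a target gene relative to a reference gene, comparing a sample against a control. This is a generic sketch of that standard calculation, not the authors' microarray-specific read-out, and the Ct values are invented:

```python
def fold_change(ct_target_sample, ct_ref_sample,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method: each earlier cycle at
    threshold corresponds to a doubling of starting template."""
    d_sample = ct_target_sample - ct_ref_sample      # dCt in sample
    d_control = ct_target_control - ct_ref_control   # dCt in control
    return 2.0 ** -(d_sample - d_control)            # 2^-ddCt

# Target crosses threshold 2 cycles earlier (relative to the reference
# gene) in the sample than in the control -> 4-fold up-regulation.
fc = fold_change(20.0, 18.0, 24.0, 20.0)
```

The 2^(−ΔΔCt) form assumes ~100% amplification efficiency for both assays; efficiency-corrected variants replace the base 2 with the measured efficiency.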

  6. A toolbox for rockfall Quantitative Risk Assessment

    Science.gov (United States)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years both for spatially distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in the collection of required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets, in the framework of the Safeland EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and of the related fragility curves, both as functions of block velocity and size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
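The "risk equation" these tools build on multiplies, for each scenario, the annual event probability, the probability of impact on the element at risk, its vulnerability (expected damage fraction, 0-1), and its value, and sums over scenarios to obtain expected annual cost. A minimal sketch with invented scenario numbers (not taken from the QuRAR or VulBlock tools):

```python
def expected_annual_cost(scenarios):
    """Expected annual cost: sum over scenarios of
    P(event) * P(impact | event) * vulnerability * element value."""
    return sum(p_event * p_impact * vuln * value
               for p_event, p_impact, vuln, value in scenarios)

# (annual event prob., impact prob., vulnerability 0-1, value in EUR)
risk = expected_annual_cost([
    (0.1, 0.5, 0.2, 1_000_000),   # frequent small blocks
    (0.01, 0.8, 0.9, 1_000_000),  # rare large block
])
```

In the spreadsheet tools the same sum is evaluated per building and per protection scenario, so mitigation options can be compared by the reduction in expected annual cost they buy.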

  7. Quantitative cell biology: the essential role of theory.

    Science.gov (United States)

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms when the theory works, or new discoveries when it does not.

  8. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful in quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily, using a relatively simple diagram to deduce prognosis.

  9. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...... by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for re- active systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  10. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  11. Quantitative self-powered electrochromic biosensors.

    Science.gov (United States)

    Pellitero, Miguel Aller; Guimerà, Anton; Kitsara, Maria; Villa, Rosa; Rubio, Camille; Lakard, Boris; Doche, Marie-Laure; Hihn, Jean-Yves; Javier Del Campo, F

    2017-03-01

    Self-powered sensors are analytical devices able to generate their own energy, either from the sample itself or from their surroundings. The conventional approaches rely heavily on silicon-based electronics, which results in increased complexity and cost, and prevents the broader use of these smart systems. Here we show that electrochromic materials can overcome the existing limitations by simplifying device construction and avoiding the need for silicon-based electronics entirely. Electrochromic displays can be built into compact self-powered electrochemical sensors that give quantitative information readable by the naked eye, simply controlling the current path inside them through a combination of specially arranged materials. The concept is validated by a glucose biosensor coupled horizontally to a Prussian blue display designed as a distance-meter proportional to (glucose) concentration. This approach represents a breakthrough for self-powered sensors, and extends the application of electrochromic materials beyond smart windows and displays, into sensing and quantification.

  12. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided......, allowing verification procedures to quantify judgements, on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  13. Quantitative Evaluation for Drawability of Sheet Metal

    Institute of Scientific and Technical Information of China (English)

    Zhiqing XIONG; Xuemei YANG

    2005-01-01

    A theoretical evaluation method is given for the drawability of sheet metal with normal anisotropy. The general solution for the cracking load in deep drawing is deduced, based on the three kinds of material hardening curve most in use. The distributions of stress and strain in the deformed region and the drawing force are obtained by a numerical method. The limiting drawing ratio is calculated through a computer-aided simulated test. Deep-drawing experiments on four kinds of sheet metal show that the relative errors between the predicted and experimental results for the cracking load and the limiting drawing ratio are within 5%. The drawability of common sheet metals can thus be quantitatively evaluated in precise terms by means of the theory and method advanced in this paper.

  14. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  15. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.

  16. Imaging Performance of Quantitative Transmission Ultrasound

    Directory of Open Access Journals (Sweden)

    Mark W. Lenox

    2015-01-01

    Full Text Available Quantitative Transmission Ultrasound (QTUS is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound.

  17. Quantitative Accelerated Life Testing of MEMS Accelerometers

    Directory of Open Access Journals (Sweden)

    Jean-Paul Collette

    2007-11-01

    Full Text Available Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as application-driven stress, because the tilt movement is a natural environment for devices used in automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the “worst case” being smaller than 10⁻⁷ h⁻¹.

  18. Proteinase K improves quantitative acylation studies.

    Science.gov (United States)

    Fränzel, Benjamin; Fischer, Frank; Steegborn, Clemens; Wolters, Dirk Andreas

    2015-01-01

    Acetylation is a common PTM of proteins but is still challenging to analyze. Only few acetylome studies have been performed to tackle this issue. Yet, the detection of acetylated proteins in complex cell lysates remains to be improved. Here, we present a proteomic approach with proteinase K as a suitable protease to identify acetylated peptides quantitatively. We first optimized the digestion conditions using an artificial system of purified bovine histones to find the optimal protease. Subsequently, the capability of proteinase K was demonstrated in complex HEK293 cell lysates. Finally, SILAC in combination with MudPIT was used to show that quantification with proteinase K is possible. In this study, we identified a total of 557 unique acetylated peptides originating from 633 acetylation sites.

  19. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  20. Qualitative versus Quantitative Research in Marketing

    Directory of Open Access Journals (Sweden)

    Russel W. Belk

    2013-04-01

    Full Text Available It is ironic that at a time when we have more quantitative data about consumers than ever before – so-called “big data,” scanner data, loyalty program purchase histories, trails of Internet searches and social media activity, and much more – businesses nevertheless increasingly desire qualitative information. I am not talking about the ubiquitous focus groups which have long been a staple in exploratory marketing and advertising research. Such focus groups continue to be used, but they are one of the weakest techniques in qualitative research. The sort of qualitative consumer research I have in mind instead includes ethnography, netnography, videography, qualitative data mining, projective methods, various types of observation, and depth interviews. What these techniques potentially provide is more naturalistic insight into how individual consumers and groups of consumers behave in everyday life.

  1. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation networks is proposed, which reflects the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, an efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the proposed method captures the real operating conditions of the transportation network as well as the effects of the main factors on network efficiency. We also find that the network efficiency and the importance values of the network components are both functions of demands and network structure in the transportation network.
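One widely used demand- and cost-aware efficiency measure of this general kind (the Nagurney–Qiang measure) can be sketched as below; whether it matches the redefinition proposed in this particular paper is an assumption, and the origin-destination data are made up for illustration.

```python
def network_efficiency(demands, min_costs):
    """Nagurney-Qiang-style efficiency: the mean, over origin-destination
    (OD) pairs, of demand divided by minimal travel cost on that pair.
    `demands` and `min_costs` map an OD label to a positive number."""
    return sum(demands[od] / min_costs[od] for od in demands) / len(demands)


def component_importance(demands, costs_with, costs_without):
    """Efficiency-oriented importance of a component (node or link): the
    relative drop in network efficiency when the component is removed and
    traffic reroutes at the (higher) costs in `costs_without`."""
    e = network_efficiency(demands, costs_with)
    return (e - network_efficiency(demands, costs_without)) / e


# Toy network: one OD pair; removing a link doubles the minimal travel cost.
imp = component_importance({"a-b": 10.0}, {"a-b": 2.0}, {"a-b": 4.0})
```

A critical link is then simply one whose removal produces a large relative efficiency drop.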

  2. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...

  3. Problems inherent to quantitative thermographic electrical inspections

    Science.gov (United States)

    Snell, John R., Jr.

    1995-03-01

    The primary value of using infrared thermography to inspect electrical systems is to find problems made apparent by their thermal differences. Thermographers have also begun collecting radiometric temperature data as quantitative imaging systems have become more reliable and portable. Throughout the industry the use of temperature data has become a primary means of prioritizing the severity of a problem. The validity of this premise is suspect for several reasons, including the lack of standard data collection methods; the often poor understanding of radiometric measurements by maintenance thermographers; field conditions that vary widely enough to defy standardization; and the almost total lack of scientific research on the relationship between heat and time with regard to the failure of the components being inspected. Several possible solutions to the problems raised, as well as other suggestions for improving the usefulness and reliability of qualitative inspections, are offered.

  4. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David;

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... on three fragrance ingredients in order to provide a practical assessment of the predictive value of the QRA approach. It is important to have data to assess that the methodology provides a robust approach for primary prevention of contact sensitization induction for fragrance ingredients identified...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  5. Quantitative relations between corruption and economic factors

    CERN Document Server

    Shao, Jia; Podobnik, Boris; Stanley, H Eugene

    2007-01-01

    We report quantitative relations between corruption level and economic factors, such as country wealth and foreign investment per capita, which are characterized by a power law spanning multiple scales of wealth and investments per capita. These relations hold for diverse countries, and also remain stable over different time periods. We also observe a negative correlation between level of corruption and long-term economic growth. We find similar results for two independent indices of corruption, suggesting that the relation between corruption and wealth does not depend on the specific measure of corruption. The functional relations we report have implications when assessing the relative level of corruption for two countries with comparable wealth, and for quantifying the impact of corruption on economic growth and foreign investments.

  6. Establishing the Quantitative Thinking Program at Macalester

    Directory of Open Access Journals (Sweden)

    David Bressoud

    2009-01-01

    Full Text Available In November 2005, the faculty of Macalester College voted to institute a graduation requirement in Quantitative Thinking (QT that is truly interdisciplinary. It currently draws on courses from thirteen departments including Anthropology, Economics, Geography, Political Science, Theater, Mathematics, Environmental Science, and Geology. This article describes the process that led to the creation of this program. It explains how we were able to get broad buy-in at the beginning and the long process of trial and error—informed by formative assessment—that was needed to refine the initial vision and shape it into a viable program that would be accepted by most of our faculty. The article concludes with a description of the program as it now exists, a discussion of our ongoing assessment of the program and its effectiveness, and a discussion of the lessons we learned in the process.

  7. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...
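The Fisher-score-weighted nearest-neighbour step described above can be sketched as follows; the descriptor vectors and weights are invented for illustration, not taken from the paper.

```python
import math


def fisher_weighted_distance(x, y, fisher_scores):
    """Weighted Euclidean distance between two image-content descriptor
    vectors, where each descriptor is scaled by its Fisher discriminant
    score (more informative descriptors count more)."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(fisher_scores, x, y)))


def nearest_neighbour(query, gallery, fisher_scores):
    """Index of the gallery descriptor vector most similar to `query`
    under the Fisher-weighted distance."""
    dists = [fisher_weighted_distance(query, g, fisher_scores) for g in gallery]
    return dists.index(min(dists))


# Toy 2-descriptor example: the weighting can change which neighbour wins.
gallery = [(0.0, 0.0), (1.0, 1.0)]
idx = nearest_neighbour((1.0, 0.0), gallery, (1.0, 10.0))
```

Note how up-weighting the second descriptor penalizes mismatches in it, so the choice of Fisher scores directly shapes the similarity structure among mergers.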

  8. Metrology Standards for Quantitative Imaging Biomarkers.

    Science.gov (United States)

    Sullivan, Daniel C; Obuchowski, Nancy A; Kessler, Larry G; Raunig, David L; Gatsonis, Constantine; Huang, Erich P; Kondratovich, Marina; McShane, Lisa M; Reeves, Anthony P; Barboriak, Daniel P; Guimaraes, Alexander R; Wahl, Richard L

    2015-12-01

    Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies.

  9. Building the next generation of quantitative biologists.

    Science.gov (United States)

    Pattin, Kristine A; Greene, Anna C; Altman, Russ B; Hunter, Lawrence E; Ross, David A; Foster, James A; Moore, Jason H

    2014-01-01

    Many colleges and universities across the globe now offer bachelor's, master's, and doctoral degrees, along with certificate programs, in bioinformatics. While there is some consensus surrounding curricular competencies, programs vary greatly in their core foci, with some leaning heavily toward the biological sciences and others toward quantitative areas. This allows prospective students to choose a program that best fits their interests and career goals. In the digital age, most scientific fields are facing an enormous growth of data, and as a consequence, the goals and challenges of bioinformatics are rapidly changing; this requires that bioinformatics education also change. In this workshop, we seek to ascertain current trends in bioinformatics education by asking the question, "What are the core competencies all bioinformaticians should have at the end of their training, and how successful have programs been in placing students in desired careers?"

  10. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  11. Combined qualitative and quantitative research designs.

    Science.gov (United States)

    Seymour, Jane

    2012-12-01

    Mixed methods research designs have been recognized as important in addressing complexity and are recommended particularly in the development and evaluation of complex interventions. This article reports a review of studies in palliative care published between 2010 and March 2012 that combine qualitative and quantitative approaches. A synthesis of approaches to mixed methods research taken in 28 examples of published research studies of relevance to palliative and supportive care is provided, using a typology based on a classic categorization put forward in 1992. Mixed-method studies are becoming more frequently employed in palliative care research and resonate with the complexity of the palliative care endeavour. Undertaking mixed methods research requires a sophisticated understanding of the research process and recognition of some of the underlying complexities encountered when working with different traditions and perspectives on issues of: sampling, validity, reliability and rigour, different sources of data and different data collection and analysis techniques.

  12. Interacting personalities: behavioural ecology meets quantitative genetics.

    Science.gov (United States)

    Dingemanse, Niels J; Araya-Ajoy, Yimen G

    2015-02-01

    Behavioural ecologists increasingly study behavioural variation within and among individuals in conjunction, thereby integrating research on phenotypic plasticity and animal personality within a single adaptive framework. Interactions between individuals (cf. social environments) constitute a major causative factor of behavioural variation at both of these hierarchical levels. Social interactions give rise to complex 'interactive phenotypes' and group-level emergent properties. This type of phenotype has intriguing evolutionary implications, warranting a cohesive framework for its study. We detail here how a reaction-norm framework might be applied to usefully integrate social environment theory developed in behavioural ecology and quantitative genetics. The proposed emergent framework facilitates firm integration of social environments in adaptive research on phenotypic characters that vary within and among individuals.

  13. Quantitative Testing of Defect for Gun Barrels

    Institute of Scientific and Technical Information of China (English)

    WANG Chang-long; JI Feng-zhu; WANG Jin; CHEN Zheng-ge

    2007-01-01

    The magnetic flux leakage (MFL) method is commonly used in the nondestructive evaluation (NDE) of gun barrels. The key point of MFL testing is to estimate the crack geometry parameters from the measured signal. The magnetic leakage field can be analyzed by solving Maxwell's equations with the finite element method (FEM). The radial component of magnetic flux density is measured in MFL testing. The peak-to-peak value, the separation distance between the positive and negative peaks of the signal, and the lift-off value of the Hall sensor are used as the main features of each sample. This paper establishes multi-regression equations relating the width (and the depth) of a crack to these main characteristic values. The regression model is tested against the magnetic leakage data. The experimental results indicate that the regression equations can accurately predict the 2-D defect geometry parameters and that quantitative MFL testing can be achieved.
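A regression of defect geometry on signal features, simplified here to a single predictor with a closed-form ordinary-least-squares fit, can be sketched as follows. The calibration numbers are synthetic, generated from a known linear rule, and are not measurements from the paper.

```python
def linear_fit(xs, ys):
    """Closed-form ordinary least squares for y = a + b*x.
    Returns the intercept a and slope b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b


# Synthetic calibration data (illustrative only): crack width in mm versus
# peak-to-peak MFL amplitude in arbitrary units, following width = 0.1 + 0.4*pp.
pp_values = [1.0, 2.0, 3.0, 4.0]
widths = [0.5, 0.9, 1.3, 1.7]

a, b = linear_fit(pp_values, widths)
predicted_width = a + b * 2.5   # width estimate for a new signal
```

The paper's multi-regression version adds the peak separation and lift-off as further predictors, but the fitting principle is the same.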

  14. Toward Accurate and Quantitative Comparative Metagenomics

    Science.gov (United States)

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  15. Quantitative metric theory of continued fractions

    Indian Academy of Sciences (India)

    J Hančl; A Haddley; P Lertchoosakul; R Nair

    2016-05-01

    Quantitative versions of the central results of the metric theory of continued fractions were given primarily by C. De Vroedt. In this paper we give improvements of the bounds involved. For a real number $x$, let $$x=c_0+\dfrac{1}{c_1+\dfrac{1}{c_2+\dfrac{1}{c_3+\dfrac{1}{c_4+\ddots}}}}.$$ A sample result we prove is that given $\epsilon > 0$, $$(c_1(x)\cdots c_n(x))^{\frac{1}{n}}=\prod^\infty_{k=1}\left( 1+\frac{1}{k(k+2)}\right)^{\frac{\log k}{\log 2}}+o\left(n^{-\frac{1}{2}}(\log n)^{\frac{3}{2}}(\log \log n)^{\frac{1}{2}+\epsilon}\right)$$
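The partial quotients $c_1, c_2, \ldots$ and the geometric mean $(c_1 \cdots c_n)^{1/n}$ in the theorem are easy to compute exactly for rationals; a minimal sketch (the infinite product in the statement is Khinchin's constant, approximately 2.685):

```python
from fractions import Fraction
from math import prod


def continued_fraction(x, max_terms=30):
    """Return (c0, [c1, c2, ...]) for a positive Fraction x, stopping when
    the expansion terminates or max_terms partial quotients are found."""
    c0 = x.numerator // x.denominator        # integer part, the c0 above
    frac, quots = x - c0, []
    while frac != 0 and len(quots) < max_terms:
        frac = 1 / frac                      # Gauss map step, exact rationals
        c = frac.numerator // frac.denominator
        quots.append(c)
        frac -= c
    return c0, quots


def geometric_mean(quots):
    """(c1 * ... * cn) ** (1/n); for almost every real x this tends to
    Khinchin's constant (about 2.6854) as n grows, which the quoted
    result quantifies with an explicit error term."""
    return prod(quots) ** (1.0 / len(quots))


# Example: 649/200 = [3; 4, 12, 4].
c0, quots = continued_fraction(Fraction(649, 200))
```

Exact `Fraction` arithmetic avoids the floating-point drift that quickly corrupts partial quotients computed from decimal approximations.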

  16. The Quantitative Decision on Conflicting Objective

    Institute of Scientific and Technical Information of China (English)

    LIU Wei; ZHOU Yue-mei

    2004-01-01

    In view of the conflicts among multiple objectives in large decision systems, the problem of ranking all the objectives according to several standards is posed. Quantifying the value of each decision for the objectives according to the characteristics of the systems studied, with the frequency distribution, we construct quantitative indices, build up the superior-inferior comparison relation between any two decisions, and introduce a relation function which reflects not only the direction of the ordering but also the extent to which one decision is superior or inferior to another. By calculating the weight coefficients, we put forward several comparative coefficients and derive the superiority ranking of all the decisions over several conflicting objectives. Empirical analysis shows that the solution is precise and concise in dealing with complex decisions.

  17. Quantitative bioluminescence imaging of mouse tumor models.

    Science.gov (United States)

    Tseng, Jen-Chieh; Kung, Andrew L

    2015-01-05

    Bioluminescence imaging (BLI) has become an essential technique for preclinical evaluation of anticancer therapeutics and provides sensitive and quantitative measurements of tumor burden in experimental cancer models. For light generation, a vector encoding firefly luciferase is introduced into human cancer cells that are grown as tumor xenografts in immunocompromised hosts, and the enzyme substrate luciferin is injected into the host. Alternatively, the reporter gene can be expressed in genetically engineered mouse models to determine the onset and progression of disease. In addition to expression of an ectopic luciferase enzyme, bioluminescence requires oxygen and ATP, thus only viable luciferase-expressing cells or tissues are capable of producing bioluminescence signals. Here, we summarize a BLI protocol that takes advantage of advances in hardware, especially the cooled charge-coupled device camera, to enable detection of bioluminescence in living animals with high sensitivity and a large dynamic range.

  18. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  19. Significant new quantitative EEG patterns in fibromyalgia

    Directory of Open Access Journals (Sweden)

    Jorge Navarro López

    2015-12-01

    Full Text Available Background and Objectives: We analyzed the EEG recordings of a sample of fibromyalgia patients with the goal of identifying new, more objective indicators for the diagnosis and severity assessment of this pathology, and of establishing the relationship of these new indicators with different psychological and neuropsychiatric tests. Methods: We compared the EEG recordings of a group of 13 fibromyalgia patients with a normalized database built into the software of the equipment used (Neuronic), and also with a control group of 13 individuals; both groups were selected under the same inclusion-exclusion criteria. Patients and controls underwent quantitative EEG (eyes closed, according to the international 10-20 EEG system) and were specifically evaluated through various neuropsychiatric and psychological questionnaires. Results: We obtained the absolute powers of the quantitative EEG (QEEG) for the different electrode sites and frequency bands, determined the corresponding deviations from normal (Z-scores), and estimated various indicators and ratios, as well as correlations with the results of psychological tests. Interestingly, the ratios of theta and beta frequencies relative to alpha appear as among the most relevant indicators of the severity of the pathology; significant differences were also found in the peak frequency (maximum power per Hz) of the alpha band, and in the peak frequency of the total spectrum. Conclusions: The consistency of the abnormal EEG patterns of fibromyalgia patients revealed the presence of systemic dysfunction at the central nervous system level, beyond possible peripheral anomalies and specific tissue pathologies. Among the indicators and benchmarks obtained, the most important changes concern the theta, alpha, and beta frequencies, and still more significant were the values of their ratios in the comparison between patients and controls. The relative values of peak frequencies are also of

  20. Quantitative laryngeal electromyography: turns and amplitude analysis.

    Science.gov (United States)

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare these with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). In this retrospective case-control study, we performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We present the first interference pattern analysis of the TA-LCA in healthy adults and in patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
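    The turns-and-amplitude measurement lends itself to a small sketch: count direction reversals ("turns") whose swing from the last counted extremum exceeds a threshold, then average the amplitude between successive turns. The 100 µV threshold and the synthetic signal below are illustrative assumptions, not the study's acquisition settings.

```python
import numpy as np

def turns_analysis(signal_uv, turn_threshold_uv=100.0):
    """Count 'turns' (local extrema whose swing from the previously kept
    extremum exceeds a threshold) and the mean amplitude between
    successive turns, for one EMG epoch."""
    d = np.diff(signal_uv)
    # Indices of local extrema: points where the first difference changes sign.
    idx = np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0] + 1
    turns = []
    last = signal_uv[0]
    for e in signal_uv[idx]:
        if abs(e - last) >= turn_threshold_uv:
            turns.append(e)
            last = e
    amps = np.abs(np.diff(turns))
    mean_amp = float(amps.mean()) if amps.size else 0.0
    return len(turns), mean_amp

# Synthetic 4-phase epoch; over a 500 ms epoch this corresponds to 8 turns/s.
sig = np.array([0.0, 200.0, -200.0, 200.0, -200.0, 0.0])
n_turns, mean_amp = turns_analysis(sig)
print(n_turns, mean_amp)
```

Turns per second then follows by dividing the count by the epoch duration, and the log regression of mean amplitude per turn against turns can be fitted across epochs.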

  1. Quantitative proteomics reveals cellular targets of celastrol.

    Directory of Open Access Journals (Sweden)

    Jakob Hansen

    Full Text Available Celastrol, a natural substance isolated from plant extracts used in traditional Chinese medicine, has been extensively investigated as a possible drug for treatment of cancer, autoimmune diseases, and protein misfolding disorders. Although studies focusing on celastrol's effects in specific cellular pathways have revealed a considerable number of targets in a diverse array of in vitro models, there is an essential need for investigations that can provide a global view of its effects. To assess cellular effects of celastrol and to identify target proteins as biomarkers for monitoring treatment regimes, we performed large-scale quantitative proteomics in cultured human lymphoblastoid cells, a cell type that can be readily prepared from human blood samples. Celastrol substantially modified the proteome composition, and 158 of the close to 1800 proteins with robust quantitation showed at least a 1.5-fold change in protein levels. Up-regulated proteins play key roles in cytoprotection, with a prominent group involved in quality control and processing of proteins traversing the endoplasmic reticulum. Increased levels of proteins essential for the cellular protection against oxidative stress, including heme oxygenase 1, several peroxiredoxins and thioredoxins, as well as proteins involved in the control of iron homeostasis, were also observed. Specific analysis of the mitochondrial proteome strongly indicated that the mitochondrial association of certain antioxidant defense and apoptosis-regulating proteins increased in cells exposed to celastrol. Analysis of selected mRNA transcripts showed that celastrol activated several different stress response pathways, and dose-response studies furthermore showed that continuous exposure to sub-micromolar concentrations of celastrol is associated with reduced cellular viability and proliferation. The extensive catalog of regulated proteins presented here identifies numerous cellular effects of celastrol and constitutes

  2. Evolutionary Quantitative Genomics of Populus trichocarpa.

    Science.gov (United States)

    Porth, Ilga; Klápště, Jaroslav; McKown, Athena D; La Mantia, Jonathan; Guy, Robert D; Ingvarsson, Pär K; Hamelin, Richard; Mansfield, Shawn D; Ehlting, Jürgen; Douglas, Carl J; El-Kassaby, Yousry A

    2015-01-01

    Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. 
trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show relationship to climate
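    The QST-FST comparison rests on a variance-ratio formula, roughly QST = V_between / (V_between + 2 V_within), with V_within the additive genetic variance within populations. The sketch below uses plain one-way variance components on made-up trait data instead of the study's marker-inferred relatedness, so it is illustrative only.

```python
import numpy as np

def q_st(trait_by_pop):
    """Q_ST from a one-way decomposition: V_between / (V_between + 2 V_within).
    Treats within-population phenotypic variance as additive genetic variance,
    an assumption made here purely for illustration."""
    means = np.array([p.mean() for p in trait_by_pop])
    v_between = means.var(ddof=1)                             # among populations
    v_within = float(np.mean([p.var(ddof=1) for p in trait_by_pop]))
    return v_between / (v_between + 2.0 * v_within)

# Invented trait values for three populations of a hypothetical species.
pops = [np.array([10.0, 11.0, 10.5]),
        np.array([14.0, 15.0, 14.5]),
        np.array([10.2, 10.8, 10.4])]
q = q_st(pops)
print(q)  # compare against a neutral F_ST; much larger values suggest selection
```

A Q_ST significantly above the marker-based F_ST is the signature of diversifying selection referred to in the abstract; significance testing requires the sampling distributions of both quantities.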

  3. Quantitative assessment model for gastric cancer screening

    Institute of Scientific and Technical Information of China (English)

    Kun Chen; Wei-Ping Yu; Liang Song; Yi-Min Zhu

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out in 66 patients and 198 normal people, and the risk and protective factors of gastric cancer were determined, including heavy manual work; foods such as small yellow-fin tuna, dried small shrimps, squills, and crabs; mothers suffering from gastric diseases; spouse alive; and use of refrigerators and hot food. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, we selected factors significant in statistics and calculated a weight coefficient for each one by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric cancer fuzzy subset, a mathematical model for each subset was established, and we obtained a mathematical expression of attribute degree (AD). RESULTS: Based on the data of 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. With these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, a statistical test showed that the identification outcomes of the two different calculation methods were identical (P>0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible, and economic, and can be used to determine individual and population risks of gastric cancer.
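    The attribute-degree construction can be sketched as a weighted sum of binary risk and protective factors compared against a screening threshold. The factor names and weights below are invented for illustration; only the threshold range (0.17-0.20) comes from the abstract.

```python
# Invented factor weights for illustration; a negative weight marks a
# protective factor, a positive weight a risk factor.
WEIGHTS = {"heavy_manual_work": 0.30, "dried_small_shrimps": 0.25,
           "mother_gastric_disease": 0.25, "uses_refrigerator": -0.10,
           "eats_hot_food": 0.30}

def attribute_degree(subject):
    """Weighted sum of binary factor indicators, clipped to [0, 1]."""
    ad = sum(w * subject.get(f, 0) for f, w in WEIGHTS.items())
    return max(0.0, min(1.0, ad))

def screen(subject, threshold=0.20):
    """Flag a subject as high-risk when AD meets the threshold
    (the abstract reports thresholds of 0.20 and 0.17)."""
    return attribute_degree(subject) >= threshold

high_risk = screen({"heavy_manual_work": 1, "dried_small_shrimps": 1})
print(high_risk)  # AD = 0.30 + 0.25 = 0.55, above the 0.20 threshold
```

Sweeping the threshold trades sensitivity against specificity, which is how the paper arrives at two alternative cut-offs.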

  4. Quality control for quantitative geophysical logging

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Kyu; Hwang, Se Ho; Hwang, Hak Soo; Park, In Hwa [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    Despite the great availability of geophysical data obtained from boreholes, their interpretation is subject to significant uncertainties. Obtaining more accurate data with smaller statistical uncertainties requires more quantitative techniques in both log acquisition and interpretation. The long-term objective of this project is the development of techniques for both quality control of log measurements and quantitative interpretation. In the first year, the goals of the project include establishing the procedure of log acquisition using various tests, analysing the effect of logging-speed changes on the logging data, examining repeatability and reproducibility, analysing the filtering effect on the log measurements, and finally the zonation and correlation of single- and inter-well log data. For the establishment of the logging procedure, we have tested the multiple factors affecting depth accuracy. The factors are divided into two parts, human and mechanical, and include the zero setting of depth, the calculation of the offset for the sonde, the stretching of the cable, and measuring-wheel accuracy. We conclude that the error in depth setting results primarily from human factors, and in part from the stretching of the cable. The statistical fluctuation of log measurements increases with logging speed in zones of lower natural gamma. While logging speed is a minor issue in resource-exploration applications, logging should be run more slowly to reduce the statistical fluctuation of natural gamma when lithologic correlation is intended. The repeatability and reproducibility of logging measurements were also tested. The results of the repeatability test for the natural gamma sonde are qualitatively acceptable; in the reproducibility test, errors occur in the logging data between two operators and between successive trials. We conclude that the errors result from the
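    The speed-dependence of the statistical fluctuation follows from Poisson counting: the counts accumulated per depth sample scale inversely with logging speed, so the relative error grows as the square root of speed. A back-of-the-envelope sketch (count rate, sample interval, and speeds are assumed values, not from the report):

```python
import math

def relative_fluctuation(count_rate_cps, sample_interval_m, speed_m_per_min):
    """Relative 1-sigma counting error for one depth sample of a Poisson
    counting detector such as a natural-gamma sonde."""
    dwell_s = (sample_interval_m / speed_m_per_min) * 60.0  # seconds per sample
    n = count_rate_cps * dwell_s                            # expected counts
    return 1.0 / math.sqrt(n)

slow = relative_fluctuation(50.0, 0.1, 3.0)   # logging at 3 m/min
fast = relative_fluctuation(50.0, 0.1, 12.0)  # logging at 12 m/min
print(slow, fast)  # quadrupling the speed doubles the relative error
```

This is why low-count (low natural gamma) zones suffer most at high speed: the same speed increase hits an already small N.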

  5. Towards a quantitative OCT image analysis.

    Directory of Open Access Journals (Sweden)

    Marina Garcia Garrido

    Full Text Available Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Spectral-domain optical coherence tomography (OCT), confocal scanning-laser ophthalmoscopy (SLO), and fluorescein angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g., the visual streak in gerbils and the fovea in non-human primates) with respective regions of the human retina revealed multiple similarities. In a model of retinitis pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures, including specialized retinal regions. Our results highlight the
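    The profile-extraction step can be reproduced outside ImageJ with a few lines of array code: average the grey values across A-scans (columns) of a B-scan to obtain mean reflectivity per depth. This is a hedged sketch of the general idea; the synthetic image stands in for a real 8-bit OCT export.

```python
import numpy as np

def reflectivity_profile(bscan_u8):
    """Mean reflectivity per depth from a 2-D uint8 B-scan
    (rows = depth, columns = lateral A-scan positions)."""
    return bscan_u8.astype(float).mean(axis=1)

# Synthetic B-scan: one bright band (e.g. a plexiform layer) on a dark background.
img = np.zeros((6, 4), dtype=np.uint8)
img[2, :] = 200                      # highly reflective layer at depth index 2
profile = reflectivity_profile(img)
print(profile)
```

On real scans the profile would be taken over a region of interest and calibrated against the axial pixel spacing before comparison with histomorphometry.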

  6. Quantitative Wood Anatomy-Practical Guidelines.

    Science.gov (United States)

    von Arx, Georg; Crivellaro, Alan; Prendin, Angela L; Čufar, Katarina; Carrer, Marco

    2016-01-01

    Quantitative wood anatomy analyzes the variability of xylem anatomical features in trees, shrubs, and herbaceous species to address research questions related to plant functioning, growth, and environment. Among the more frequently considered anatomical features are lumen dimensions and wall thickness of conducting cells, fibers, and several ray properties. The structural properties of each xylem anatomical feature are mostly fixed once formed, and define to a large extent its functionality, including transport and storage of water, nutrients, sugars, and hormones, and the provision of mechanical support. The anatomical features can often be localized within an annual growth ring, which makes it possible to establish intra-annual past and present structure-function relationships and their sensitivity to environmental variability. However, there are many methodological challenges to handle when aiming to produce (large) data sets of xylem anatomical data. Here we describe the different steps from wood sample collection to xylem anatomical data, provide guidance and identify pitfalls, and present different image-analysis tools for the quantification of anatomical features, in particular conducting cells. We show that each data-production step, from sample collection in the field, through microslide preparation in the lab and image capturing through an optical microscope, to image analysis with specific tools, can readily introduce measurement errors of between 5 and 30% or more, and the magnitude usually increases as the anatomical features become smaller. Such measurement errors, if not avoided or corrected, may make it impossible to extract meaningful xylem anatomical data in light of the rather small range of variability in many anatomical features as observed, for example, within time series of individual plants. Following a strict protocol and quality control as proposed in this paper is thus mandatory to use quantitative data of xylem anatomical features as a powerful source for many

  8. Quantitative Susceptibility Mapping in Parkinson's Disease

    Science.gov (United States)

    Seiler, Stephan; Deistung, Andreas; Schweser, Ferdinand; Franthal, Sebastian; Homayoon, Nina; Katschnig-Winter, Petra; Koegl-Wallner, Mariella; Pendl, Tamara; Stoegerer, Eva Maria; Wenzel, Karoline; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen Rainer; Schmidt, Reinhold; Schwingenschuh, Petra

    2016-01-01

    Background: Quantitative susceptibility mapping (QSM) and R2* relaxation rate mapping have demonstrated increased iron deposition in the substantia nigra of patients with idiopathic Parkinson’s disease (PD). However, the findings in other subcortical deep gray matter nuclei are conflicting, and the sensitivity of QSM and R2* to morphological changes, as well as their relation to clinical measures of disease severity, has so far been investigated only sparsely. Methods: The local ethics committee approved this study and all subjects gave written informed consent. 66 patients with idiopathic Parkinson’s disease and 58 control subjects underwent quantitative MRI at 3T. Susceptibility and R2* maps were reconstructed from a spoiled multi-echo 3D gradient echo sequence. Mean susceptibilities and R2* rates were measured in subcortical deep gray matter nuclei and compared between patients with PD and controls, as well as related to clinical variables. Results: Compared to control subjects, patients with PD had increased R2* values in the substantia nigra. QSM also showed higher susceptibilities in patients with PD in the substantia nigra, nucleus ruber, thalamus, and globus pallidus. Magnetic susceptibility of several of these structures was correlated with the levodopa-equivalent daily dose (LEDD) and clinical markers of motor and non-motor disease severity (total MDS-UPDRS, MDS-UPDRS-I and II). Disease severity as assessed by the Hoehn & Yahr scale was correlated with magnetic susceptibility in the substantia nigra. Conclusion: The established finding of higher R2* rates in the substantia nigra was extended by QSM, which showed superior sensitivity for PD-related tissue changes in nigrostriatal dopaminergic pathways. QSM additionally reflected the levodopa dosage and disease severity. These results suggest a more widespread pathologic involvement and QSM as a novel means for its investigation, more sensitive than current MRI techniques. PMID:27598250

  9. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  10. Quantitative measurement of resist outgassing during exposure

    Science.gov (United States)

    Maxim, Nicolae; Houle, Frances A.; Huijbregtse, Jeroen; Deline, Vaughn R.; Truong, Hoa; van Schaik, Willem

    2009-03-01

    Determination of both the identity and quantity of species desorbing from photoresists during exposure at any wavelength - 248nm, 193nm and EUV - has proved to be very challenging, adding considerable uncertainty to the evaluation of risks posed by specific photoresists to exposure tool optics. Measurements using a variety of techniques for gas detection and solid film analysis have been reported but analytical results have not in general been easy to compare or even in apparent agreement, in part due to difficulties in establishing absolute calibrations. In this work we describe two measurement methods that can be used for any exposure wavelength, and show that they provide self-consistent quantitative outgassing data for 2 all-organic and 2 Si-containing 193 nm resists. The first method, based upon gas collection, uses two primary chromatographic techniques. Organic products containing C, S and Si are determined by collection of vapors emitted during exposure in a cold trap and analysis by Gas Chromatography-Flame Ionization Detector-Pulsed Flame Photometric Detector-Mass Spectrometry (GC-FID-PFPD-MS). Inorganic products such as SO2 are identified by adsorbent bed with analysis by Gas Particle-Ion Chromatography (GP-IC). The calibration procedure used provides reasonable accuracy without exhaustive effort. The second method analyzes the elemental concentrations in resist films before and after exposure by secondary ion mass spectrometry technique (SIMS), which requires only knowledge of the resist compositions to be quantitative. The extent of outgassing of C and S determined by the two methods is in good agreement for all 4 resists, especially when taking their fundamentally different characters into account. Overall, the gas collection techniques yielded systematically lower outgassing numbers than did SIMS, and the origins of the spread in values, which likely bracket the true values, as well as detection limits will be discussed. 
The data for Si were found to

  12. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  13. Treatment assessment of radiotherapy using MR functional quantitative imaging

    Institute of Scientific and Technical Information of China (English)

    Zheng Chang; Chunhao Wang

    2015-01-01

    Recent developments in magnetic resonance (MR) functional quantitative imaging have made it a potentially powerful tool for assessing treatment response in radiation therapy. With its ability to capture functional information on underlying tissue characteristics, MR functional quantitative imaging can be valuable in assessing treatment response and thus in optimizing therapeutic outcome. Various MR quantitative imaging techniques, including diffusion-weighted imaging, diffusion tensor imaging, MR spectroscopy, and dynamic contrast-enhanced imaging, have been investigated and found useful for the assessment of radiotherapy. However, various aspects, including data reproducibility, interpretation of biomarkers, image quality, and data analysis, impose challenges on the application of MR functional quantitative imaging in radiotherapy assessment. All of these challenging issues must be addressed to help us understand whether MR functional quantitative imaging is truly beneficial and contributes to the future development of radiotherapy. It is evident that individualized therapy is the future direction of patient care; MR functional quantitative imaging may serve as an indispensable tool in this promising direction.

  14. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
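    The three metrics can be combined into a simple priority score. The ordinal scales, example scenarios, and the combination rule below are assumptions for illustration, since the abstract describes the framework only qualitatively.

```python
# Hypothetical scenarios: (name, severity 1-5, likelihood 1-5,
# modeling difficulty 1-5; 5 = hardest to model quantitatively).
scenarios = [
    ("wake encounter on parallel approach", 5, 2, 2),
    ("runway incursion",                    5, 1, 4),
    ("minor taxiway scrape",                1, 3, 1),
]

def priority(sev, lik, diff):
    # High risk (severity x likelihood) raises priority; high modeling
    # difficulty lowers it, since quantitative analysis is less tractable.
    return (sev * lik) / diff

ranked = sorted(scenarios, key=lambda s: priority(*s[1:]), reverse=True)
print([name for name, *_ in ranked])
```

Any monotone combination rule would serve; the point of the framework is that modeling difficulty enters the prioritization at all, not the particular arithmetic.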

  15. What's in a Name? A Critical Review of Definitions of Quantitative Literacy, Numeracy, and Quantitative Reasoning

    Directory of Open Access Journals (Sweden)

    Gizem Karaali

    2016-01-01

    Full Text Available This article aims to bring together various threads in the eclectic literature that make up the scholarship around the theme of Quantitative Literacy. In investigating the meanings of terms like "quantitative literacy," "quantitative reasoning," and "numeracy," we seek common ground, common themes, common goals and aspirations of a community of practitioners. A decade ago, these terms were relatively new in the public sphere; today policy makers and accrediting agencies are routinely inserting them into general education conversations. Having good, representative, and perhaps even compact and easily digestible definitions of these terms might come in handy in public relations contexts as well as in other situations where practitioners need to measure and evaluate their own success or communicate their goals and practice to others. Finding such definitions is, as expected, a difficult task. We offer through our analysis a clarifying framework for practitioners looking to sharpen their definitions and for others who are not keen on finalizing definitions. More specifically, we argue that there is indeed a common thread among all the terms involved, that of a competence in interacting with myriad mathematical and statistical representations of the real world, in the contexts of daily life, work situations, and civic engagement. Furthermore, we propose that the knowledge content captured by the individual terms can be placed on a continuum (statistics-data-arithmetic-mathematics-logic).

  16. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches related to the issue of airport location are presented in this article, i.e. the question of optimizing such a choice and the issue of selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task; the latter involves ranking the possible variants. Due to their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one that currently has practical application. Results: Based on real-life examples, the authors present a multi-stage procedure, which renders it possible to solve the problem of airport location. Conclusions: Based on the overview of the literature of the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.

  17. The quantitative genetics of phenotypic robustness.

    Directory of Open Access Journals (Sweden)

    Hunter B Fraser

    Full Text Available Phenotypic robustness, or canalization, has been extensively investigated both experimentally and theoretically. However, it remains unknown to what extent robustness varies between individuals, and whether factors buffering environmental variation also buffer genetic variation. Here we introduce a quantitative genetic approach to these issues, and apply this approach to data from three species. In mice, we find suggestive evidence that for hundreds of gene expression traits, robustness is polymorphic and can be genetically mapped to discrete genomic loci. Moreover, we find that the polymorphisms buffering genetic variation are distinct from those buffering environmental variation. In fact, these two classes have quite distinct mechanistic bases: environmental buffers of gene expression are predominantly sex-specific and trans-acting, whereas genetic buffers are not sex-specific and often cis-acting. Data from studies of morphological and life-history traits in plants and yeast support the distinction between polymorphisms buffering genetic and environmental variation, and further suggest that loci buffering different types of environmental variation do overlap with one another. These preliminary results suggest that naturally occurring polymorphisms affecting phenotypic robustness could be abundant, and that these polymorphisms may generally buffer either genetic or environmental variation, but not both.

  18. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and general public safety. If the individual or societal risks are found intolerable compared with international standards, mitigation measures are recommended to reduce the risk associated with the operation to levels compatible with best industry practice. Quantitative risk analysis calculates the probability of occurrence of an event from its frequency of occurrence and requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This methodology centres on defining the frequencies of occurrence of events, according to databases representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of distance from the pipe. (author)
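    The final step described above, a distance-dependent ignition-probability curve combined with an event frequency, can be sketched as follows. The exponential decay form, the scale length, and all numbers are hypothetical placeholders, not the authors' fitted curves.

```python
import math

def ignition_probability(distance_m, scale_m=50.0):
    """Assumed exponential decay of ignition probability with distance
    from the pipe; scale_m is a hypothetical decay length."""
    return math.exp(-distance_m / scale_m)

def individual_risk(frequency_per_yr, distance_m, p_fatality=1.0):
    """Annual individual risk at a given offset: event frequency times
    ignition probability times (assumed) fatality probability."""
    return frequency_per_yr * ignition_probability(distance_m) * p_fatality

# Risk falls off with distance from the pipeline:
r_near = individual_risk(1e-4, 10.0)
r_far = individual_risk(1e-4, 200.0)
```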

  19. Quantitative evaluation of gait ataxia by accelerometers.

    Science.gov (United States)

    Shirai, Shinichi; Yabe, Ichiro; Matsushima, Masaaki; Ito, Yoichi M; Yoneyama, Mitsuru; Sasaki, Hidenao

    2015-11-15

    An appropriate biomarker for spinocerebellar degeneration (SCD) has not been identified. Here, we performed gait analysis on patients with pure cerebellar type SCD and assessed whether the obtained data could be used as a neurophysiological biomarker for cerebellar ataxia. We analyzed 25 SCD patients, 25 patients with Parkinson's disease as a disease control, and 25 healthy control individuals. Acceleration signals during 6 min of walking and 1 min of standing were measured by two sets of triaxial accelerometers secured with a fixation vest to the middle of the lower and upper back of each subject. We extracted two gait parameters, the average and the coefficient of variation of the motion trajectory amplitude, from each acceleration component. Each component was then analyzed for correlation with the Scale for the Assessment and Rating of Ataxia (SARA) and the Berg Balance Scale (BBS). Based on comparison with the gait of healthy subjects, correlation with severity, and disease specificity, our results suggest that the average medial-lateral amplitude (upper back) during straight gait is a physiological biomarker for cerebellar ataxia. Our results suggest that gait analysis is a quantitative and concise evaluation scale for the severity of cerebellar ataxia.
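    The two gait parameters named above, the average and the coefficient of variation of an amplitude series from one acceleration component, reduce to a few lines of code. The amplitude values below are made up for illustration.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def coefficient_of_variation(xs):
    """Population standard deviation divided by the mean."""
    m = mean(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return sd / m

# Hypothetical medial-lateral motion-trajectory amplitudes (cm) per stride
amplitudes = [1.10, 1.05, 0.95, 1.02, 0.88]
avg = mean(amplitudes)
cv = coefficient_of_variation(amplitudes)
```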

  20. Quantitation of vitamin K in human milk

    Energy Technology Data Exchange (ETDEWEB)

    Canfield, L.M.; Hopkinson, J.M.; Lima, A.F.; Martin, G.S.; Sugimoto, K.; Burr, J.; Clark, L.; McGee, D.L. (Univ. of Arizona, Tucson (USA))

    1990-07-01

    A quantitative method was developed for the assay of vitamin K in human colostrum and milk. The procedure combines preparative and analytical chromatography on silica gel in a nitrogen atmosphere followed by reversed phase high performance liquid chromatography (HPLC). Two HPLC steps were used: gradient separation with ultraviolet (UV) detection followed by isocratic separation detected electrochemically. Due to co-migrating impurities, UV detection alone is insufficient for identification of vitamin K. Exogenous vitamin K was shown to equilibrate with endogenous vitamin K in the samples. A statistical method was incorporated to control for experimental variability. Vitamin K1 was analyzed in 16 pooled milk samples from 7 donors and in individual samples from 15 donors at 1 month post-partum. Vitamin K1 was present at 2.94 +/- 1.94 and 3.15 +/- 2.87 ng/mL in pools and in individuals, respectively. Menaquinones, the bacterial form of the vitamin, were not detected. The significance of experimental variation to studies of vitamin K in individuals is discussed.

  1. Quantitative NDE of Composite Structures at NASA

    Science.gov (United States)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over long periods. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase sensitive ultrasonic methods, infrared thermography, and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large area NDE for rapidly characterizing aerospace composites.

  2. Extracting Quantitative Data from Lunar Soil Spectra

    Science.gov (United States)

    Noble, S. K.; Pieters, C. M.; Hiroi, T.

    2005-01-01

    Using the modified Gaussian model (MGM) developed by Sunshine et al. [1] we compared the spectral properties of the Lunar Soil Characterization Consortium (LSCC) suite of lunar soils [2,3] with their petrologic and chemical compositions to obtain quantitative data. Our initial work on Apollo 17 soils [4] suggested that useful compositional data could be elicited from high quality soil spectra. We are now able to expand upon those results with the full suite of LSCC soils that allows us to explore a much wider range of compositions and maturity states. The model is shown to be sensitive to pyroxene abundance and can evaluate the relative portion of high-Ca and low-Ca pyroxenes in the soils. In addition, the dataset has provided unexpected insights into the nature and causes of absorption bands in lunar soils. For example, it was found that two distinct absorption bands are required in the 1.2 µm region of the spectrum. Neither of these bands can be attributed to plagioclase or agglutinates, but both appear to be largely due to pyroxene.
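    The MGM's basic construction, absorption bands modeled as Gaussians superimposed on a continuum, can be sketched as below. A real MGM fit works in energy (wavenumber) space and fits the band parameters to measured spectra; here the two band parameters are invented and the model is evaluated in wavelength for simplicity.

```python
import math

def mgm_log_reflectance(wavelength_um, ln_continuum, bands):
    """ln(reflectance) = continuum level plus superimposed Gaussian
    absorption bands; each band is (strength, center_um, width_um),
    with negative strength for absorption."""
    return ln_continuum + sum(
        s * math.exp(-(wavelength_um - c) ** 2 / (2.0 * w ** 2))
        for s, c, w in bands)

# Two hypothetical pyroxene-like bands near 0.95 um and 1.2 um:
bands = [(-0.30, 0.95, 0.08), (-0.10, 1.20, 0.10)]
at_band_center = mgm_log_reflectance(0.95, math.log(0.3), bands)
off_band = mgm_log_reflectance(0.75, math.log(0.3), bands)
```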

  3. Quantitative methods for assessing drug synergism.

    Science.gov (United States)

    Tallarida, Ronald J

    2011-11-01

    Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drug's potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination.
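    The dose-equivalence idea behind isobolographic analysis is often summarized by an interaction index: if dose A of drug 1 alone and dose B of drug 2 alone each produce the chosen effect, an additive combination (a, b) satisfies a/A + b/B = 1. A minimal sketch, with invented doses:

```python
def interaction_index(a, b, A, B):
    """Loewe additivity index for a combination (a, b) producing the same
    effect as dose A of drug 1 alone or dose B of drug 2 alone.
    < 1 suggests synergism, ~1 additivity, > 1 sub-additivity."""
    return a / A + b / B

# Hypothetical example: drug 1 alone needs 10 mg/kg, drug 2 alone 40 mg/kg;
# the combination (2, 8) mg/kg achieves the same effect level.
idx = interaction_index(a=2.0, b=8.0, A=10.0, B=40.0)
synergistic = idx < 1.0
```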

  4. Quantitative tests for plate tectonics on Venus

    Science.gov (United States)

    Kaula, W. M.; Phillips, R. J.

    1981-01-01

    Quantitative comparisons are made between the characteristics of plate tectonics on the earth and those which are possible on Venus. Considerations of the factors influencing rise height and relating the decrease in rise height to plate velocity indicate that the rate of topographic dropoff from spreading centers should be about half that on earth due to greater rock-fluid density contrast and lower temperature differential between the surface and interior. Statistical analyses of Pioneer Venus radar altimetry data and global earth elevation data are used to identify 21,000 km of ridge on Venus and 33,000 km on earth, and reveal Venus ridges to have a less well-defined mode in crest heights and a greater concavity than earth ridges. Comparison of the Venus results with the spreading rates and associated heat flow on earth reveals plate creation rates on Venus to be 0.7 sq km/year or less and indicates that not more than 15% of Venus's energy is delivered to the surface by plate tectonics, in contrast to values of 2.9 sq km/year and 70% for earth.

  5. A Quantitative Scale of Oxophilicity and Thiophilicity.

    Science.gov (United States)

    Kepp, Kasper P

    2016-09-19

    Oxophilicity and thiophilicity are widely used concepts with no quantitative definition. In this paper, a simple, generic scale is developed that solves issues with reference states and system dependencies and captures empirically known tendencies toward oxygen. This enables a detailed analysis of the fundamental causes of oxophilicity. Notably, the notion that oxophilicity relates to Lewis acid hardness is invalid. Rather, oxophilicity correlates only modestly and inversely with absolute hardness and more strongly with electronegativity and effective nuclear charge. Since oxygen is highly electronegative, ionic bonding is stronger to metals of low electronegativity. Left-side d-block elements with low effective nuclear charges and electronegativities are thus highly oxophilic, as are the f-block elements, not because of their hardness, which is normal, but because the small ionization energies of their outermost valence electrons allow them to transfer electrons readily to meet the electron demands of oxygen. Consistent with empirical experience, the most oxophilic elements are found in the left part of the d block, the lanthanides, and the actinides. The d-block elements differ substantially in oxophilicity, quantifying their different uses in a wide range of chemical reactions; thus, the use of mixed oxo- and thiophilic (i.e., "mesophilic") surfaces and catalysts as a design principle can explain the success of many recent applications. The proposed scale may therefore help to rationalize and improve chemical reactions more effectively than current qualitative considerations of oxophilicity.

  6. Quantitative magnetotail characteristics of different magnetospheric states

    Directory of Open Access Journals (Sweden)

    M. A. Shukhtina

    2004-03-01

    Full Text Available Quantitative relationships allowing one to compute the lobe magnetic field, flaring angle and tail radius, and to evaluate magnetic flux based on solar wind/IMF parameters and spacecraft position are obtained for the middle magnetotail, X=(-15,-35) RE, using 3.5 years of simultaneous Geotail and Wind spacecraft observations. For the first time, this was done separately for different states of the magnetotail, including the substorm onset (SO) epoch, steady magnetospheric convection (SMC) and quiet (Q) periods. In the explored distance range the magnetotail parameters appeared to be similar (within the error bar) for the Q and SMC states, whereas at SO their values are considerably larger. In particular, the tail radius is larger by 1-3 RE at substorm onset than during the Q and SMC states, for which the radius value is close to previous magnetopause model values. The calculated lobe magnetic flux value at substorm onset is ~1 GWb, exceeding that in the Q and SMC states by ~50%. The model magnetic flux values at substorm onset and SMC show little dependence on the solar wind dynamic pressure and distance in the tail, so the magnetic flux value can serve as an important discriminator of the state of the middle magnetotail.

    Key words. Magnetospheric physics (solar wind-magnetosphere interactions; magnetotail; storms and substorms)

  7. Quantitative Proteomic Approaches for Studying Phosphotyrosine Signaling

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Shi-Jian; Qian, Weijun; Smith, Richard D.

    2007-02-01

    Protein tyrosine phosphorylation is a fundamental mechanism for controlling many aspects of cellular processes, as well as aspects of human health and diseases. Compared to phosphoserine (pSer) and phosphothreonine (pThr), phosphotyrosine (pTyr) signaling is more tightly regulated, but often more challenging to characterize due to the significantly lower level of tyrosine phosphorylation (a relative abundance of 1800:200:1 was estimated for pSer/pThr/pTyr in vertebrate cells [1]). In this review, we outline the recent advances in analytical methodologies for enrichment, identification, and accurate quantitation of tyrosine phosphorylated proteins and peptides using antibody-based technologies, capillary liquid chromatography (LC) coupled with mass spectrometry (MS), and various stable isotope labeling strategies, as well as non-MS-based methods such as protein or peptide array methods. These proteomic technological advances provide powerful tools for potentially understanding signal transduction at the system level and provide a basis for discovering novel drug targets for human diseases. [1] Hunter, T. (1998) The Croonian Lecture 1997. The phosphorylation of proteins on tyrosine: its role in cell growth and disease. Philos. Trans. R. Soc. Lond. B Biol. Sci. 353, 583-605

  8. Neuropathic pain: is quantitative sensory testing helpful?

    Science.gov (United States)

    Krumova, Elena K; Geber, Christian; Westermann, Andrea; Maier, Christoph

    2012-08-01

    Neuropathic pain arises as a consequence of a lesion or disease affecting the somatosensory system and is characterised by a combination of positive and negative sensory symptoms. Quantitative sensory testing (QST) examines sensory perception after application of different mechanical and thermal stimuli of controlled intensity, and the function of both large (A-beta) and small (A-delta and C) nerve fibres, including the corresponding central pathways. QST can be used to determine detection and pain thresholds and stimulus-response curves, and can thus detect both negative and positive sensory signs, the latter not being assessed by other methods. Like all other psychophysical tests, QST requires standardised examination, instructions and data evaluation to yield valid and reliable results. Since normative data are available, QST can also contribute to the individual diagnosis of neuropathy, especially in the case of isolated small-fibre neuropathy, in contrast to conventional electrophysiology, which assesses only large myelinated fibres. For example, detection of early stages of subclinical neuropathy in symptomatic or asymptomatic patients with diabetes mellitus can be helpful to optimise treatment and identify the diabetic foot at risk of ulceration. QST assesses the individual sensory profile and thus can be valuable in evaluating the underlying pain mechanisms, which occur with different frequencies even within the same neuropathic pain syndrome. Furthermore, assessing the exact sensory phenotype by QST might be useful in the future to identify responders to certain treatments in accordance with the underlying pain mechanisms.

  9. Quantitative analysis of protein turnover in plants.

    Science.gov (United States)

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
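    Under the stable isotope feeding scheme described above, the pre-existing (natural-abundance) pool of a protein decays roughly exponentially, so a degradation rate can be read off from the measured unlabelled fraction. A minimal first-order sketch with hypothetical numbers:

```python
import math

def unlabelled_fraction(kd, t):
    """Fraction of a protein pool still unlabelled t hours after the
    switch to labelled medium, under simple first-order turnover."""
    return math.exp(-kd * t)

def degradation_rate(f_unlabelled, t):
    """Invert the first-order decay to estimate kd (per hour) from one
    measured unlabelled fraction."""
    return -math.log(f_unlabelled) / t

# Hypothetical measurement: half the pool replaced after 24 h of labelling
kd = degradation_rate(0.5, t=24.0)
half_life = math.log(2) / kd  # back out the protein half-life in hours
```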

  10. Diagnostics for conformity of paired quantitative measurements.

    Science.gov (United States)

    Hawkins, Douglas M

    2002-07-15

    Matched pairs data arise in many contexts - in case-control clinical trials, for example, and from cross-over designs. They also arise in experiments to verify the equivalence of quantitative assays. This latter use (which is the main focus of this paper) raises difficulties not always seen in other matched pairs applications. Since the designs deliberately vary the analyte levels over a wide range, issues of variance dependent on mean, calibrations of differing slopes, and curvature all need to be added to the usual model assumptions such as normality. Violations in any of these assumptions invalidate the conventional matched pairs analysis. A graphical method, due to Bland and Altman, of looking at the relationship between the average and the difference of the members of the pairs is shown to correspond to a formal testable regression model. Using standard regression diagnostics, one may detect and diagnose departures from the model assumptions and remedy them - for example using variable transformations. Examples of different common scenarios and possible approaches to handling them are shown.
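    The Bland-Altman construction mentioned above is easily computed: for each pair, take the average and the difference of the two measurements, then examine the differences (here just the mean bias). The paired assay values below are invented.

```python
def bland_altman(pairs):
    """Return per-pair averages, per-pair differences, and the mean
    difference (bias) for matched measurements (x, y)."""
    averages = [(x + y) / 2 for x, y in pairs]
    differences = [x - y for x, y in pairs]
    bias = sum(differences) / len(pairs)
    return averages, differences, bias

# Hypothetical paired assay results spanning a wide analyte range
pairs = [(10.1, 9.9), (20.4, 19.8), (39.7, 40.3), (80.2, 79.4)]
avgs, diffs, bias = bland_altman(pairs)
```

A real analysis would also regress the differences on the averages to detect variance that depends on the mean, differing calibration slopes, and curvature, as the abstract describes.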

  11. A LEAP Forward for Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    H.L. Vacher

    2011-07-01

    Full Text Available The Association of American Colleges and Universities' Liberal Education and America's Promise (LEAP) initiative has identified quantitative literacy (QL) as one of its Essential Learning Outcomes and classified it amongst five other Intellectual and Practical Skills such as inquiry and analysis, critical and creative thinking, and written and oral communication. This brings to mind a spreadsheet in which these transdisciplinary intellectual and practical skills are rows and academic disciplines are columns. With the view that the learning outcome QL is a row crossing mathematics and other disciplinary columns, this editorial considers how the papers in this and previous issues of Numeracy distribute into the imaginary spreadsheet. The analysis shows that papers in Numeracy have been expanding from the journal's cell of origin, where QL crosses mathematics, as well as growing in number. The editorial closes by asking about the uniformity of principles of QL from one cell to another in the row, and whether there are levels of QL within the row as a whole. A sidebar notes that downloads are passing the 15,000 mark and the monthly rate is now about 2/3 higher than it was six months ago.

  12. Quantitative phase imaging through scattering media

    Science.gov (United States)

    Kollárová, Vera; Colláková, Jana; Dostál, Zbynek; Slabý, Tomas; Veselý, Pavel; Chmelík, Radim

    2015-03-01

    Coherence-controlled holographic microscope (CCHM) is an off-axis holographic system. It enables observation of a sample and its quantitative phase imaging with coherent as well as incoherent illumination. The spatial and temporal coherence can be modified, and with them the quality and type of the image information. Coherent illumination provides numerical refocusing over a wide depth range, similarly to classic coherent-light digital holographic microscopy (HM). Incoherent-light HM is characterized by high-quality, coherence-noise-free imaging with up to twice the resolution of coherent illumination. Owing to the independent, sample-free reference arm of the CCHM, low spatial light coherence induces a coherence-gating effect. This makes it possible to observe specimens through scattering media. We have theoretically described and numerically simulated imaging of a two-dimensional object through a scattering layer by CCHM using linear systems theory. We have investigated both strongly and weakly scattering media characterized by different amounts of ballistic and diffuse light. The influence of a scattering layer on the quality of the phase signal is discussed for both types of scattering media. A strong dependence of the imaging process on the light coherence is demonstrated. The theoretical calculations and numerical simulations are supported by experimental data gained with model samples as well as real biological objects, in particular time-lapse observations of live cell reactions to substances producing an optically turbid emulsion.

  13. Nonlinear dynamics and quantitative EEG analysis.

    Science.gov (United States)

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  14. Quantitative Spectroscopy of BA-type Supergiants

    CERN Document Server

    Przybilla, N; Becker, S R; Kudritzki, R P

    2005-01-01

    Luminous BA-SGs allow topics ranging from NLTE physics and the evolution of massive stars to the chemical evolution of galaxies and cosmology to be addressed. A hybrid NLTE technique for the quantitative spectroscopy of BA-SGs is discussed. Thorough tests and first applications of the spectrum synthesis method are presented for four bright Galactic objects. Stellar parameters are derived from spectroscopic indicators. The internal accuracy of the method allows the 1sigma-uncertainties to be reduced to <1-2% in Teff and to 0.05-0.10dex in log g. Elemental abundances are determined for over 20 chemical species, with many of the astrophysically most interesting in NLTE. The NLTE computations reduce random errors and remove systematic trends in the analysis. Inappropriate LTE analyses tend to systematically underestimate iron group abundances and overestimate the light and alpha-process element abundances by up to factors of 2-3 on the mean. Contrary to common assumptions, significant NLTE abundance correction...

  15. Controlled surface chemistries and quantitative cell response

    Science.gov (United States)

    Plant, Anne L.

    2002-03-01

    Living cells experience a large number of signaling cues from their extracellular matrix. As a result of these inputs, a variety of intracellular signaling pathways are apparently initiated simultaneously. The vast array of alternative responses that result from the integration of these inputs suggests that it may be reasonable to look for cellular response not as an 'on' or 'off' condition but as a distribution of responses. A difficult challenge is to determine whether variations in responses from individual cells arise from the complexity of intracellular signals or are due to variations in the cell culture environment. By controlling surface chemistry so that every cell 'sees' the same chemical and physical environment, we can begin to assess how the distribution of cell response is affected strictly by changes in the chemistry of the cell culture surface. Using the gene for green fluorescent protein linked to the gene for the promoter of the extracellular matrix protein, tenascin, we can easily probe the end product in a signaling pathway that is purported to be linked to surface protein chemistry and to cell shape. Cell response to well-controlled, well-characterized, and highly reproducible surfaces prepared using soft lithography techniques are compared with more conventional ways of preparing extracellular matrix proteins for cell culture. Using fluorescence microscopy and image analysis of populations of cells on these surfaces, we probe quantitatively the relationship between surface chemistry, cell shape and variations in gene expression endpoint.

  16. Competition between small RNAs: a quantitative view.

    Science.gov (United States)

    Loinger, Adiel; Shemla, Yael; Simon, Itamar; Margalit, Hanah; Biham, Ofer

    2012-04-18

    Two major classes of small regulatory RNAs--small interfering RNAs (siRNAs) and microRNAs (miRNAs)--are involved in a common RNA interference processing pathway. Small RNAs within each of these families were found to compete for limiting amounts of shared components, required for their biogenesis and processing. Association with Argonaute (Ago), the catalytic component of the RNA silencing complex, was suggested as the central mechanistic point in RNA interference machinery competition. Aiming to better understand the competition between small RNAs in the cell, we present a mathematical model and characterize a range of specific cell and experimental parameters affecting the competition. We apply the model to competition between miRNAs and study the change in the expression level of their target genes under a variety of conditions. We show quantitatively that the amounts of Ago and miRNAs in the cell are dominant factors contributing greatly to the competition. Interestingly, we observe what to our knowledge is a novel type of competition that takes place when Ago is abundant, by which miRNAs with shared targets compete over them. Furthermore, we use the model to examine different interaction mechanisms that might operate in establishing the miRNA-Ago complexes, mainly those related to their stability and recycling. Our model provides a mathematical framework for future studies of competition effects in regulation mechanisms involving small RNAs.
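    One simple way to see the competition effect is a competitive-occupancy approximation for a limiting Ago pool: raising one miRNA's level lowers the complex formed by the other. This equilibrium form and all parameter values are illustrative assumptions, not the authors' model.

```python
def ago_complexes(m1, m2, ago_total, k1=1.0, k2=1.0):
    """Competitive-occupancy approximation: two miRNA species (levels m1, m2,
    binding constants k1, k2) share a limiting pool of Ago. Returns the
    amounts of each miRNA-Ago complex at equilibrium-like occupancy."""
    denom = 1.0 + k1 * m1 + k2 * m2
    c1 = ago_total * k1 * m1 / denom
    c2 = ago_total * k2 * m2 / denom
    return c1, c2

# Overexpressing miRNA-1 (m1: 1 -> 5) reduces Ago loading of miRNA-2:
base_c2 = ago_complexes(1.0, 1.0, ago_total=10.0)[1]
after_c2 = ago_complexes(5.0, 1.0, ago_total=10.0)[1]
```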

  17. Quantitative color analysis for capillaroscopy image segmentation.

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit highlighting the algorithm's capability to correctly segment capillaries, their shape and their number is extracted. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, achieves average accuracy higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.
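The two ingredients of the evaluation, a weighted color-channel combination and a pixel-level accuracy figure of merit, can be sketched generically. The weights below are placeholders; the paper's optimized combination is not reproduced here:

```python
import numpy as np

def combine_channels(rgb, weights=(0.1, 0.8, 0.1)):
    """Weighted combination of RGB channels into one scalar image.

    `weights` are hypothetical; a capillaroscopy pipeline would choose
    them to maximize vessel/background contrast."""
    return rgb.astype(float) @ np.asarray(weights, dtype=float)

def pixel_accuracy(pred, truth):
    """Fraction of pixels where a binary segmentation matches ground truth
    (one simple figure of merit; shape/count metrics would be separate)."""
    return float((pred.astype(bool) == truth.astype(bool)).mean())
```

Thresholding the combined channel then yields the binary mask that is scored against clinician-approved annotations.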

  18. Qualitative and quantitative EEG in psychotic children.

    Science.gov (United States)

    Itil, T M; Simeon, J; Coffin, C

    1976-05-01

    The EEGs of hospitalized psychotic boys were analyzed quantitatively by means of visual evaluation, analog frequency analysis, and digital computer period analysis and were compared with those of age- and sex-matched normals. Visual evaluation of the records demonstrated that psychotic children have significantly more beta activity as well as fewer alpha bursts than normal controls. EEG analog frequency analysis showed that psychotic children have a greater percentage of total voltage in the 3-5 cps and 13-33 cps bands, while they show less voltage in the 6-12 cps bands as compared with normal controls. Digital computer period analysis demonstrated more slow, less alpha, and more fast activity, as well as a greater average frequency and frequency deviation in both the primary wave and first derivative measurements in psychotic children than normals, while normals showed a trend towards higher amplitude and amplitude variability. The similarity of the EEG differences between psychotic and normal children to those differences observed between adult chronic schizophrenics and normals, as well as to those between children of "high risk" for becoming schizophrenic and controls, suggests that the above described findings are characteristic for the pathophysiology of schizophrenia.
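The frequency-analysis step in the record above, the percentage of total voltage/power in the 3-5, 6-12 and 13-33 cps bands, can be sketched with a plain periodogram. This is a generic illustration, not the analog or period-analysis hardware the study used, and it omits the artifact rejection and epoching a clinical analysis would need:

```python
import numpy as np

def band_power_fractions(x, fs, bands):
    """Fraction of total spectral power in each named frequency band.

    `x` is a 1-D signal, `fs` the sampling rate in Hz, `bands` a dict
    of name -> (low_hz, high_hz)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    total = psd[1:].sum()  # exclude the DC component
    return {name: float(psd[(freqs >= lo) & (freqs <= hi)].sum() / total)
            for name, (lo, hi) in bands.items()}
```

For a clean 10 Hz oscillation, essentially all the power lands in the 6-12 Hz (alpha) band, mirroring the kind of band comparison reported in the abstract.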

  19. Quantitative ultrasound in the assessment of osteoporosis.

    Science.gov (United States)

    Guglielmi, Giuseppe; de Terlizzi, Francesca

    2009-09-01

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increasing clinical interest, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices, and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and in the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  20. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
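The two unit systems named in the review (moles of gold per liter versus particles per mL) are linked by simple geometry: the number of gold atoms in one spherical particle. A minimal sketch of that conversion, assuming monodisperse spheres of bulk gold density:

```python
import math

AVOGADRO = 6.02214076e23
GOLD_DENSITY = 19.3        # g/cm^3 (bulk gold)
GOLD_MOLAR_MASS = 196.97   # g/mol

def atoms_per_particle(diameter_nm):
    """Gold atoms in one spherical AuNP of the given diameter (nm)."""
    r_cm = diameter_nm * 1e-7 / 2.0
    volume_cm3 = 4.0 / 3.0 * math.pi * r_cm ** 3
    return volume_cm3 * GOLD_DENSITY / GOLD_MOLAR_MASS * AVOGADRO

def particles_per_ml(gold_molarity, diameter_nm):
    """Convert a molar gold concentration (mol Au / L) into a particle
    number concentration (particles / mL)."""
    atoms_per_ml = gold_molarity * AVOGADRO / 1000.0
    return atoms_per_ml / atoms_per_particle(diameter_nm)
```

For the common ~13 nm citrate AuNPs this gives roughly 7 x 10^4 atoms per particle, which is why particle-counting and elemental-gold methods can disagree when size dispersity is unknown.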

  1. [Pharmacoproteomic approach by quantitative targeted proteomics].

    Science.gov (United States)

    Ohtsuki, Sumio

    2012-01-01

    Omics analyses have provided many candidates for drug targets and biomarkers. However, these analyses have not contributed efficiently to drug development because of their top-down character. To solve this problem, we recently developed quantitative targeted proteomics with the multiplexed multiple reaction monitoring (multiplexed-MRM) method, which enables bottom-up proteomics. In this method, the target proteins for quantification are selected prior to analysis, based on knowledge of the phenomena of interest. Target peptides for quantification are selected from sequence information alone, so time-consuming procedures such as antibody preparation and protein purification are unnecessary. In this review, we introduce the technical features of the multiplexed-MRM method as a novel protein quantification method, and summarize its advantages with reference to recently reported results, including species differences, in vitro-to-in vivo reconstruction and personalized chemotherapy. This simultaneous protein quantification method overcomes the problems of antibody-based quantification and could open new protein-based drug research as "pharmacoproteomics".

  2. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.
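The multivariate GBLUP that the Bayesian networks are shown to be equivalent to rests on a genomic relationship matrix built from marker genotypes. This sketch uses the standard VanRaden construction and the ridge/kernel form of single-trait GBLUP; the variance ratio `lam` is assumed known rather than estimated, as it would be in practice:

```python
import numpy as np

def genomic_relationship(M):
    """VanRaden genomic relationship matrix from an (individuals x
    markers) genotype matrix coded 0/1/2."""
    p = M.mean(axis=0) / 2.0              # allele frequencies
    Z = M - 2.0 * p                       # centered genotypes
    return Z @ Z.T / (2.0 * (p * (1.0 - p)).sum())

def gblup_fit(G, y, lam=1.0):
    """Fitted genetic values on the training set:
    u = G (G + lam * I)^{-1} y  (kernel-ridge form of GBLUP)."""
    n = len(y)
    return G @ np.linalg.solve(G + lam * np.eye(n), y)
```

A multi-trait extension replaces the scalar `lam` with genetic and residual covariance matrices across traits, which is where the pleiotropy discussed in the abstract enters.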

  3. A Quantitative Index of Forest Structural Sustainability

    Directory of Open Access Journals (Sweden)

    Jonathan A. Cale

    2014-07-01

    Forest health is a complex concept including many ecosystem functions, interactions and values. We develop a quantitative system applicable to many forest types to assess tree mortality with respect to stable forest structure and composition. We quantify impacts of observed tree mortality on structure by comparison to baseline mortality, and then develop a system that distinguishes between structurally stable and unstable forests. An empirical multivariate index of structural sustainability and a threshold value (70.6), derived from 22 nontropical tree species' datasets, differentiated structurally sustainable from unsustainable diameter distributions. Twelve of the 22 species populations were sustainable, with a mean score of 33.2 (median = 27.6). Ten species populations were unsustainable, with a mean score of 142.6 (median = 130.1). Among them, the unsustainability of Fagus grandifolia, Pinus lambertiana, P. ponderosa, and Nothofagus solandri was attributable to known disturbances, whereas that of Abies balsamea, Acer rubrum, Calocedrus decurrens, Picea engelmannii, P. rubens, and Prunus serotina populations was not. This approach provides the ecological framework for rational management decisions using routine inventory data to objectively determine the scope and direction of change in structure and composition, assess excessive or insufficient mortality, compare disturbance impacts in time and space, and prioritize management needs and allocation of scarce resources.

  4. Stable marriage problems with quantitative preferences

    CERN Document Server

    Pini, Maria Silvia; Venable, Brent; Walsh, Toby

    2010-01-01

    The stable marriage problem is a well-known problem of matching men to women so that no man and woman who are not married to each other both prefer each other. Such a problem has a wide variety of practical applications, ranging from matching resident doctors to hospitals, to matching students to schools, or more generally to any two-sided market. In the classical stable marriage problem, both men and women express a strict preference order over the members of the other sex, in a qualitative way. Here we consider stable marriage problems with quantitative preferences: each man (resp., woman) provides a score for each woman (resp., man). Such problems are more expressive than the classical stable marriage problems. Moreover, in some real-life situations it is more natural to express scores (to model, for example, profits or costs) rather than a qualitative preference ordering. In this context, we define new notions of stability and optimality, and we provide algorithms to find marriages which are stable and/...
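For context on the baseline the paper generalizes: the classical men-proposing deferred-acceptance (Gale-Shapley) algorithm adapts directly to score-based inputs by sorting on scores. This is a sketch of that classical procedure, not of the paper's new stability and optimality notions; score ties are broken arbitrarily:

```python
def gale_shapley(m_scores, w_scores):
    """Men-proposing deferred acceptance with quantitative preferences.

    m_scores[m][w] is man m's score for woman w (higher = preferred);
    w_scores[w][m] likewise. Returns a stable matching as {man: woman}."""
    n = len(m_scores)
    # Derive strict preference lists by descending score.
    m_pref = [sorted(range(n), key=lambda w, m=m: -m_scores[m][w])
              for m in range(n)]
    next_choice = [0] * n   # index of the next woman each man proposes to
    fiance = {}             # woman -> currently engaged man
    free = list(range(n))
    while free:
        m = free.pop()
        w = m_pref[m][next_choice[m]]
        next_choice[m] += 1
        if w not in fiance:
            fiance[w] = m
        else:
            rival = fiance[w]
            if w_scores[w][m] > w_scores[w][rival]:
                fiance[w] = m
                free.append(rival)
            else:
                free.append(m)
    return {m: w for w, m in fiance.items()}
```

The quantitative setting lets one go further than this, e.g. preferring, among all stable matchings, one maximizing total score, which is the kind of optimality the abstract alludes to.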

  5. Surface reconstruction of Pt(001) quantitatively revisited

    Science.gov (United States)

    Hammer, R.; Meinel, K.; Krahn, O.; Widdra, W.

    2016-11-01

    The complex hexagonal reconstructions of the (001) surfaces of platinum and gold have been under debate for decades. Here, the structural details of the Pt(001) reconstruction have been quantitatively reinvestigated by combining the high resolving power of scanning tunneling microscopy (STM) and spot profile analysis low energy electron diffraction (SPA-LEED). In addition, LEED simulations based on a Moiré approach have been applied. Annealing temperatures around 850 °C yield a superstructure that approaches a commensurable c(26.6 × 118) substrate registry. It evolves from a Moiré-like buckling of a compressed hexagonal top layer (hex) where atomic rows of the hex run parallel to atomic rows of the square substrate. Annealing at 920 °C stimulates a continuous rotation of the hex where all angles between ±0.7° are simultaneously realized. At temperatures around 1080 °C, the nonrotated hex coexists with a hex that is rotated by about 0.75°. Annealing at temperatures around 1120 °C yields a locking of the hex in fixed rotation angles of 0.77°, 0.88°, and 0.94°. At temperatures around 1170 °C, the Pt(001)-hex-R 0.94° prevails as the energetically most favored form of the rotated hex.

  6. Quantitative Ultrasound in the Assessment of Osteoporosis

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, Giuseppe [Department of Radiology, University of Foggia, Viale L. Pinto, 71100 Foggia (Italy); Department of Radiology, Scientific Institute Hospital, San Giovanni Rotondo (Italy)], E-mail: g.guglielmi@unifg.it; Terlizzi, Francesca de [IGEA srl, Via Parmenide 10/A 41012 Carpi, MO (Italy)], E-mail: f.deterlizzi@igeamedical.com

    2009-09-15

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increasing clinical interest, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices, and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and in the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  7. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enables various data-driven inferences about chromatin profiles and interactions. We provide a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enable higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles; we demonstrate that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  8. Quantitative Resistance: More Than Just Perception of a Pathogen.

    Science.gov (United States)

    Corwin, Jason A; Kliebenstein, Daniel J

    2017-04-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. © 2017 American Society of Plant Biologists. All rights reserved.

  9. Quantitative Nutrient Analyzer for Autonomous Ocean Deployment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Translume, in collaboration with Dr Joseph Needoba, Oregon Health and Science University, proposes to develop a microfluidic colorimetric sensor for the quantitative...

  10. Quantitative Measurement of Oxygen in Microgravity Combustion

    Science.gov (United States)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured
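The diode-laser absorption diagnostic described in the record recovers oxygen number density from line-of-sight attenuation. As a minimal illustration of the underlying Beer-Lambert relation (the cross-section and intensity values in the test are hypothetical, and a real instrument integrates over the absorption lineshape rather than using a single effective cross-section):

```python
import math

def absorber_density(i0, i_transmitted, cross_section_cm2, path_cm):
    """Number density (molecules/cm^3) of an absorbing species from a
    single-line absorption measurement:
        I = I0 * exp(-sigma * N * L)  ->  N = ln(I0/I) / (sigma * L)."""
    return math.log(i0 / i_transmitted) / (cross_section_cm2 * path_cm)
```

Multiplying several such line-of-sight measurements across a flame, as with the fiber-optic arrangement mentioned above, yields a coarse spatial map of oxygen concentration.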

  11. Quantitative imaging of coronary blood flow

    Directory of Open Access Journals (Sweden)

    Adam M. Alessio

    2010-04-01

    Adam M. Alessio received his PhD in Electrical Engineering from the University of Notre Dame in 2003. During his graduate studies he developed tomographic reconstruction methods for correlated data and helped construct a high-resolution PET system. He is currently a Research Assistant Professor in Radiology at the University of Washington. His research interests focus on improved data processing and reconstruction algorithms for PET/CT systems with an emphasis on quantitative imaging. Erik Butterworth received the BA degree in Mathematics from the University of Chicago in 1977. Between 1977 and 1987 he worked as a computer programmer/analyst for several small commercial software firms. Since 1988, he has worked as a software engineer on various research projects at the University of Washington. Between 1988 and 1993 he developed a real-time data acquisition system for the analysis of estuarine sediment transport in the Department of Geophysics. Between 1988 and 2002 he developed I4, a system for the display and analysis of cardiac PET images, in the Department of Cardiology. Since 1993 he has worked on physiological simulation systems (XSIM from 1993 to 1999, JSim since 1999) at the National Simulation Resource Facility in Circulatory Mass Transport and Exchange, in the Department of Bioengineering. His research interests include simulation systems and medical imaging. James H. Caldwell, MD, University of Missouri-Columbia 1970, is Professor of Medicine (Cardiology and Radiology) and Adjunct Professor of Bioengineering at the University of Washington School of Medicine, and Acting Head, Division of Cardiology and Director of Nuclear Cardiology for the University of Washington Hospitals, Seattle WA, USA. James B. Bassingthwaighte, MD, Toronto 1955, PhD Mayo Grad Sch Med 1964, was Professor of Physiology and of Medicine at Mayo Clinic until 1975, when he moved to the University of Washington to chair Bioengineering. He is Professor of Bioengineering and

  12. Asbestos exposure--quantitative assessment of risk

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber types (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
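The occupational-to-school extrapolation above can be approximated by the simplest possible model, scaling risk linearly with cumulative exposure (concentration times duration). This is deliberately naive: the published figures come from more detailed models (fiber type, age at exposure, latency), so the result only matches the paper's 5-per-million estimate in order of magnitude:

```python
def linear_risk_extrapolation(ref_risk, ref_conc, ref_years, conc, years):
    """Scale a lifetime excess-cancer risk linearly with cumulative
    exposure. `ref_risk` is the risk at the reference exposure
    (ref_conc f/ml for ref_years); returns the risk, in the same units,
    at the new exposure."""
    return ref_risk * (conc * years) / (ref_conc * ref_years)
```

With the worker reference of 8,200 excess cancers per million (82 per 10,000) at 0.5 f/ml for 40 yr, the school scenario (0.001 f/ml, 6 yr) gives about 2.5 per million under pure linear scaling.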

  13. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity in, for instance, finance developed quite in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts at a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), indicating a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) the nature of the vulnerability, and (3) the existence and effectiveness of current controls (methods and process).

  14. Quantitative DNA Analyses for Airborne Birch Pollen.

    Directory of Open Access Journals (Sweden)

    Isabell Müller-Germann

    Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative Real-Time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suitable for reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected due to the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction. For the ITS region the factor was 64, while for the single-copy gene BP8 only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even in low concentrations. These particles are known to be highly allergenic, reach deep into airways and cause often severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.

  15. Quantitative DNA Analyses for Airborne Birch Pollen.

    Science.gov (United States)

    Müller-Germann, Isabell; Vogel, Bernhard; Vogel, Heike; Pauling, Andreas; Fröhlich-Nowoisky, Janine; Pöschl, Ulrich; Després, Viviane R

    2015-01-01

    Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative Real-Time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies and thus pollen grains in air filter samples. During the birch pollination season in 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene was better suitable for reliable qPCR results, and the qPCR results obtained for coarse particulate matter were well correlated with the birch pollen forecasting results of the regional air quality model COSMO-ART. As expected due to the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction. For the ITS region the factor was 64, while for the single-copy gene BP8 only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even in low concentrations. These particles are known to be highly allergenic, reach deep into airways and cause often severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
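Determining absolute DNA copy numbers by qPCR, the quantification step the study relies on, is typically done against a serial-dilution standard curve. A minimal sketch; the slope and intercept values used in the test are illustrative, not from this study:

```python
import numpy as np

def fit_standard_curve(copies, ct):
    """Fit Ct = slope * log10(copies) + intercept from dilution standards.

    Also reports amplification efficiency (1.0 = perfect doubling per
    cycle, corresponding to a slope near -3.32)."""
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copies in an unknown sample."""
    return 10.0 ** ((ct - intercept) / slope)
```

Dividing the estimated copy number by the gene's copies per pollen grain (one for BP8, many for ITS) then converts DNA copies into a pollen-grain count, which is why the single-copy gene gave the more reliable quantification.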

  16. A quantitative theory of human color choices.

    Science.gov (United States)

    Komarova, Natalia L; Jameson, Kimberly A

    2013-01-01

    The system for colorimetry adopted by the Commission Internationale de l'Eclairage (CIE) in 1931, along with its subsequent improvements, represents a family of light mixture models that has served well for many decades for stimulus specification and reproduction when highly controlled color standards are important. Still, with regard to color appearance many perceptual and cognitive factors are known to contribute to color similarity, and, in general, to all cognitive judgments of color. Using experimentally obtained odd-one-out triad similarity judgments from 52 observers, we demonstrate that CIE-based models can explain a good portion (but not all) of the color similarity data. Color difference quantified by CIELAB ΔE explained behavior at levels of 81% (across all colors), 79% (across red colors), and 66% (across blue colors). We show that the unexplained variation cannot be ascribed to inter- or intra-individual variations among the observers, and points to the presence of additional factors shared by the majority of responders. Based on this, we create a quantitative model of a lexicographic semiorder type, which shows how different perceptual and cognitive influences can trade-off when making color similarity judgments. We show that by incorporating additional influences related to categorical and lightness and saturation factors, the model explains more of the triad similarity behavior, namely, 91% (all colors), 90% (reds), and 87% (blues). We conclude that distance in a CIE model is but the first of several layers in a hierarchy of higher-order cognitive influences that shape color triad choices. We further discuss additional mitigating influences outside the scope of CIE modeling, which can be incorporated in this framework, including well-known influences from language, stimulus set effects, and color preference bias. We also discuss universal and cultural aspects of the model as well as non-uniformity of the color space with respect to different
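The CIELAB ΔE baseline discussed above, and the odd-one-out triad task, can be sketched with a distance-only decision rule. This is just the first "layer" of the hierarchy the paper describes; the categorical and lightness/saturation influences of their full lexicographic-semiorder model are not reproduced here:

```python
import numpy as np

def delta_e76(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance in CIELAB."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

def odd_one_out(triad):
    """Distance-only prediction for an odd-one-out triad judgment:
    the odd color is the one with the largest summed ΔE to the other
    two (ties resolved by argmax order)."""
    sums = [sum(delta_e76(a, b) for b in triad) for a in triad]
    return int(np.argmax(sums))
```

The paper's finding is precisely that a rule like this explains roughly 66-81% of choices, and that modeling the additional cognitive layers raises the explained share to ~87-91%.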

  17. Toward quantitative modeling of silicon phononic thermocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Lacatena, V. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France); IEMN UMR CNRS 8520, Institut d'Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d'Ascq (France); Haras, M.; Robillard, J.-F., E-mail: jean-francois.robillard@isen.iemn.univ-lille1.fr; Dubois, E. [IEMN UMR CNRS 8520, Institut d'Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d'Ascq (France); Monfray, S.; Skotnicki, T. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France)

    2015-03-16

    The wealth of patterning technologies with deca-nanometer resolution brings opportunities to artificially modulate thermal transport properties. A promising example is given by the recent concepts of 'thermocrystals' or 'nanophononic crystals', which introduce regular nano-scale inclusions at a pitch between the mean free paths of thermal phonons and of electrons. In such structures, the lattice thermal conductivity is reduced by up to two orders of magnitude with respect to its bulk value. Beyond the promise these materials hold for overcoming the well-known 'electron crystal-phonon glass' dilemma faced in thermoelectrics, the quantitative prediction of their thermal conductivity poses a challenge. This work paves the way toward understanding and designing silicon nanophononic membranes by means of molecular dynamics simulation. Several systems are studied in order to separate the different contributions: bulk silicon, ultra-thin membranes (8 to 15 nm), 2D phononic crystals, and finally 2D phononic membranes. After discussing the equilibrium properties of these structures from 300 K to 400 K, the Green-Kubo methodology is used to quantify the thermal conductivity. The results account for several experimental trends and models. It is confirmed that both the thin-film geometry and the phononic structure act to reduce the thermal conductivity, and the further decrease in the phononic engineered membrane clearly demonstrates that the two effects are cumulative. Finally, limitations of the model and further perspectives are discussed.
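    The Green-Kubo methodology mentioned above obtains the lattice thermal conductivity from the time integral of the equilibrium heat-flux autocorrelation function, kappa = (1 / 3 V kB T^2) * integral of <J(0).J(t)> dt. A minimal sketch with a synthetic, exponentially decaying autocorrelation (all numerical values are illustrative assumptions, not results from the paper):

```python
import math

def green_kubo_conductivity(jacf, dt, volume, temperature, kB=1.380649e-23):
    """Trapezoidal time integral of the heat-flux autocorrelation function,
    divided by 3*V*kB*T^2 (isotropic average); gives kappa in W/(m K)
    when jacf carries SI units."""
    integral = dt * (0.5 * jacf[0] + sum(jacf[1:-1]) + 0.5 * jacf[-1])
    return integral / (3.0 * volume * kB * temperature ** 2)

# Synthetic autocorrelation <J(0).J(t)> = C0 * exp(-t / tau), whose exact
# time integral is C0 * tau; numbers are illustrative, not from the paper.
C0, tau, dt = 1.0e12, 5.0e-12, 1.0e-14
jacf = [C0 * math.exp(-i * dt / tau) for i in range(20000)]
kappa = green_kubo_conductivity(jacf, dt, volume=1.0e-24, temperature=300.0)
print(kappa)
```

In a real study the autocorrelation would of course come from the molecular dynamics trajectory rather than a closed-form decay, and convergence of the integral with correlation time would need to be checked.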

  18. Quantitative paleogeography and accretionary history, northern Appalachians

    Energy Technology Data Exchange (ETDEWEB)

    Pluijm, B.A. van der; Voo, R. van der (Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Geological Sciences)

    1992-01-01

    Ongoing paleomagnetic work on Early and Middle Paleozoic units provides quantitative data on the paleogeography, latitudinal separation and latitudinal drift rates of the tectonic elements that characterize the history of the northern segment of the Appalachian orogen. Following rifting and opening of Iapetus, the southern margin of Laurentia moved from ca. 15°S in the Ordovician to ca. 30°S in the late Silurian; the northern margin of Avalon drifted northward (separate from Gondwana) from >50°S to 30°S during the same time interval. Paleolatitudes from volcanic units of the intervening Central Mobile Belt that yield primary magnetizations are: Newfoundland: Ordovician arc-back-arc basin, 11°S; Ordovician ocean island/arc, 31°S; Silurian continental cover, Botwood Gp 24°S and Springdale Gp 17°S. New Brunswick: Ordovician rift-subduction complex, 53°S. Maine: Munsungun Volcanic Terrane, 18°S; Winterville Volcanic Terrane, 15-20°S; upper part of the Lunksoos Composite Terrane, 20°S. The Ordovician results indicate several near-Laurentian volcanic terranes and back-arc basins, landward-dipping subduction complexes on opposite margins of Iapetus, and intra-Iapetus ocean islands/arcs. Silurian paleogeographic and tectonostratigraphic data show that closure of Iapetus and progressive outboard accretion in the northern portion of the Appalachian orogen were complete by the late Silurian. This closure was accompanied by considerable Ordovician to Early Silurian left-lateral strike slip and subsequent right-lateral displacement, based on the relative positions of Laurentia, Avalon and Gondwana in Early and Middle Paleozoic times.
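    Paleolatitudes of this kind are conventionally derived from mean magnetization inclinations via the geocentric axial dipole (GAD) relation tan(I) = 2 tan(latitude); whether a particular value used additional corrections is not stated here. A minimal sketch (the 31-degree inclination is a made-up input, and hemisphere/polarity sign conventions are ignored):

```python
import math

def paleolatitude(inclination_deg):
    """Geocentric axial dipole (GAD) relation tan(I) = 2 * tan(latitude),
    solved for the paleolatitude in degrees."""
    return math.degrees(math.atan(0.5 * math.tan(math.radians(inclination_deg))))

# A mean inclination near 31 degrees maps to roughly 17 degrees of latitude.
print(round(paleolatitude(31.0), 1))  # → 16.7
```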

  19. Quantitative evaluation of chemisorption processes on semiconductors

    Science.gov (United States)

    Rothschild, A.; Komem, Y.; Ashkenasy, N.

    2002-12-01

    This article presents a method for numerical computation of the degree of coverage of chemisorbates and the resultant surface band bending as a function of the ambient gas pressure, temperature, and semiconductor doping level. This method enables quantitative evaluation of the effect of chemisorption on the electronic properties of semiconductor surfaces, such as the work function and surface conductivity, which is of great importance for many applications such as solid-state chemical sensors and electro-optical devices. The method is applied to simulate the chemisorption behavior of oxygen on n-type CdS, a process that has been investigated extensively due to its impact on the photoconductive properties of CdS photodetectors. The simulation demonstrates that the chemisorption of adions saturates when the Fermi level becomes aligned with the chemisorption-induced surface states, limiting their coverage to a small fraction of a monolayer. The degree of coverage of chemisorbed adions is proportional to the square root of the doping level, while the coverage of neutral adsorbates is independent of it. It is shown that the chemisorption of neutral adsorbates behaves according to the well-known Langmuir model, regardless of the existence of charged species on the surface, while charged adions do not obey Langmuir's isotherm. In addition, it is found that in depletive chemisorption processes the resultant surface band bending increases by 2.3kT (where k is the Boltzmann constant and T is the temperature) when the gas pressure increases by one order of magnitude or when the doping level increases by two orders of magnitude.
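    The 2.3kT-per-decade result quoted above is equivalent to a band-bending shift of dVs = kT * ln(p2/p1) in the depletive regime. A numeric sketch (the room-temperature kT value is an assumption, since the abstract does not fix a temperature):

```python
import math

kT_300K = 0.02585  # thermal energy in eV at ~300 K (assumed temperature)

def band_bending_shift(pressure_ratio, kT=kT_300K):
    """Change in surface band bending (eV) for an ambient pressure ratio
    p2/p1 in the depletive chemisorption regime: dVs = kT * ln(p2/p1)."""
    return kT * math.log(pressure_ratio)

# One decade in pressure gives 2.3 kT, about 60 meV at room temperature.
print(round(band_bending_shift(10.0), 4))  # → 0.0595
```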

  20. Exploring Phytoplankton Population Investigation Growth to Enhance Quantitative Literacy

    Science.gov (United States)

    Baumgartner, Erin; Biga, Lindsay; Bledsoe, Karen; Dawson, James; Grammer, Julie; Howard, Ava; Snyder, Jeffrey

    2015-01-01

    Quantitative literacy is essential to biological literacy (and is one of the core concepts in "Vision and Change in Undergraduate Biology Education: A Call to Action"; AAAS 2009). Building quantitative literacy is a challenging endeavor for biology instructors. Integrating mathematical skills into biological investigations can help build…

  1. Workshop on quantitative dynamic stratigraphy. Final conference report

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  2. Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research

    Science.gov (United States)

    Kitchel, Tracy; Ball, Anna L.

    2014-01-01

    The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in social sciences and introduced inaccuracies of theory use in agricultural education quantitative research. Finally,…

  3. Quantitative Approaches to Group Research: Suggestions for Best Practices

    Science.gov (United States)

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  4. Promoting Quantitative Literacy in an Online College Algebra Course

    Science.gov (United States)

    Tunstall, Luke; Bossé, Michael J.

    2016-01-01

    College algebra (a university freshman-level algebra course) fulfills the quantitative literacy requirement of many colleges' general education programs and is a terminal course for most who take it. An online problem-based learning environment provides a unique means of engaging students in quantitative discussions and research. This article…

  5. Quantitative Phase Determination by Using a Michelson Interferometer

    Science.gov (United States)

    Pomarico, Juan A.; Molina, Pablo F.; D'Angelo, Cristian

    2007-01-01

    The Michelson interferometer is one of the best-established tools for quantitative interferometric measurements. It has been, and still is, used successfully not only for scientific purposes; it is also introduced in undergraduate courses for qualitative demonstrations as well as for quantitative determination of several properties such as…

  6. Orthogonal Series Methods for Both Qualitative and Quantitative Data

    OpenAIRE

    Hall, Peter

    1983-01-01

    We introduce and describe orthogonal series methods for estimating the density of qualitative, quantitative or mixed data. The techniques are completely nonparametric in character, and so may be used in situations where parametric models are difficult to construct. Just this situation arises in the context of mixed--both qualitative and quantitative--data, where there are few parametric models.
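    An orthogonal series density estimator of the kind described expands the unknown density in a basis and estimates each coefficient by a sample mean. The sketch below uses the cosine basis on [0, 1]; the basis choice, truncation point and Beta-distributed sample are illustrative assumptions, not details from the paper:

```python
import math
import random

def cosine_series_density(sample, n_terms=8):
    """Orthogonal series density estimate on [0, 1] with the cosine basis
    phi_0(x) = 1, phi_j(x) = sqrt(2) * cos(j * pi * x); each coefficient is
    estimated by the sample mean of phi_j(X_i)."""
    n = len(sample)
    coeffs = [1.0] + [
        math.sqrt(2) * sum(math.cos(j * math.pi * x) for x in sample) / n
        for j in range(1, n_terms)
    ]
    def fhat(x):
        return coeffs[0] + sum(
            c * math.sqrt(2) * math.cos(j * math.pi * x)
            for j, c in enumerate(coeffs[1:], start=1)
        )
    return fhat

random.seed(0)
sample = [random.betavariate(2, 5) for _ in range(2000)]  # quantitative data
fhat = cosine_series_density(sample)
# Sanity check: the estimate integrates to ~1 over [0, 1].
area = sum(fhat(i / 1000) for i in range(1000)) / 1000
print(round(area, 2))
```

For mixed qualitative/quantitative data, as the abstract notes, the same idea applies with a basis indexed over the discrete categories as well.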

  7. Statistical mechanics and the evolution of polygenic quantitative traits

    NARCIS (Netherlands)

    Barton, N.H.; De Vladar, H.P.

    2009-01-01

    The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified

  8. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study repor

  9. Optimization of statistical methods impact on quantitative proteomics data

    NARCIS (Netherlands)

    Pursiheimo, A.; Vehmas, A.P.; Afzal, S.; Suomi, T.; Chand, T.; Strauss, L.; Poutanen, M.; Rokka, A.; Corthals, G.L.; Elo, L.L.

    2015-01-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled

  10. Parts of the Whole : Cognition, Schemas, and Quantitative Reasoning

    Directory of Open Access Journals (Sweden)

    Dorothy Wallace

    2011-01-01

    Based loosely on ideas of Jean Piaget and Richard Skemp, this Parts of the Whole column considers the construction of knowledge in mathematics and quantitative reasoning. Examples are chosen that illustrate an important cognitive difference between quantitative numeracy and classical mathematics, and which illuminate the particular choices instructors must make in order to teach either or both of these.

  11. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  12. Quantitative Articles: Developing Studies for Publication in Counseling Journals

    Science.gov (United States)

    Trusty, Jerry

    2011-01-01

    This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…

  13. Quantitative modelling of the biomechanics of the avian syrinx

    NARCIS (Netherlands)

    Elemans, C.P.H.; Larsen, O.N.; Hoffmann, M.R.; Leeuwen, van J.L.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts

  14. Quantitative phase imaging using hard x-rays

    Energy Technology Data Exchange (ETDEWEB)

    Nugent, K.A.; Paganin, D.; Barnea, Z. [Melbourne Univ., Parkville, VIC (Australia). School of Physics; Cookson, D. F. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia); Gureyev, T.E. [Melbourne Univ., Parkville, VIC (Australia). School of Physics]|[CSIRO, Clayton, VIC (Australia). Div. of Forestry and Forest Products

    1997-06-01

    The quantitative imaging of a phase object using 16 keV x-rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. It is found that the phase image is in quantitative agreement with independent measurements of the object. 13 refs., 5 figs.

  15. Blending Qualitative & Quantitative Research Methods in Theses and Dissertations.

    Science.gov (United States)

    Thomas, R. Murray

    This guide discusses combining qualitative and quantitative research methods in theses and dissertations. It covers a wide array of methods, the strengths and limitations of each, and how they can be effectively interwoven into various research designs. The first chapter is "The Qualitative and the Quantitative." Part 1, "A Catalogue of…

  16. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  18. Quantitative wearable sensors for objective assessment of Parkinson's disease

    NARCIS (Netherlands)

    Maetzler, W.; Domingos, J.; Srulijes, K.; Ferreira, J.J.; Bloem, B.R.

    2013-01-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology ma

  19. Metstoich--Teaching Quantitative Metabolism and Energetics in Biochemical Engineering

    Science.gov (United States)

    Wong, Kelvin W. W.; Barford, John P.

    2010-01-01

    Metstoich, a metabolic calculator developed for teaching, can provide a novel way to teach quantitative metabolism to biochemical engineering students. It can also introduce biochemistry/life science students to the quantitative aspects of life science subjects they have studied. Metstoich links traditional biochemistry-based metabolic approaches…

  20. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  1. The Curriculum in Quantitative Analysis: Results of a Survey.

    Science.gov (United States)

    Locke, David C.; Grossman, William E. L.

    1987-01-01

    Reports on the results of a survey of college level instructors of quantitative analysis courses. Discusses what topics are taught in such courses, how much weight is given to these topics, and which experiments are used in the laboratory. Poses some basic questions about the curriculum in quantitative analysis. (TW)

  3. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    Science.gov (United States)

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo, and enable surgical planning tailored to the individual.

  4. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field.

  5. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  6. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book constitutes the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on quantitative logic and soft computing, and a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  7. An overview of quantitative approaches in Gestalt perception.

    Science.gov (United States)

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory.

  8. Quantitative risks analysis of maritime terminal petrochemical

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work applies a computer program (RISKAN) developed for quantitative industrial risk studies, and reviews the models used in the program. As part of the evaluation, the program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. A quantitative risk analysis of the terminal was thus performed, covering both the workers and the nearby population, with acceptability verified against the tolerability limits established by the state licensing agency (FEPAM-RS). In the risk analysis methodology used internationally, societal risk is most commonly presented graphically as FN curves, while individual risk is usually shown as iso-risk curves traced on a map of the area around the plant. At the beginning of the study, a historical analysis of accidents and the Preliminary Risk Analysis technique were used to help identify possible accident scenarios related to the activities at the terminal. After the initiating events were identified, their frequencies or probabilities of occurrence were estimated, followed by calculations of the physical effects and fatalities using models published by the Prins Maurits Laboratory and the American Institute of Chemical Engineers, implemented within the computer program. The average societal risk obtained was 8.7x10{sup -7} fatality.year{sup -1} for the external population and 3.2x10{sup -4} fatality.year{sup -1} for the internal population (people working inside the terminal). The accident scenario contributing most to the societal risk was death from exposure to thermal radiation caused by a pool fire, at 84.3% of the total estimated for the external population and 82.9% for the people inside the terminal. The
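    The FN-curve presentation of societal risk mentioned above plots, for each fatality count N, the cumulative frequency F of all scenarios causing N or more fatalities. A minimal sketch with hypothetical scenario data (not the study's values):

```python
def fn_curve(scenarios):
    """Societal-risk F-N pairs: for each fatality count N, the cumulative
    frequency F (per year) of all scenarios causing N or more fatalities.
    `scenarios` is a list of (frequency_per_year, fatalities) tuples."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

# Hypothetical terminal scenarios: (frequency per year, expected fatalities).
scenarios = [(1e-4, 1), (5e-6, 10), (1e-7, 50)]
for n, f in fn_curve(scenarios):
    print(n, f)
```

The resulting pairs would be plotted on log-log axes and compared against the regulator's tolerability lines.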

  9. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    Science.gov (United States)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
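    A probabilistic quantitative precipitation forecast of the kind evaluated here is, in its simplest form, an exceedance probability estimated from the ensemble and scored against gauge observations, e.g. with the Brier score. A minimal sketch with hypothetical member values (the calibration and time-lagging steps of the study are omitted):

```python
def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members forecasting precipitation >= threshold."""
    return sum(m >= threshold for m in ensemble) / len(ensemble)

def brier_score(forecast_probs, observations):
    """Mean squared difference between forecast probability and the 0/1
    observed outcome; lower is better, 0 is perfect."""
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, observations)) / len(observations)

# Hypothetical 20-member forecasts (mm) at three gauges; event: >= 1 mm.
ensembles = [[0.0] * 15 + [2.0] * 5, [1.5] * 18 + [0.0] * 2, [0.0] * 20]
obs = [1, 1, 0]  # whether >= 1 mm was observed at each gauge
probs = [exceedance_probability(e, 1.0) for e in ensembles]
print(probs, round(brier_score(probs, obs), 4))
```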

  10. Does Completion of Quantitative Courses Predict Better Quantitative Reasoning-in-Writing Proficiency?

    Directory of Open Access Journals (Sweden)

    Nathan D. Grawe

    2013-07-01

    Using data from Carleton College, this study explores the connection between students’ completion of a range of quantitative courses and the quality of their quantitative reasoning in writing (QRW) as exhibited in courses throughout the undergraduate curriculum during the first two years of college. Because the assessment takes place in the context of a campus-wide initiative which has improved QRW on the whole, the study identifies course-taking patterns which predict stronger than average improvement. Results suggest QRW is not exceptionally improved by taking courses in statistics, principles of economics, or in the social sciences more broadly. QRW performance is, on the other hand, correlated strongly with having taken a first-year seminar specifically designed to teach QR thinking and communication. To a lesser degree, QRW is correlated with courses in the natural sciences and upper-level calculus. It is impossible to rule out all forms of selection bias explanations for these patterns. However, the broad pattern of correlations between QRW, courses, and standardized test scores argues for a causal interpretation.

  11. Quantitative Susceptibility Mapping and Dynamic Contrast Enhanced Quantitative Perfusion in Cerebral Cavernous Angiomas

    Science.gov (United States)

    Mikati, Abdul Ghani; Tan, Huan; Shenkar, Robert; Li, Luying; Zhang, Lingjiao; Guo, Xiaodong; Shi, Changbin; Liu, Tian; Wang, Yi; Shah, Akash; Edelman, Robert; Christoforidis, Gregory; Awad, Issam

    2015-01-01

    Background: Hyperpermeability and iron deposition are two central pathophysiological phenomena in human cerebral cavernous malformation (CCM) disease. Here we used two novel magnetic resonance imaging (MRI) techniques to establish a relationship between these phenomena. Methods: Subjects with CCM disease (4 sporadic and 18 familial) underwent MRI using the Dynamic Contrast Enhanced Quantitative Perfusion (DCEQP) and Quantitative Susceptibility Mapping (QSM) techniques, which measure hemodynamic factors of vessel leak and iron deposition respectively, both previously demonstrated in CCM disease. Regions of interest encompassing the CCM lesions were analyzed using these techniques. Results: Susceptibility measured by QSM was positively correlated with lesion permeability measured using DCEQP (r = 0.49, p < 0.0001). The correlation was not affected by factors including familial predisposition, lesion volume, the contrast agent and the use of statin medication. Susceptibility was correlated with lesional blood volume (r = 0.4, p = 0.0001), but not with lesional blood flow. Conclusion: The correlation between QSM and DCEQP suggests that the phenomena of permeability and iron deposition are related in CCM; hence "leakier" lesions also carry a greater cumulative iron burden. These techniques might be used as biomarkers to monitor the course of this disease and the effect of therapy. PMID:24302484
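    The lesion-level association reported above is a standard Pearson product-moment correlation across regions of interest. A minimal sketch with hypothetical (made-up) permeability and susceptibility values, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-lesion values: DCEQP permeability vs. QSM susceptibility.
permeability = [0.1, 0.3, 0.2, 0.5, 0.4]
susceptibility = [0.8, 1.1, 1.0, 1.6, 1.2]
print(round(pearson_r(permeability, susceptibility), 2))
```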

  12. Semi-quantitative and quantitative studies on the gamma radiolysis of C5-BTBP

    Energy Technology Data Exchange (ETDEWEB)

    Fermvik, A.; Ekberg, C. [Chalmers Univ. of Tech., Goeteborg (Sweden). Dept. of Nuclear Chemistry; Chalmers Univ. of Technology, Goeteborg (Sweden). Industrial Materials Recycling; Gruener, B.; Kvicalova, M. [Academy of Sciences of the Czech Republic, Husinec-Rez near Prague (Czech Republic). Inst. of Inorganic Chemistry

    2011-07-01

    An industrial liquid-liquid extraction process for reprocessing of spent nuclear fuel will inevitably lead to radiolysis of the phases, since the process streams contain highly radioactive species. Solvents containing one of the BTBP (6,6'-bis(5,6-dialkyl-[1,2,4]-triazin-3-yl)-2,2'-bipyridine) molecules intended for the separation of trivalent actinides (An) from lanthanides (Ln), the so-called C5-BTBP, have shown a dramatic decrease in both distribution ratios and An/Ln separation factor when irradiated; hence, the molecule is highly unstable towards radiolysis. HPLC, APCI(+)-MS, and LC-MS analyses were performed on irradiated solvents initially containing 0.005 M C5-BTBP dissolved in either hexanol or cyclohexanone. The decrease in concentration of the starting molecule as well as the increase in concentration of various degradation products were studied with quantitative and semi-quantitative measurements. Structures were suggested for the degradation products produced in the highest yields, and these were compared to previously proposed structures for the same products. (orig.)

  13. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  14. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    Science.gov (United States)

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  15. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs whose minor allele frequency (MAF) is not too small.
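The latent-variable formulation described above can be illustrated with a small simulation. Everything below (effect sizes, threshold, minor allele frequency, residual correlation) is an illustrative assumption, not a value taken from the study:

```python
import numpy as np

# Toy simulation of the joint model sketched above: a quantitative trait Y
# and a latent liability Z follow a bivariate normal distribution, and the
# qualitative trait D is 1 whenever Z exceeds a prespecified threshold.
rng = np.random.default_rng(0)
n = 10_000
beta_y, beta_z = 0.3, 0.2          # assumed pleiotropic SNP effects
rho = 0.5                          # residual correlation between Y and Z
g = rng.binomial(2, 0.3, size=n)   # SNP genotypes (0/1/2), MAF = 0.3

cov = np.array([[1.0, rho], [rho, 1.0]])
e = rng.multivariate_normal([0.0, 0.0], cov, size=n)

y = beta_y * g + e[:, 0]           # observed quantitative trait
z = beta_z * g + e[:, 1]           # unobserved latent liability
d = (z > 0.8).astype(int)          # qualitative trait via threshold

# A pleiotropic SNP associates with both phenotypes at once
r_y = np.corrcoef(g, y)[0, 1]
case_rate_diff = d[g == 2].mean() - d[g == 0].mean()
print(r_y, case_rate_diff)
```

A joint likelihood over (Y, D) exploits the residual correlation rho, which is what gives the bivariate test more power than two separate univariate scans.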

  16. Quantitative photoacoustic imaging of nanoparticles in cells and tissues.

    Science.gov (United States)

    Cook, Jason R; Frey, Wolfgang; Emelianov, Stanislav

    2013-02-26

    Quantitative visualization of nanoparticles in cells and tissues, while preserving the spatial information, is very challenging. A photoacoustic imaging technique to depict the presence and quantity of nanoparticles is presented. This technique is based on the dependence of the photoacoustic signal on both the nanoparticle quantity and the laser fluence. Quantitative photoacoustic imaging is a robust technique that does not require knowledge of the local fluence, but a relative change in the fluence. This eliminates the need for sophisticated methods or models to determine the energy distribution of light in turbid media. Quantitative photoacoustic imaging was first applied to nanoparticle-loaded cells, and quantitation was validated by inductively coupled plasma mass spectrometry. Quantitative photoacoustic imaging was then extended to xenograft tumor tissue sections, and excellent agreement with traditional histopathological analysis was demonstrated. Our results suggest that quantitative photoacoustic imaging may be used in many applications including the determination of the efficiency and effectiveness of molecular targeting strategies for cell studies and animal models, the quantitative assessment of photoacoustic contrast agent biodistribution, and the validation of in vivo photoacoustic imaging.

  17. 1, 2, 3, 4: infusing quantitative literacy into introductory biology.

    Science.gov (United States)

    Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.

  18. Quantification of transcript levels with quantitative RT-PCR.

    Science.gov (United States)

    Carleton, Karen L

    2011-01-01

    Differential gene expression is a key factor driving phenotypic divergence. Determining when and where gene expression has diverged between organisms requires a quantitative method. While large-scale approaches such as microarrays or high-throughput mRNA sequencing can identify candidates, quantitative RT-PCR is the definitive method for confirming gene expression differences. Here, we describe the steps for performing qRT-PCR including extracting total RNA, reverse-transcribing it to make a pool of cDNA, and then quantifying relative expression of a few candidate genes using real-time or quantitative PCR.
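Relative expression from qPCR cycle-threshold (Ct) values is commonly computed with the comparative ddCt method; the sketch below uses that common formula as an assumption, since the chapter's exact calculation is not reproduced here:

```python
# Hedged sketch: relative expression via the comparative ddCt method.
# A target gene is normalized to a reference (housekeeping) gene, then a
# treated sample is compared against a control condition.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control,
                        efficiency=2.0):
    """Fold change of the target gene (sample vs. control), assuming the
    same amplification efficiency for target and reference genes."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return efficiency ** (-dd_ct)

# Target crosses threshold 2 cycles earlier relative to the control:
fold = relative_expression(22.0, 18.0, 24.0, 18.0)
print(fold)  # -> 4.0 (a 4-fold up-regulation)
```

With a perfect doubling per cycle (efficiency 2.0), each cycle of earlier threshold crossing corresponds to a 2-fold difference in starting template.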

  19. The Quantitative Linear-Time–Branching-Time Spectrum

    DEFF Research Database (Denmark)

    Thrane, Claus; Fahrenberg, Uli; Legay, Axel

    2011-01-01

    We present a distance-agnostic approach to quantitative verification. Taking as input an unspecified distance on system traces, or executions, we develop a game-based framework which allows us to define a spectrum of different interesting system distances corresponding to the given trace distance. … Thus we extend the classic linear-time–branching-time spectrum to a quantitative setting, parametrized by trace distance. We also prove a general transfer principle which allows us to transfer counterexamples from the qualitative to the quantitative setting, showing that all system distances …

  20. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    Science.gov (United States)

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application.

  1. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  2. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting … quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed … relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS …

  3. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE.

    Science.gov (United States)

    Ojagbemi, A

    2017-06-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the PubMed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan Africa and Europe. The paper concludes that an integration of both the qualitative and quantitative research approaches may provide a better platform for unraveling the complex phenomenon of suicide in the elderly.

  4. Development of Quantitative Framework for Event Significance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Durk Hun; Kim, Min Chull [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kim, Inn Seock [ISSA Technology, Maryland (United States)

    2010-10-15

    There is an increasing trend toward quantitative evaluation of the safety significance of operational events using Probabilistic Safety Assessment (PSA) techniques. An integrated framework for evaluation of event significance has been developed by the Korea Institute of Nuclear Safety (KINS), which consists of an assessment hierarchy and a number of matrices. The safety significance of various events, e.g., internal or external initiating events that occurred during at-power or shutdown conditions, can be quantitatively analyzed using this framework, and the events then rated according to their significance. This paper briefly describes the basic concept of the integrated quantitative framework for evaluation of event significance, focusing on the assessment hierarchy.

  5. Immunology by numbers: quantitation of antigen presentation completes the quantitative milieu of systems immunology!

    Science.gov (United States)

    Purcell, Anthony W; Croft, Nathan P; Tscharke, David C

    2016-06-01

    We review approaches to quantitate antigen presentation using a variety of biological and biochemical readouts and highlight the emerging role of mass spectrometry (MS) in defining and quantifying MHC-bound peptides presented at the cell surface. The combination of high mass accuracy in the determination of the molecular weight of the intact peptide of interest and its signature pattern of fragmentation during tandem MS provide an unambiguous and definitive identification. This is in contrast to the potential receptor cross-reactivity towards closely related peptides and variable dose responsiveness seen in biological readouts. In addition, we gaze into the not too distant future where big data approaches in MS can be accommodated to quantify whole immunopeptidomes both in vitro and in vivo.

  6. Childhood white matter disorders : quantitative MR imaging and spectroscopy

    NARCIS (Netherlands)

    van der Voorn, J Patrick; Pouwels, Petra J W; Hart, Augustinus A M; Serrarens, Judith; Willemsen, Michèl A A P; Kremer, Hubertus P H; Barkhof, Frederik; van der Knaap, Marjo S

    2006-01-01

    PURPOSE: To prospectively investigate whether quantitative magnetic resonance (MR) parameters, including magnetization transfer ratio (MTR), apparent diffusion coefficient (ADC), fractional anisotropy (FA), and MR spectroscopic metabolite concentrations, allow for discrimination between different ty

  7. Childhood white matter disorders: quantitative MR imaging and spectroscopy.

    NARCIS (Netherlands)

    Voorn, J.P. van der; Pouwels, P.J.; Hart, A.A.M.; Serrarens, J.; Willemsen, M.A.A.P.; Kremer, H.P.H.; Barkhof, F.; Knaap, M.S. van der

    2006-01-01

    PURPOSE: To prospectively investigate whether quantitative magnetic resonance (MR) parameters, including magnetization transfer ratio (MTR), apparent diffusion coefficient (ADC), fractional anisotropy (FA), and MR spectroscopic metabolite concentrations, allow for discrimination between different ty

  8. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    Science.gov (United States)

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
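The standard-addition method mentioned in the abstract can be sketched numerically; the spike levels and absorbance readings below are invented for illustration:

```python
import numpy as np

# Standard-addition sketch: equal aliquots of sample are spiked with known
# amounts of analyte, the signal is fitted linearly against the added
# concentration, and the unknown concentration is read off as the
# magnitude of the x-intercept. Data values are illustrative.
added = np.array([0.0, 1.0, 2.0, 3.0])       # spiked concentration (ug/mL)
signal = np.array([0.40, 0.60, 0.80, 1.00])  # measured response

slope, intercept = np.polyfit(added, signal, 1)
c_unknown = intercept / slope                # |x-intercept|
print(round(c_unknown, 6))  # -> 2.0 ug/mL
```

Standard addition compensates for matrix effects because the calibration is performed in the sample matrix itself, at the cost of extrapolating rather than interpolating.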

  9. Lesion detection and quantitation of positron emission mammography

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Jinyi; Huesman, Ronald H.

    2001-12-01

    A Positron Emission Mammography (PEM) scanner dedicated to breast imaging is being developed at our laboratory. We have developed a list-mode likelihood reconstruction algorithm for this scanner. Here we theoretically study lesion detection and quantitation. Lesion detectability is studied using computer observers. We found that for the zero-order quadratic prior, the region-of-interest observer can achieve the performance of the prewhitening observer with a properly selected smoothing parameter. We also study lesion quantitation using the test statistic of the region-of-interest observer. Theoretical expressions for the bias, variance, and ensemble mean squared error of the quantitation are derived. Computer simulations show that the theoretical predictions are in good agreement with the Monte Carlo results for both lesion detection and quantitation.
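The bias/variance/ensemble-MSE decomposition used for quantitation can be illustrated with a toy Monte Carlo; the shrinkage factor and noise level below are assumptions for illustration, not values from the paper:

```python
import numpy as np

# Toy ensemble study of an ROI quantitation estimator: smoothing from the
# reconstruction prior is modeled as multiplicative shrinkage of the true
# lesion amplitude plus additive noise. Parameter values are illustrative.
rng = np.random.default_rng(1)
true_uptake = 10.0
shrink = 0.9      # assumed resolution loss from the smoothing prior
sigma = 0.5       # assumed noise standard deviation of the estimate
est = shrink * true_uptake + sigma * rng.standard_normal(100_000)

bias = est.mean() - true_uptake
var = est.var()
mse = np.mean((est - true_uptake) ** 2)
print(bias, var, mse)   # ensemble MSE decomposes as bias**2 + variance
```

The stronger the smoothing, the more negative the bias and the smaller the variance; the ensemble MSE makes that trade-off explicit.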

  10. Quantitative stem cell biology: the threat and the glory.

    Science.gov (United States)

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  11. Quantitative Assays for RAS Pathway Proteins and Phosphorylation States

    Science.gov (United States)

    The NCI CPTAC program is applying its expertise in quantitative proteomics to develop assays for RAS pathway proteins. Targets include key phosphopeptides that should increase our understanding of how the RAS pathway is regulated.

  12. Collecting data for quantitative research on pluvial flooding

    NARCIS (Netherlands)

    Spekkers, M.H.; Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.

    2011-01-01

    Urban pluvial flood management requires detailed spatial and temporal information on flood characteristics and damaging consequences. There is a lack of quantitative field data on pluvial flooding, resulting in large uncertainties in urban flood model calculations and ensuing decisions for investments

  13. qualitative and quantitative methods of suicide research in old age

    African Journals Online (AJOL)

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the … Moreover, this type of data collection method may engender a higher …

  14. Some thoughts on humanitarian logistics and quantitative methods

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2008-05-01

    Full Text Available Some of the research issues in humanitarian logistics and quantitative methods discussed in this presentation are Identifying people in a disaster; Facilitating movement of people and aid; Geographic Information Services (GIS) to support...

  15. Quantitative comparison of ammonia and 3-indoleacetic acid ...

    African Journals Online (AJOL)

    Quantitative comparison of ammonia and 3-indoleacetic acid production in ... method and 3-indoleacetic acid as Salkowski method in halophilic, alkalophilic and ... in research due to their ease of implementation and relatively accurate results.

  16. Collecting data for quantitative research on pluvial flooding

    NARCIS (Netherlands)

    Spekkers, M.H.; Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.

    2011-01-01

    Urban pluvial flood management requires detailed spatial and temporal information on flood characteristics and damaging consequences. There is a lack of quantitative field data on pluvial flooding, resulting in large uncertainties in urban flood model calculations and ensuing decisions for investments

  17. Researching on quantitative project management plan and implementation method

    Science.gov (United States)

    Wang, Xin; Ren, Aihua; Liu, Xiangshang

    2017-08-01

    With the practice of high-maturity process improvement, more and more attention has been paid to CMMI and other process-improvement frameworks. The key to achieving high maturity is quantifying the process. At present, descriptions of high-maturity software process improvement lack a concrete introduction to the corresponding improvement steps and their implementation. Based on the quantitative-management framework and the statistical analysis techniques recommended for high maturity, this paper proposes planning and implementation methods for enterprise process improvement. These methods provide the enterprise with quantitative process management as well as a systematic procedure for quantitative project management, and finally evaluate the effectiveness of quantitatively managed projects. The method is then used to verify the effectiveness of the framework in guiding an enterprise toward high process maturity.

  18. Quantitative aspects of oxygen and carbon dioxide exchange ...

    African Journals Online (AJOL)

    Quantitative aspects of oxygen and carbon dioxide exchange through the lungs in ... and O2 exchange rates using the method of instantaneous measurements of ... density of 1.044, float nearly weightless with a minimum of body movements.

  19. Modeling the Effect of Polychromatic Light in Quantitative Absorbance Spectroscopy

    Science.gov (United States)

    Smith, Rachel; Cantrell, Kevin

    2007-01-01

    A laboratory experiment is conducted to give students practical experience with the principles of electronic absorbance spectroscopy. This straightforward approach creates a powerful tool for exploring many aspects of quantitative absorbance spectroscopy.
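The effect modeled in the experiment can be reproduced in a few lines: when the instrument passes two spectral lines of different molar absorptivity, the detector sums their transmitted intensities, so the apparent absorbance falls below the ideal Beer-Lambert line at higher concentration. The absorptivities chosen are illustrative assumptions:

```python
import numpy as np

# Polychromatic-light deviation from Beer-Lambert behavior: the measured
# absorbance is computed from the mean transmittance of the two lines,
# not from the mean absorbance. Values are illustrative.
eps = np.array([1000.0, 400.0])    # molar absorptivities (L mol^-1 cm^-1)
b = 1.0                            # path length (cm)

def apparent_absorbance(c):
    transmittance = np.mean(10.0 ** (-eps * b * c))
    return -np.log10(transmittance)

def ideal_absorbance(c):           # single "effective" wavelength
    return eps.mean() * b * c

for c in (1e-4, 1e-3):
    print(c, apparent_absorbance(c), ideal_absorbance(c))
```

The negative deviation grows with concentration, which is why a narrow spectral bandpass matters for quantitative work.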

  20. Quantitative interferometric microscopy cytometer based on regularized optical flow algorithm

    Science.gov (United States)

    Xue, Liang; Vargas, Javier; Wang, Shouyu; Li, Zhenhua; Liu, Fei

    2015-09-01

    Cell detection and analysis are important in various fields, such as medical observation and disease diagnosis. In order to analyze cell parameters as well as observe samples directly, in this paper we present an improved quantitative interferometric microscopy cytometer, which can monitor the quantitative phase distributions of bio-samples and generate cellular parameter statistics. The proposed system is able to recover the phase imaging of biological samples over an expanded field of view via a regularized optical flow demodulation algorithm. This algorithm reconstructs the phase distribution with high accuracy from only two interferograms acquired at different time points, simplifying the scanning system. Additionally, the method is fully automatic, which makes it convenient for establishing a quantitative phase cytometer. Moreover, the phase retrieval approach is robust against noise and background. Excitingly, red blood cells are readily investigated with the quantitative interferometric microscopy cytometer system.

  1. Application study of transport intensity equation in quantitative phase reconstruction

    Science.gov (United States)

    Song, Xiaojun; Cheng, Wei; Wei, Chunjuan; Xue, Liang; Liu, Weijing; Bai, Baodan; Chu, Fenghong

    2016-10-01

    To improve the speed and accuracy of biological cell detection, a quantitative, non-interferometric optical phase recovery method is proposed for a commercial microscope, taking red blood cells as classical phase objects. Three bright-field micrographs were collected in the experiment. Utilizing the transport-of-intensity equation (TIE), the quantitative phase distributions of red blood cells are obtained and agree well with previous optical phase models. Analysis shows that the resolution of the introduced system reaches the sub-micron level. This method not only quickly gives the quantitative phase distribution of cells, but also measures a large number of cells simultaneously. It therefore has potential for real-time observation and quantitative analysis of cells in vivo.
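Under a uniform-intensity approximation, the TIE reduces to a Poisson equation, laplacian(phi) = -(k/I0) * dI/dz, which can be inverted with FFTs. The sketch below is a generic solver of that form, not the authors' exact pipeline; the grid size, wavelength, and test phase object are invented:

```python
import numpy as np

# FFT inversion of the transport-of-intensity equation (TIE) under the
# uniform-intensity approximation: laplacian(phi) = -(k / I0) * dI/dz.
n, wavelength, I0 = 128, 633e-9, 1.0
k = 2 * np.pi / wavelength
f = 2 * np.pi * np.fft.fftfreq(n)
kx, ky = np.meshgrid(f, f)
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                     # guard against division by zero at DC

def tie_solve(dI_dz):
    rhs = -(k / I0) * dI_dz
    phi_hat = np.fft.fft2(rhs) / (-k2)   # inverse Laplacian in Fourier space
    phi_hat[0, 0] = 0.0                  # phase is recovered up to a constant
    return np.fft.ifft2(phi_hat).real

# Self-check: synthesize dI/dz from a known smooth phase, then recover it.
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x)
phi_true = np.exp(-(X**2 + Y**2) / 0.1)          # cell-like phase bump
lap_phi = np.fft.ifft2(-k2 * np.fft.fft2(phi_true)).real
dI_dz = -(I0 / k) * lap_phi
phi_rec = tie_solve(dI_dz)
err = np.max(np.abs(phi_rec - (phi_true - phi_true.mean())))
print(err)   # tiny: recovery up to fp roundoff and a constant offset
```

In practice dI/dz is estimated by finite differences from defocused intensity images (e.g., the three bright-field micrographs mentioned above), which adds noise that the Fourier inversion tends to amplify at low spatial frequencies.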

  2. Doing Quantitative Grounded Theory: A theory of trapped travel consumption

    Directory of Open Access Journals (Sweden)

    Mark S. Rosenbaum, Ph.D.

    2008-11-01

    All is data. Grounded theorists employ this sentence in their quest to create original theoretical frameworks. Yet researchers typically interpret the word “data” to mean qualitative data or, more specifically, interview data collected from respondents. This is not to say that qualitative data is deficient; however, grounded theorists may be missing vast opportunities to create pioneering theories from quantitative data. Indeed, Glaser and Strauss (1967) argued that researchers would use qualitative and/or quantitative data to fashion original frameworks and related hypotheses, and Glaser's (2008) recently published book, titled Doing Quantitative Grounded Theory, is an attempt to help researchers understand how to use quantitative data for grounded theory (GT).

  3. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  4. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with ASON/GMPLS simulator, a subnetwork protection scheme achieved best balanced performance in resource utilization and restoration time.

  5. Quantitative and qualitative analysis of sterols/sterolins and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-06-03

    Jun 3, 2008 ... Quantitative and qualitative analysis of sterols/sterolins ... method was developed to identify and quantify sterols (especially β-sitosterol) in chloroform extracts of ... Studies with phytosterols, especially β-sitosterol, have.

  6. Indium adhesion provides quantitative measure of surface cleanliness

    Science.gov (United States)

    Krieger, G. L.; Wilson, G. J.

    1968-01-01

    Indium tipped probe measures hydrophobic and hydrophilic contaminants on rough and smooth surfaces. The force needed to pull the indium tip, which adheres to a clean surface, away from the surface provides a quantitative measure of cleanliness.

  7. The potential and limitations of quantitative electromyography in equine medicine

    NARCIS (Netherlands)

    Wijnberg, Inge D; Franssen, Hessel

    2015-01-01

    This review discusses the scope of using (quantitative) electromyography (EMG) in diagnosing myopathies and neuropathies in equine patients. In human medicine, many EMG methods are available for the diagnosis, pathophysiological description and evaluation, monitoring, or rehabilitation of patients,

  8. A novel quantitative light‑induced fluorescence device for monitoring ...

    African Journals Online (AJOL)

    2015-08-06

    Aug 6, 2015 ... clinician to record the subject's identification, date of visit, and images of … provided automatic quantitative analysis of lesions via its … logistics of dental health care, one of the deciding factors is its economic ...

  9. A Practical Qualitative+Quantitative Method-S-ANN

    Institute of Scientific and Technical Information of China (English)

    GUAN Wei; SHEN Jin-sheng; LI Peng-fei

    2002-01-01

    In this paper, a practical qualitative + quantitative method named S-ANN is proposed as a forecasting tool, in which an artificial neural network (ANN) handles the quantitative knowledge and the SCENARIO method of systems engineering handles the qualitative knowledge. As a case study, the S-ANN method is employed to forecast the ridership of Beijing public transportation; the results show that the S-ANN method is feasible and easy to operate.

  10. Mineralization of human premolar occlusal fissures: a quantitative histochemical microanalysis

    OpenAIRE

    Campos, Antonio; Rodriguez, I. A.; Sanchez-Quevedo, M.C.; García, J. M.; Nieto-Albano, O.H.; Gómez de Ferraris, M. E.

    2000-01-01

    The mechanisms of cariogenesis in occlusal fissures remain elusive because of limited information about fissure structure and wall mineralization. The purpose of the present study was to determine the correlation between morphological patterns in occlusal fissures in human premolars and quantitative histochemical patterns of mineralization in the walls of these formations. We used scanning electron microscopy and quantitative X-ray microanalysis with the peak-t...

  11. Quantitative Evaluation of Bioorthogonal Chemistries for Surface Functionalization of Nanoparticles

    DEFF Research Database (Denmark)

    Feldborg, Lise Nørkjær; Jølck, Rasmus Irming; Andresen, Thomas Lars

    2012-01-01

    We present here a highly efficient and chemoselective liposome functionalization method based on oxime bond formation between a hydroxylamine and an aldehyde-modified lipid component. We have conducted a systematic and quantitative comparison of this new approach with other state-of-the-art … affinity between the peptide and the liposome surface. These studies demonstrate the importance of choosing the correct chemistry in order to obtain a quantitative surface functionalization of liposomes.

  12. Quantitative-PCR Assessment of Cryptosporidium parvum Cell Culture Infection

    OpenAIRE

    Di Giovanni, George D.; LeChevallier, Mark W.

    2005-01-01

    A quantitative TaqMan PCR method was developed for assessing the Cryptosporidium parvum infection of in vitro cultivated human ileocecal adenocarcinoma (HCT-8) cell cultures. This method, termed cell culture quantitative sequence detection (CC-QSD), has numerous applications, several of which are presented. CC-QSD was used to investigate parasite infection in cell culture over time, the effects of oocyst treatment on infectivity and infectivity assessment of different C. parvum isolates. CC-Q...

  13. Quantitative Convergence of Concepts in Physical Cosmology and Theology

    Science.gov (United States)

    Persinger, Michael A.; Burke, Ryan C.; Carniello, Trevor N.

    2012-09-01

    Physical cosmology and theology both explore the maximum boundary conditions of space and time. The possibility of consciousness and information involving the largest and smallest spaces and times within the universe is supported quantitatively by the physical properties of matter and the organization of the human brain. There are important roles for both approaches as required contrasts to discern the neurocognitive and quantitative equivalents that could facilitate discovery.

  14. Partitioning and lipophilicity in quantitative structure-activity relationships.

    OpenAIRE

    Dearden, J. C.

    1985-01-01

    The history of the relationship of biological activity to partition coefficient and related properties is briefly reviewed. The dominance of partition coefficient in quantitation of structure-activity relationships is emphasized, although the importance of other factors is also demonstrated. Various mathematical models of in vivo transport and binding are discussed; most of these involve partitioning as the primary mechanism of transport. The models describe observed quantitative structure-ac...

  15. The Quantitative-Qualitative Controversy in Marketing Research

    OpenAIRE

    Brunello Adrian; Petruşcă Claudia-Ioana

    2011-01-01

    A critical point in the process of establishing a research methodology is represented by the choice related to the type of analysis that should be used: quantitative or qualitative. The arguments in favor of quantitative or qualitative research have been the subject of a large number of scientific articles. The difference that can be made between the two research methods refers to each article’s technical specificity and style. In a marketing research, in order to respond to the requirements ...

  16. Qualitative vs. quantitative software process simulation modelling: conversion and comparison

    OpenAIRE

    Zhang, He; Kitchenham, Barbara; Jeffery, Ross

    2009-01-01

    Peer-reviewed. Software Process Simulation Modeling (SPSM) research has increased in the past two decades. However, most of these models are quantitative, which require detailed understanding and accurate measurement. Continuing our previous studies on qualitative modeling of software processes, this paper aims to investigate the structural equivalence and model conversion between quantitative and qualitative process modeling, and to compare the characteristics and performance o...

  17. A quantitative analysis of Salmonella Typhimurium metabolism during infection

    OpenAIRE

    Steeb, Benjamin

    2012-01-01

    In this thesis, Salmonella metabolism during infection was investigated. The goal was to gain a quantitative and comprehensive understanding of Salmonella in vivo nutrient supply, utilization and growth. To achieve this goal, we used a combined experimental / in silico approach. First, we generated a reconstruction of Salmonella metabolism ([1], see 2.1). This reconstruction was then combined with in vivo data from experimental mutant phenotypes to build a comprehensive quantitative in viv...

  18. A Solved Model to Show Insufficiency of Quantitative Adiabatic Condition

    Institute of Scientific and Technical Information of China (English)

    LIU Long-Jiang; LIU Yu-Zhen; TONG Dian-Min

    2009-01-01

    The adiabatic theorem is a useful tool for processing slowly evolving quantum systems, but its practical application depends on the quantitative condition expressed through the Hamiltonian's eigenvalues and eigenstates, which is usually taken as a sufficient condition. Recently, the sufficiency of the condition was questioned, and several counterexamples have been reported. Here we present a new solved model to show the insufficiency of the traditional quantitative adiabatic condition.
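For reference, the traditional quantitative adiabatic condition discussed in this abstract is usually stated in the following textbook form (a standard formulation, not quoted from the paper itself):

```latex
\[
\max_{t \in [0,T]}
\frac{\left| \langle m(t) \,|\, \partial_t H(t) \,|\, n(t) \rangle \right|}
     {\left( E_n(t) - E_m(t) \right)^{2}}
\;\ll\; 1 ,
\qquad m \neq n ,
\]
```

i.e. the coupling induced by the time variation of the Hamiltonian must be small compared with the squared energy gap; the counterexamples referenced above show this alone is not sufficient.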

  19. Quantitative MRI analysis of dynamic enhancement of focal liver lesions

    Directory of Open Access Journals (Sweden)

    S. S. Bagnenko

    2012-01-01

    Full Text Available In our study, 45 patients with different focal liver lesions (110 nodules) were examined using a high-field MR system (1.5 T). During this investigation, quantitative MRI analysis of the dynamic enhancement of various hepatic lesions and parenchymatous abdominal organs was performed. It was shown that quantitative evaluation of enhanced MRI improves understanding of the vascular transformation processes in pathologic hepatic foci and in the liver itself, which is important for the differential diagnosis of these diseases.

  20. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    Full Text Available We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition.

  1. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    Science.gov (United States)

    2010-10-28

    is met. Gelo, Braakman, and Benetka (2008) describe qualitative and quantitative paradigms as such: Quantitative paradigms see reality as single...and tangible, where the knower and the known are considered as relatively separate and independent. Qualitative paradigms, however, view reality as a...experimental research or the positivist approach) will typically be utilized to explore and answer questions about relationships with measured variables that

  2. Classification of cassava genotypes based on qualitative and quantitative data.

    Science.gov (United States)

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.
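As a rough illustration of the first stage of such mixed-trait clustering, a Gower-type dissimilarity is a standard way to combine continuous and categorical variables into one distance. This is only a sketch of that general idea, not the paper's exact Ward-MLM procedure, and the trait values below are hypothetical:

```python
def gower_distance(a, b, ranges, is_categorical):
    """Gower-type dissimilarity for mixed trait vectors.

    Quantitative traits contribute |a_i - b_i| / range_i (range-normalized);
    qualitative traits contribute a 0/1 mismatch indicator.
    The result is the mean contribution over all traits, in [0, 1].
    """
    terms = []
    for x, y, r, cat in zip(a, b, ranges, is_categorical):
        if cat:
            terms.append(0.0 if x == y else 1.0)
        else:
            terms.append(abs(x - y) / r)
    return sum(terms) / len(terms)

# Two hypothetical accessions: (plant height in cm, root color, leaf shape)
acc1 = (150.0, "white", "lobed")
acc2 = (250.0, "cream", "lobed")
d = gower_distance(acc1, acc2,
                   ranges=(200.0, None, None),      # range only needed for continuous traits
                   is_categorical=(False, True, True))
```

A hierarchical method such as Ward's can then cluster the accessions from the resulting dissimilarity matrix.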

  3. Quantitative photoacoustic image reconstruction improves accuracy in deep tissue structures.

    Science.gov (United States)

    Mastanduno, Michael A; Gambhir, Sanjiv S

    2016-10-01

    Photoacoustic imaging (PAI) is emerging as a potentially powerful imaging tool with multiple applications. Image reconstruction for PAI has been relatively limited because of limited or no modeling of light delivery to deep tissues. This work demonstrates a numerical approach to quantitative photoacoustic image reconstruction that minimizes depth and spectrally derived artifacts. We present the first time-domain quantitative photoacoustic image reconstruction algorithm that models optical sources through acoustic data to create quantitative images of absorption coefficients. We demonstrate quantitative accuracy of less than 5% error in large 3 cm diameter 2D geometries with multiple targets and within 22% error in the largest quantitative photoacoustic studies to date (6 cm diameter). We extend the algorithm to spectral data, reconstructing 6 varying chromophores to within 17% of the true values. This quantitative PA tomography method was able to improve considerably on filtered back-projection from the standpoint of image quality and absolute and relative quantification in all our simulation geometries. We characterize the effects of time step size, initial guess, and source configuration on final accuracy. This work could help to generate accurate quantitative images from both endogenous absorbers and exogenous photoacoustic dyes in both preclinical and clinical work, thereby increasing the information content obtained especially from deep-tissue photoacoustic imaging studies.

  4. Refining the quantitative pathway of the Pathways to Mathematics model.

    Science.gov (United States)

    Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda

    2015-03-01

    In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  6. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    Science.gov (United States)

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
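The sensitivity, specificity, and accuracy figures reported in diagnostic studies like this one derive from a standard 2×2 confusion table against the reference standard. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion table.

    tp/fn: diseased patients called positive/negative by the test;
    fp/tn: disease-free patients called positive/negative.
    """
    sensitivity = tp / (tp + fn)            # true-positive rate
    specificity = tn / (tn + fp)            # true-negative rate
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for a 67-patient cohort (illustrative only)
sens, spec, acc = diagnostic_metrics(tp=25, fn=5, fp=7, tn=30)
```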

  7. The Case for Infusing Quantitative Literacy into Introductory Geoscience Courses

    Directory of Open Access Journals (Sweden)

    Jennifer M. Wenner

    2009-01-01

    Full Text Available We present the case for introductory geoscience courses as model venues for increasing the quantitative literacy (QL of large numbers of the college-educated population. The geosciences provide meaningful context for a number of fundamental mathematical concepts that are revisited several times in a single course. Using some best practices from the mathematics education community surrounding problem solving, calculus reform, pre-college mathematics and five geoscience/math workshops, geoscience and mathematics faculty have identified five pedagogical ideas to increase the QL of the students who populate introductory geoscience courses. These five ideas include techniques such as: place mathematical concepts in context, use multiple representations, use technology appropriately, work in groups, and do multiple-day, in-depth problems that place quantitative skills in multiple contexts. We discuss the pedagogical underpinnings of these five ideas and illustrate some ways that the geosciences represent ideal places to use these techniques. However, the inclusion of QL in introductory courses is often met with resistance at all levels. Faculty who wish to include quantitative content must use creative means to break down barriers of public perception of geoscience as qualitative, administrative worry that enrollments will drop and faculty resistance to change. Novel ways to infuse QL into geoscience classrooms include use of web-based resources, shadow courses, setting clear expectations, and promoting quantitative geoscience to the general public. In order to help faculty increase the QL of geoscience students, a community-built faculty-centered web resource (Teaching Quantitative Skills in the Geosciences houses multiple examples that implement the five best practices of QL throughout the geoscience curriculum. We direct faculty to three portions of the web resource: Teaching Quantitative Literacy, QL activities, and the 2006 workshop website

  8. Absolute quantitation of stevioside and rebaudioside A in commercial standards by quantitative NMR.

    Science.gov (United States)

    Tada, Atsuko; Takahashi, Kana; Ishizuki, Kyoko; Sugimoto, Naoki; Suematsu, Takako; Arifuku, Kazunori; Tahara, Maiko; Akiyama, Takumi; Ito, Yusai; Yamazaki, Takeshi; Akiyama, Hiroshi; Kawamura, Yoko

    2013-01-01

    The extract prepared from the leaves of Stevia rebaudiana BERTONI (Asteraceae) contains sweet steviol glycosides, mainly stevioside and rebaudioside A. Highly purified stevia extracts have become popular worldwide as a natural, low-calorie sweetener. They contain various types of steviol glycosides, and their main components are stevioside and rebaudioside A. The content of each steviol glycoside is quantified by comparing the ratios of the molecular weights and the chromatographic peak areas of the samples to those of stevioside or rebaudioside A standards of the Food and Agriculture Organization of the United Nations (FAO)/World Health Organization (WHO) Joint Expert Committee on Food Additives (JECFA) and other specifications. However, various commercial standard reagents of stevioside and rebaudioside A are available. Their purities are different and their exact purities are not indicated. Therefore, the measured values of stevioside and rebaudioside A contained in a sample vary according to the standard used for the quantification. In this study, we utilized an accurate method, quantitative NMR (qNMR), for determining the contents of stevioside and rebaudioside A in standards, with traceability to the International System of Units (SI units). The purities of several commercial standards were determined to confirm their actual values.
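The quantification described above follows the standard qNMR purity relation: signal areas scaled by proton counts, molar masses, and weighed masses relative to a certified calibrant. A minimal sketch with hypothetical values, not the study's measurements:

```python
def qnmr_purity(I_x, N_x, M_x, m_x, I_cal, N_cal, M_cal, m_cal, P_cal):
    """Purity of analyte x relative to an internal calibrant (standard qNMR relation).

    I: integrated signal area, N: number of protons giving that signal,
    M: molar mass (g/mol), m: weighed mass (mg), P: purity (mass fraction).
    """
    return (I_x / I_cal) * (N_cal / N_x) * (M_x / M_cal) * (m_cal / m_x) * P_cal

# Hypothetical round numbers chosen so the analyte comes out 100% pure
p = qnmr_purity(I_x=1.0, N_x=1, M_x=200.0, m_x=20.0,
                I_cal=1.0, N_cal=1, M_cal=100.0, m_cal=10.0, P_cal=1.0)
```

Because the calibrant's purity is traceable to SI units, the computed analyte purity inherits that traceability.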

  9. [Quantitative study on esophageal cytology. I. Quantitative morphologic studies of normal, dysplastic and malignant squamous cells].

    Science.gov (United States)

    Xiang, Y V

    1990-03-01

    On cytosmears of esophageal epithelium from individuals in a high-risk area for esophageal cancer, squamous epithelial cells can be categorized, according to standard cytologic diagnostic criteria, as normal, hyperplasia, severe dysplasia grade I and grade II, near-carcinoma, and early carcinoma. Cytosmears from 60 patients, 10 for each category, were studied with a semiautomatic image analysis system. Thirteen morphologic parameters so obtained were further analyzed by computer-based stepwise regression and linear correlation analyses. The results showed that the following 5 parameters could be used to judge the nature of the cells: a) cytoplasmic area, b) cytoplasmic mean diameter, c) cytoplasmic form factor, d) nuclear form factor, and e) N/C ratio. Compared with cells of the other categories, the values of the first 4 parameters were decreased in early cancer cells, whereas that of the fifth parameter was significantly increased. From normal to hyperplastic to dysplastic cells, the nuclear area and mean nuclear diameter increased progressively; they were therefore the major parameters for judging the degree of hyperplasia and dysplasia. These numerical features of morphologic quantitation conform with the light-microscopic cytologic diagnostic criteria for cancer, hyperplasia, and dysplasia. This indicates that visual judgement is relatively accurate and that applying an ocular micrometer to measure the cells would make this grading more objective.
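The shape parameters named in this abstract follow standard morphometric definitions: the form factor is 4πA/P² (1.0 for a perfect circle, smaller for irregular outlines), and the N/C ratio is the nucleus-to-cytoplasm area ratio. A minimal sketch of both:

```python
import math

def form_factor(area, perimeter):
    """Circularity descriptor 4*pi*A / P^2: 1.0 for a perfect circle,
    approaching 0 for highly irregular contours."""
    return 4 * math.pi * area / perimeter ** 2

def nc_ratio(nuclear_area, cytoplasmic_area):
    """Nucleus-to-cytoplasm area ratio; an increase is a classic
    quantitative marker of malignancy."""
    return nuclear_area / cytoplasmic_area

# Sanity check on a circle of radius 10 (area pi*r^2, perimeter 2*pi*r)
r = 10.0
ff = form_factor(math.pi * r ** 2, 2 * math.pi * r)
```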

  10. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    Science.gov (United States)

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-04-13

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The PSV, EDV, and pixel intensity are valuable in determining moderate to severe CKD. The value of shear wave velocity in
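The areas under the receiver operating characteristic curves reported for markers such as PSV and EDV can be computed, for a single marker, as the probability that a randomly chosen patient scores higher than a randomly chosen control (ties counting one half). A minimal sketch with hypothetical scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via pairwise comparison:
    the probability that a positive (diseased) case scores higher
    than a negative (healthy) one, with ties counted as 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical marker values for CKD patients vs. healthy controls
a = auc([0.9, 0.8, 0.7], [0.6, 0.75, 0.4])
```

An AUC near 1.0 (like the 0.97 reported for EDV) means the marker almost always separates the two groups.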

  11. Ultrafast quantitative time-stretch imaging flow cytometry of phytoplankton

    Science.gov (United States)

    Lai, Queenie T. K.; Lau, Andy K. S.; Tang, Anson H. L.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2016-03-01

    Comprehensive quantification of phytoplankton abundance, sizes and other parameters, e.g. biomasses, has been an important, yet daunting task in aquatic sciences and biofuel research. It is primarily because of the lack of effective tool to image and thus accurately profile individual microalgae in a large population. The phytoplankton species are highly diversified and heterogeneous in terms of their sizes and the richness in morphological complexity. This fact makes time-stretch imaging, a new ultrafast real-time optical imaging technology, particularly suitable for ultralarge-scale taxonomic classification of phytoplankton together with quantitative image recognition and analysis. We here demonstrate quantitative imaging flow cytometry of single phytoplankton based on quantitative asymmetric-detection time-stretch optical microscopy (Q-ATOM) - a new time-stretch imaging modality for label-free quantitative phase imaging without interferometric implementations. Sharing the similar concept of Schlieren imaging, Q-ATOM accesses multiple phase-gradient contrasts of each single phytoplankton, from which the quantitative phase profile is computed. We employ such system to capture, at an imaging line-scan rate of 11.6 MHz, high-resolution images of two phytoplankton populations (scenedesmus and chlamydomonas) in ultrafast microfluidic flow (3 m/s). We further perform quantitative taxonomic screening analysis enabled by this technique. More importantly, the system can also generate quantitative phase images of single phytoplankton. This is especially useful for label-free quantification of biomasses (e.g. lipid droplets) of the particular species of interest - an important task adopted in biofuel applications. Combining machine learning for automated classification, Q-ATOM could be an attractive platform for continuous and real-time ultralarge-scale single-phytoplankton analysis.

  12. Envisioning a Quantitative Studies Center: A Liberal Arts Perspective

    Directory of Open Access Journals (Sweden)

    Gizem Karaali

    2010-01-01

    Full Text Available Several academic institutions are searching for ways to help students develop their quantitative reasoning abilities and become more adept at higher-level tasks that involve quantitative skills. In this note we study the particular way Pomona College has framed this issue within its own context and what it plans to do about it. To this end we describe our efforts as members of a campus-wide committee that was assigned the duty of investigating the feasibility of founding a quantitative studies center on our campus. These efforts involved analysis of data collected through a faculty questionnaire, discipline-specific input obtained from each departmental representative, and a survey of what some of our peer institutions are doing to tackle these issues. In our studies, we identified three critical needs where quantitative support would be most useful in our case: tutoring and mentoring for entry-level courses; support for various specialized and analytic software tools for upper-level courses; and a uniform basic training for student tutors and mentors. We surmise that our challenges can be mitigated effectively via the formation of a well-focused and -planned quantitative studies center. We believe our process, findings and final proposal will be helpful to others who are looking to resolve similar issues on their own campuses.

  13. The new AP Physics exams: Integrating qualitative and quantitative reasoning

    Science.gov (United States)

    Elby, Andrew

    2015-04-01

    When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making--inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation "says" about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.

  14. Data-driven encoding for quantitative genetic trait prediction.

    Science.gov (United States)

    He, Dan; Wang, Zhanyong; Parida, Laxmi

    2015-01-01

    Given a set of biallelic molecular markers, such as SNPs, with genotype values on a collection of plant, animal or human samples, the goal of quantitative genetic trait prediction is to predict the quantitative trait values by simultaneously modeling all marker effects. Quantitative genetic trait prediction is usually represented as linear regression models which require quantitative encodings for the genotypes: the three distinct genotype values, corresponding to one heterozygous and two homozygous alleles, are usually coded as integers, and manipulated algebraically in the model. Further, epistasis between multiple markers is modeled as multiplication between the markers: it is unclear whether the regression model remains effective under this. In this work we investigate the effects of encodings on the quantitative genetic trait prediction problem. We first showed that different encodings lead to different prediction accuracies, in many test cases. We then proposed a data-driven encoding strategy, where we encode the genotypes according to their distribution in the phenotypes and we allow each marker to have different encodings. We show in our experiments that this encoding strategy is able to improve the performance of the genetic trait prediction method and it is more helpful for the oligogenic traits, whose values rely on a relatively small set of markers. To the best of our knowledge, this is the first paper that discusses the effects of encodings on the genetic trait prediction problem.
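The idea of a data-driven encoding can be sketched by replacing the fixed 0/1/2 genotype codes with values estimated from the phenotype distribution, per marker. The paper's exact procedure may differ; the version below (mean phenotype per genotype class) is one plausible instance, and the data are hypothetical:

```python
from collections import defaultdict

def data_driven_encoding(genotypes, phenotypes):
    """Encode each genotype class (e.g. 'AA', 'Aa', 'aa') at one marker
    by the mean phenotype of the samples carrying it, instead of using
    the fixed integers 0/1/2. Illustrative sketch only."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for g, y in zip(genotypes, phenotypes):
        sums[g] += y
        counts[g] += 1
    return {g: sums[g] / counts[g] for g in sums}

# Hypothetical single-marker data: genotypes and trait values of 4 samples
enc = data_driven_encoding(['AA', 'Aa', 'aa', 'AA'], [2.0, 5.0, 9.0, 4.0])
```

The encoded values would then replace the raw genotypes as regressors in the linear model, with each marker free to carry its own encoding.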

  15. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    Science.gov (United States)

    Lemou, Mohammed

    2016-09-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results of steady states to evolution systems that essentially preserve the rearrangements and some suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, this inequality yields a quantitative stability result of a large class of steady state solutions to the Vlasov-Poisson systems, and more precisely we derive a quantitative control of the L 1 norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general non linear stability result has been obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context, however the proof relied in a crucial way on compactness arguments which by construction provides no quantitative control of the perturbation. Our functional inequality is also applied to the context of 2D-Euler systems and also provides quantitative stability results of a large class of steady-states to this system in a natural energy space.
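For context, the classical Hardy-Littlewood rearrangement inequality that this functional inequality extends states that, for nonnegative measurable functions vanishing at infinity, with $f^{*}$ and $g^{*}$ their symmetric decreasing rearrangements (standard form, not quoted from the paper):

```latex
\[
\int_{\mathbb{R}^n} f(x)\, g(x)\, dx
\;\le\;
\int_{\mathbb{R}^n} f^{*}(x)\, g^{*}(x)\, dx .
\]
```

The quantitative refinement controls how far from equality one is in terms of how far $f$ is from its rearrangement, which is what yields the $L^1$ control of perturbations described above.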

  16. Dynamic Quantitative T1 Mapping in Orthotopic Brain Tumor Xenografts

    Directory of Open Access Journals (Sweden)

    Kelsey Herrmann

    2016-04-01

    Full Text Available Human brain tumors such as glioblastomas are typically detected using conventional, nonquantitative magnetic resonance imaging (MRI techniques, such as T2-weighted and contrast enhanced T1-weighted MRI. In this manuscript, we tested whether dynamic quantitative T1 mapping by MRI can localize orthotopic glioma tumors in an objective manner. Quantitative T1 mapping was performed by MRI over multiple time points using the conventional contrast agent Optimark. We compared signal differences to determine the gadolinium concentration in tissues over time. The T1 parametric maps made it easy to identify the regions of contrast enhancement and thus tumor location. Doubling the typical human dose of contrast agent resulted in a clearer demarcation of these tumors. Therefore, T1 mapping of brain tumors is gadolinium dose dependent and improves detection of tumors by MRI. The use of T1 maps provides a quantitative means to evaluate tumor detection by gadolinium-based contrast agents over time. This dynamic quantitative T1 mapping technique will also enable future quantitative evaluation of various targeted MRI contrast agents.
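Converting a dynamic T1 map into a gadolinium concentration estimate typically uses the standard linear relaxivity relation 1/T1_post = 1/T1_pre + r1·C. A minimal sketch; the relaxivity value below is a generic placeholder, not a value from this study:

```python
def gd_concentration(t1_pre, t1_post, r1=3.8):
    """Contrast-agent concentration (mM) from T1 before/after injection
    (both in seconds), via the fast-exchange relation
    1/T1_post = 1/T1_pre + r1 * C.
    r1 is the agent's longitudinal relaxivity in 1/(mM*s); 3.8 is only
    a plausible placeholder for a gadolinium chelate at 1.5 T."""
    return (1.0 / t1_post - 1.0 / t1_pre) / r1

# Hypothetical voxel: T1 shortens from 2.0 s to 0.5 s after injection
c = gd_concentration(t1_pre=2.0, t1_post=0.5)
```

Repeating this per voxel and per time point gives the dynamic concentration maps that make tumor enhancement quantitative rather than merely visual.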

  17. Teaching Quantitative Reasoning: A Better Context for Algebra

    Directory of Open Access Journals (Sweden)

    Eric Gaze

    2014-01-01

    Full Text Available This editorial questions the preeminence of algebra in our mathematics curriculum. The GATC (Geometry, Algebra, Trigonometry, Calculus sequence abandons the fundamental middle school math topics necessary for quantitative literacy, while the standard super-abundance of algebra taught in the abstract fosters math phobia and supports a culturally acceptable stance that math is not relevant to everyday life. Although GATC is seen as a pipeline to STEM (Science, Technology, Engineering, Mathematics, it is a mistake to think that the objective of producing quantitatively literate citizens is at odds with creating more scientists and engineers. The goal must be to create a curriculum that addresses the quantitative reasoning needs of all students, providing meaningful engagement in mathematics that will simultaneously develop quantitative literacy and spark an interest in STEM fields. In my view, such a curriculum could be based on a foundation of proportional reasoning leading to higher-order quantitative reasoning via modeling (including algebraic reasoning and problem solving and statistical literacy (through the exploration and study of data.

  18. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    Science.gov (United States)

    Lemou, Mohammed

    2016-12-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results for steady states of evolution systems that essentially preserve the rearrangements and some suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, this inequality yields a quantitative stability result for a large class of steady-state solutions to the Vlasov-Poisson systems; more precisely, we derive a quantitative control of the L1 norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general nonlinear stability result was obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context; however, the proof relied in a crucial way on compactness arguments, which by construction provide no quantitative control of the perturbation. Our functional inequality is also applied to the context of 2D Euler systems, where it likewise provides quantitative stability results for a large class of steady states in a natural energy space.
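As background for the generalized inequality, the classical Hardy-Littlewood rearrangement inequality, which the paper's result extends, can be stated in its standard form:

```latex
% Classical Hardy--Littlewood rearrangement inequality, where f^* denotes
% the symmetric decreasing rearrangement of a nonnegative function f
% (the paper proves a generalized version of this type of estimate):
\int_{\mathbb{R}^d} f(x)\, g(x)\, dx
  \;\le\;
\int_{\mathbb{R}^d} f^{*}(x)\, g^{*}(x)\, dx
```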

  19. [Bibliometric analysis of bacterial quantitative proteomics in English literatures].

    Science.gov (United States)

    Zhang, Xin; She, Danyang; Liu, Youning; Wang, Rui; Di, Xiuzhen; Liang, Beibei; Wang, Yue

    2014-07-01

    To analyze the worldwide advances in bacterial quantitative proteomics over the past fifteen years using a bibliometric approach. Literature retrieval was conducted in the Pubmed, Embase and Science Citation Index (SCI) databases, using "bacterium" and "quantitative proteomics" as the key words, with a search deadline of July 2013. We sorted and analyzed these articles with EndNote X6 by published year, first author, journal name, publishing institution, citation frequency and publication type. A total of 932 English articles were included in our research after deleting duplicates. The first article on bacterial quantitative proteomics was reported in 1999, and the annual output peaked at 163 articles in 2012. Up to July 2013, authors from more than 23 countries and regions had published articles in this field; China ranks fourth. The main publication type is original articles. The most frequently cited article is "Absolute quantification of proteins by LCMSE: a virtue of parallel MS acquisition" by Silva JC, Gorenstein MV, Li GZ, et al. in Mol Cell Proteomics 2006. The most productive author is Smith RD from the Biological Sciences Division, Pacific Northwest National Laboratory. The top journal publishing bacterial quantitative proteomics is Proteomics. More and more researchers are paying attention to quantitative proteomics, which will be widely used in bacteriology.

  20. Complex genetic interactions in a quantitative trait locus.

    Directory of Open Access Journals (Sweden)

    Himanshu Sinha

    2006-02-01

    Full Text Available Whether in natural populations or between two unrelated members of a species, most phenotypic variation is quantitative. To analyze such quantitative traits, one must first map the underlying quantitative trait loci. Next, and far more difficult, one must identify the quantitative trait genes (QTGs), characterize QTG interactions, and identify the phenotypically relevant polymorphisms to determine how QTGs contribute to phenotype. In this work, we analyzed three Saccharomyces cerevisiae high-temperature growth (Htg) QTGs (MKT1, END3, and RHO2). We observed a high level of genetic interactions among QTGs and strain background. Interestingly, while the MKT1 and END3 coding polymorphisms contribute to phenotype, it is the RHO2 3'UTR polymorphisms that are phenotypically relevant. Reciprocal hemizygosity analysis of the Htg QTGs in hybrids between S288c and ten unrelated S. cerevisiae strains reveals that the contributions of the Htg QTGs are not conserved in nine other hybrids, which has implications for QTG identification by marker-trait association. Our findings demonstrate the variety and complexity of QTG contributions to phenotype, the impact of genetic background, and the value of quantitative genetic studies in S. cerevisiae.

  1. Parts of the Whole: Quantitative Literacy on a Desert Island

    Directory of Open Access Journals (Sweden)

    Dorothy Wallace

    2015-07-01

    Full Text Available Some of the specific institutional problems faced by quantitative reasoning courses, programs and requirements arise from the fragile intellectual position of “quantitative reasoning” as an idea, or meme. The process of isolation and reintroduction explains both the proliferation of living species and the way in which some difficult ideas take their place in a culture. Using evolutionary explanations as metaphor and the Copernican revolution as an example of a difficult idea, we draw lessons that can be applied to the “quantitative reasoning” meme, including the function of the National Numeracy Network as an island of protected discourse favoring the growth of the QR meme. We conclude that the mission of the National Numeracy Network should focus on attributes of that island, and in particular extend the mission beyond being a network, to being an actual community.

  2. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    Science.gov (United States)

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations and real-data examples from disease profiling.
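The replicate-number calculation mentioned at the end can be illustrated with the standard normal-approximation power formula; this is a generic sketch, not the article's specific procedure:

```python
# Sketch: number of biological replicates per group needed to detect a
# mean difference `delta` with two-sided significance `alpha` and power
# `power`, using the standard normal-approximation formula
#   n = 2 * (z_{1-alpha/2} + z_{1-power'})^2 * (sigma / delta)^2,
# where power' = 1 - beta. Generic power calculation for illustration only.
from math import ceil
from statistics import NormalDist

def replicates_per_group(delta: float, sigma: float,
                         alpha: float = 0.05, power: float = 0.8) -> int:
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    n = 2.0 * (z(1 - alpha / 2) + z(power)) ** 2 * (sigma / delta) ** 2
    return ceil(n)

# Detect a 1-unit shift (e.g. log2 fold-change) with between-replicate SD 1.2:
n = replicates_per_group(delta=1.0, sigma=1.2)  # -> 23 replicates per group
```

Exact t-based calculations give slightly larger n at small sample sizes; the normal approximation shown here is the usual first pass.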

  3. A Quantitative Evaluation Method for Transportation Network Efficiency

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-02-01

    Full Text Available The efficiency of a transportation network is a comprehensive performance index of the network. In order to evaluate the operation of a transportation network objectively, a scientific quantitative evaluation method for network efficiency is very important. In this study, a new quantitative evaluation method for transportation network efficiency is developed, which can evaluate transportation network performance comprehensively and reasonably. The method is defined in the context of network equilibrium, so it reflects the influences of travel behavior, travel cost, travel demands and link flows, all in the equilibrium state, on network efficiency. The computational results are compared with those of a previously proposed method in a numerical example, which shows that the new method can quantitatively reflect the influence on transportation network efficiency of traffic flows, user behavior and network structure, in accordance with the practical situation.
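One well-known equilibrium-based efficiency measure of this kind is the Nagurney-Qiang index, which averages demand per unit equilibrium travel cost over all origin-destination (OD) pairs. A minimal sketch of that style of measure (the article's own formula is not reproduced here; the numbers are illustrative):

```python
# Sketch of a Nagurney--Qiang-style network efficiency measure:
#   E = (1/|W|) * sum over OD pairs w of (d_w / lambda_w),
# where d_w is the travel demand of OD pair w and lambda_w is its minimal
# travel cost at network equilibrium. Shown as an example of the class of
# "previously proposed" equilibrium measures the article compares against.

def network_efficiency(demands, equilibrium_costs):
    """demands[w] and equilibrium_costs[w] per OD pair w (all costs > 0)."""
    pairs = list(zip(demands, equilibrium_costs))
    return sum(d / c for d, c in pairs) / len(pairs)

# Two OD pairs: 100 trips at equilibrium cost 20, 50 trips at cost 25
e = network_efficiency([100, 50], [20, 25])  # (5.0 + 2.0) / 2 = 3.5
```

Because the costs are equilibrium quantities, the index responds to changes in network structure and user behavior through their effect on the equilibrium flows.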

  4. Quantitative proteomics by amino acid labeling in C. elegans

    DEFF Research Database (Denmark)

    Fredens, Julius; Engholm-Keller, Kasper; Giessing, Anders;

    2011-01-01

    We demonstrate labeling of Caenorhabditis elegans with heavy isotope-labeled lysine by feeding them with heavy isotope-labeled Escherichia coli. Using heavy isotope-labeled worms and quantitative proteomics methods, we identified several proteins that are regulated in response to loss or RNAi-mediated knockdown of the nuclear hormone receptor 49 in C. elegans. The combined use of quantitative proteomics and selective gene knockdown is a powerful tool for C. elegans biology.
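A SILAC-style quantification step of the kind implied here compares heavy (labeled) and light (unlabeled) peptide intensities and summarizes them per protein. A minimal sketch; the median summarization and the numbers are illustrative assumptions, not the study's actual pipeline:

```python
# Sketch: relative protein quantification from metabolic (heavy-lysine)
# labeling. Each peptide contributes a heavy/light intensity ratio; the
# protein-level regulation ratio is taken here as the median peptide ratio,
# a common SILAC-style summarization shown generically.
from statistics import median

def protein_ratio(peptide_pairs):
    """peptide_pairs: iterable of (heavy_intensity, light_intensity)."""
    return median(h / l for h, l in peptide_pairs)

# Three peptides from one protein (made-up intensities):
r = protein_ratio([(2000, 1000), (1500, 500), (900, 450)])  # median of 2, 3, 2
```

A ratio far from 1 after knockdown versus control flags the protein as regulated, which is the comparison the abstract describes.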

  5. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  6. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods.
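The step of combining the uncertainty elements "mathematically" usually means root-sum-of-squares combination of independent relative components, as in the GUM approach. A minimal sketch with illustrative component values, not the article's data:

```python
# Sketch: combining independent relative uncertainty components in
# quadrature, u_combined = sqrt(sum of u_i^2) -- the standard GUM-style
# rule for uncorrelated components. The component values below are
# illustrative, not taken from the article.
from math import sqrt

def combined_relative_uncertainty(components):
    """components: relative uncertainties as fractions (0.25 = 25%)."""
    return sqrt(sum(u * u for u in components))

# e.g. microorganism type 25%, product effect 15%, reading error 10%
u = combined_relative_uncertainty([0.25, 0.15, 0.10])  # about 0.31, i.e. 31%
```

With components of this magnitude the combined value stays under the 35% ceiling the abstract reports.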

  7. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of a detective game or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  8. Design and Statistics in Quantitative Translation (Process) Research

    DEFF Research Database (Denmark)

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars, since this way of thinking is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product.

  10. Quantitative coherent anti-Stokes Raman scattering (CARS) microscopy.

    Science.gov (United States)

    Day, James P R; Domke, Katrin F; Rago, Gianluca; Kano, Hideaki; Hamaguchi, Hiro-o; Vartiainen, Erik M; Bonn, Mischa

    2011-06-23

    The ability to observe samples qualitatively at the microscopic scale has greatly enhanced our understanding of the physical and biological world throughout the 400 year history of microscopic imaging, but there are relatively few techniques that can truly claim the ability to quantify the local concentration and composition of a sample. We review coherent anti-Stokes Raman scattering (CARS) as a quantitative, chemically specific, and label-free microscopy. We discuss the complicating influence of the nonresonant response on the CARS signal and the various experimental and mathematical approaches that can be adopted to extract quantitative information from CARS. We also review the uses to which CARS has been employed as a quantitative microscopy to solve challenges in material and biological science.
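The quantitative difficulty alluded to above stems from the coherent mixing of the resonant signal with the nonresonant background. In the standard textbook form (assuming, as is conventional, a real nonresonant susceptibility):

```latex
% Measured CARS intensity as the coherent sum of the resonant third-order
% susceptibility and the (real) nonresonant background; the cross term is
% what makes naive intensity-to-concentration mapping fail:
I_{\mathrm{CARS}} \;\propto\; \bigl|\chi^{(3)}_{R} + \chi^{(3)}_{NR}\bigr|^{2}
  \;=\; \bigl|\chi^{(3)}_{R}\bigr|^{2}
  \;+\; 2\,\chi^{(3)}_{NR}\,\operatorname{Re}\chi^{(3)}_{R}
  \;+\; \bigl(\chi^{(3)}_{NR}\bigr)^{2}
```

Since only the resonant part scales linearly with concentration, the extraction schemes the review surveys all amount to isolating χ⁽³⁾_R from this mixture.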

  11. Quantitative aspects of inductively coupled plasma mass spectrometry

    Science.gov (United States)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
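The simplest of the calibration approaches mentioned, external calibration with pure standards, amounts to a linear fit through the standards' signals followed by inversion for the unknown. A minimal sketch with made-up numbers:

```python
# Sketch: external calibration in ICP-MS. Fit a straight line through the
# signals of pure calibration standards, then invert it to back-calculate
# the concentration of an unknown sample. Concentrations and signals below
# are made up for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

std_conc = [0.0, 1.0, 2.0, 5.0]              # standards, micrograms per litre
std_signal = [50.0, 1050.0, 2050.0, 5050.0]  # instrument counts per second
slope, intercept = fit_line(std_conc, std_signal)
unknown = (3050.0 - intercept) / slope       # back-calculated concentration
```

Matrix-matched standards and standard addition follow the same arithmetic but build the calibration in (or onto) the sample matrix to cancel matrix effects.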

  12. The Emergence of Quantitative Sintering Theory from 1945 to 1955

    Science.gov (United States)

    German, Randall M.

    2017-04-01

    Particles flow and pack under stress, allowing shaping of the particles into target engineering geometries. Subsequently, in a process termed sintering, the particles are heated to induce bonding that results in a strong solid. Although first practiced 26,000 years ago, sintering was largely unexplained until recent times. Sintering science moved from an empirical and largely qualitative notion into a quantitative theory over a relatively short time period following World War II. That conceptual transition took place just as commercial applications for sintered materials underwent significant growth. This article highlights the key changes in sintering concepts that occurred in the 1945-1955 time period. This time span starts with the first quantitative neck growth model from Frenkel and ends with the quantitative shrinkage model from Kingery and Berg that includes several transport mechanisms.
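The two models that bracket this period are commonly quoted in the following forms (symbols and prefactors as usually given in the sintering literature; the original papers' notation may differ):

```latex
% Frenkel's viscous-flow neck growth law (x: neck radius, a: particle
% radius, gamma: surface energy, eta: viscosity, t: time) and the
% Kingery--Berg diffusion-controlled shrinkage law:
\left(\frac{x}{a}\right)^{2} \;=\; \frac{3\,\gamma\, t}{2\,\eta\, a},
\qquad
\frac{\Delta L}{L_{0}} \;\propto\; t^{2/5}
```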

  13. New developments in quantitative risk assessment of campylobacteriosis

    DEFF Research Database (Denmark)

    Havelaar, Arie; Nauta, Maarten

    Quantitative microbiological risk assessment (QMRA) is now broadly accepted as an important decision support tool in food safety risk management. It has been used to support decision making at the global level (Codex Alimentarius, FAO and WHO), at the European level (European Food Safety Authority) ... the formulation of quantitative criteria for broilers and other meats. New modelling methods (e.g. Bayesian approaches) are being proposed, but are currently less detailed than conventional Monte Carlo simulation models. There is considerable uncertainty in dose-response relationships and in particular, current ... go undetected after on-farm monitoring. Processing models indicate that defeathering and evisceration are key processes leading to carcass contamination, but there is still considerable uncertainty about the quantitative effects of these processes. In the kitchen, cross-contamination from contaminated ...

  14. Quantitative finance and risk management a physicist's approach

    CERN Document Server

    Dash, Jan W

    2004-01-01

    Written by a physicist with over 15 years of experience as a quant on Wall Street, this book treats a wide variety of topics. Presenting the theory and practice of quantitative finance and risk, it delves into the "how to" and "what it's like" aspects not covered in textbooks or research papers. Both standard and new results are presented. A "Technical Index" indicates the mathematical level, from zero to PhD mathematical background, for each section. The finance aspect in each section is self-contained. Real-life comments on "life as a quant" are included. This book is designed for scientists and engineers desiring to learn quantitative finance, and for quantitative analysts and finance graduate students. Parts will be of interest to research academics.

  15. Quantitative shadowgraphy and proton radiography for large intensity modulations

    CERN Document Server

    Kasim, Muhammad Firmansyah; Ratan, Naren; Sadler, James; Chen, Nicholas; Savert, Alexander; Trines, Raoul; Bingham, Robert; Burrows, Philip N; Kaluza, Malte C; Norreys, Peter

    2016-01-01

    Shadowgraphy is a technique widely used to diagnose objects or systems in various fields of physics and engineering. In shadowgraphy, an optical beam is deflected by the object and the resulting intensity modulation is captured on a screen placed some distance away. However, retrieving quantitative information from the shadowgrams themselves is a challenging task because of the non-linear nature of the process. Here, a novel method to retrieve quantitative information from shadowgrams, based on computational geometry, is presented for the first time. This process can be applied to proton radiography for electric and magnetic field diagnosis in high-energy-density plasmas and has been benchmarked using a toroidal magnetic field as the object, among others. It is shown that the method can accurately retrieve quantitative parameters with error bars less than 10%, even when caustics are present. The method is also shown to be robust enough to process real experimental results with simple pre- and post-processing techniques.

  16. Prognostic, quantitative histopathologic variables in lobular carcinoma of the breast

    DEFF Research Database (Denmark)

    Ladekarl, M; Sørensen, Flemming Brandt

    1993-01-01

    BACKGROUND: A retrospective investigation of 53 consecutively treated patients with operable lobular carcinoma of the breast, with a median follow-up of 6.6 years, was performed to examine the prognostic value of quantitative histopathologic parameters. METHODS: The measurements were performed in routinely processed histologic sections using a simple, unbiased technique for the estimation of the three-dimensional mean nuclear volume (vv(nuc)). In addition, quantitative estimates were obtained of the mitotic index (MI), the nuclear index (NI), the nuclear volume fraction (Vv(nuc/tis)), and the mean ... of disease, vv(nuc), MI, and NI were of significant independent prognostic value. On the basis of the multivariate analyses, a prognostic index with highly distinguishing capacity between prognostically poor and favorable cases was constructed. CONCLUSION: Quantitative histopathologic variables are of value ...

  17. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    Science.gov (United States)

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  18. A Quantitative Assessment Approach to COTS Component Security

    Directory of Open Access Journals (Sweden)

    Jinfu Chen

    2013-01-01

    Full Text Available The vulnerability of software components hinders the development of component technology. An effective approach to assessing component security levels can promote the development of component technology. Thus, the current paper proposes a quantitative assessment approach to COTS (commercial off-the-shelf) component security. The steps of interface fault injection and the assessment framework are given based on the internal factors of the tested component. The quantitative assessment algorithm and formula for the component security level are also presented. The experimental results show that the approach not only detects component security vulnerabilities effectively but also quantitatively assesses the component security level. The score of component security can be accurately calculated, which represents the security level of the tested component.

  19. Quantitative proteomics to study carbapenem resistance in Acinetobacter baumannii.

    Science.gov (United States)

    Tiwari, Vishvanath; Tiwari, Monalisa

    2014-01-01

    Acinetobacter baumannii is an opportunistic pathogen causing pneumonia, respiratory infections and urinary tract infections. The prevalence of this lethal pathogen increases gradually in the clinical setup, where it can grow on artificial surfaces and utilize ethanol as a carbon source; moreover, it resists desiccation. Carbapenems, a class of β-lactams, are the most commonly prescribed drugs against A. baumannii. Resistance against carbapenem has emerged in Acinetobacter baumannii, which can create significant health problems and is responsible for high morbidity and mortality. With the development of quantitative proteomics, considerable progress has been made in the study of carbapenem resistance of Acinetobacter baumannii. Recent updates show that quantitative proteomics has now emerged as an important tool to understand the carbapenem resistance mechanism in Acinetobacter baumannii. The present review also highlights the complementary nature of the different quantitative proteomic methods used to study carbapenem resistance and suggests combining multiple proteomic methods to understand the response of Acinetobacter baumannii to antibiotics.

  20. Quantitative Proteomics to study Carbapenem Resistance in Acinetobacter baumannii

    Directory of Open Access Journals (Sweden)

    Vishvanath Tiwari

    2014-09-01

    Full Text Available Acinetobacter baumannii is an opportunistic pathogen causing pneumonia, respiratory infections and urinary tract infections. The prevalence of this lethal pathogen increases gradually in the clinical setup, where it can grow on artificial surfaces and utilize ethanol as a carbon source; moreover, it resists desiccation. Carbapenems, a class of β-lactams, are the most commonly prescribed drugs against A. baumannii. Resistance against carbapenem has emerged in Acinetobacter baumannii, which can create significant health problems and is responsible for high morbidity and mortality. With the development of quantitative proteomics, considerable progress has been made in the study of carbapenem resistance of Acinetobacter baumannii. Recent updates show that quantitative proteomics has now emerged as an important tool to understand the carbapenem resistance mechanism in Acinetobacter baumannii. The present review also highlights the complementary nature of the different quantitative proteomic methods used to study carbapenem resistance and suggests combining multiple proteomic methods to understand the response of Acinetobacter baumannii to antibiotics.