WorldWideScience

Sample records for model quantitative analysis

  1. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    The majority of modern software and hardware systems are reactive systems, where input provided by the user (possibly another system) and the output of the system are exchanged continuously throughout the (possibly) indefinite execution of the system. Natural examples include control systems, mobile… by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation… in terms of a new mathematical basis for systems modeling which can encompass behavioural properties as well as environmental constraints. They continue by pointing out that continuous performance and robustness measures are paramount when dealing with physical resource levels such as clock frequency, energy consumption, latency, mean-time to failure, and cost. For systems integrated in mass-market products, the ability to quantify trade-offs between performance and robustness, under given technical and economic constraints, is of strategic importance.

  2. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. It is also necessary that the description methods used do not require profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are in turn represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial-strength case study.

  3. Frequency-Domain Response Analysis for Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Schulthess, Pascal; Post, Teun M; Yates, James; van der Graaf, Piet H

    2017-11-28

    Drug dosing regimen can significantly impact drug effect and, thus, the success of treatments. Nevertheless, trial and error is still the most commonly used method by conventional pharmacometric approaches to optimize dosing regimen. In this tutorial, we utilize four distinct classes of quantitative systems pharmacology models to introduce frequency-domain response analysis, a method widely used in electrical and control engineering that allows the analytical optimization of drug treatment regimen from the dynamics of the model. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
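
The core idea of frequency-domain response analysis can be illustrated on any linear(ized) dose-response system. A minimal sketch, assuming a hypothetical one-compartment model with made-up rate constants (not one of the tutorial's four model classes):

```python
# Hypothetical one-compartment model (made-up rates), linearized to a
# transfer function from dosing-rate input to concentration:
#   C(s)/D(s) = k_a / ((s + k_a)(s + k_e))
import numpy as np
from scipy import signal

k_a, k_e = 1.2, 0.3                          # assumed rates (1/h)
sys = signal.TransferFunction([k_a], np.polymul([1.0, k_a], [1.0, k_e]))

w = np.logspace(-3, 1, 200)                  # angular frequencies (rad/h)
w, mag, phase = signal.bode(sys, w)          # gain (dB) and phase (deg)

# Dosing rhythms well above the cutoff are smoothed away by the system;
# rhythms below it pass through to the effect site.
cutoff = w[np.argmax(mag < mag[0] - 3.0)]    # crude -3 dB point
print(f"-3 dB cutoff ~ {cutoff:.3f} rad/h")
```

The gain curve is what makes a dosing regimen analytically optimizable: it tells which dosing frequencies the system transmits to the effect and which it averages out.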

  4. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.
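
As an illustration of the "requirements as constraints and preferences" plus "design by shopping" themes, a toy sketch (all names and numbers are invented, not from the prototype described above):

```python
# Filter candidate designs by hard requirements, then expose the Pareto
# front over the preference metrics: the "shopping shelf" of rational choices.
from dataclasses import dataclass

@dataclass
class Design:
    name: str
    mass_kg: float      # preference: minimize
    cost_musd: float    # preference: minimize
    data_rate: float    # hard requirement: >= 10 Mbps

candidates = [
    Design("A", 120, 3.1, 12), Design("B", 95, 4.0, 11),
    Design("C", 140, 2.5, 14), Design("D", 100, 2.9, 8),
]

feasible = [d for d in candidates if d.data_rate >= 10]

def dominates(a, b):
    """a dominates b if no worse on every objective and better on one."""
    return (a.mass_kg <= b.mass_kg and a.cost_musd <= b.cost_musd
            and (a.mass_kg < b.mass_kg or a.cost_musd < b.cost_musd))

pareto = [d for d in feasible
          if not any(dominates(o, d) for o in feasible if o is not d)]
print([d.name for d in pareto])
```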

  5. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s⁻¹ air velocity. The maximum power is 3.4 W, the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)

  6. Quantitative analysis of a wind energy conversion model

    Science.gov (United States)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-03-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s⁻¹ air velocity. The maximum power is 3.4 W, the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively.
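
Records 5 and 6 quote the same figures, which can be cross-checked against the standard wind-power relation P = c_p · ½ρAv³. A quick verification, assuming a standard air density of 1.2 kg/m³ (not stated in the abstracts):

```python
# Consistency check of the quoted figures:
# electrical power P = c_p * (1/2) * rho * A * v^3.
import math

rho = 1.2                 # air density (kg/m^3), assumed
r = 0.06                  # rotor radius (m): 12 cm diameter
v = 15.0                  # air velocity (m/s)
A = math.pi * r**2        # swept area (m^2)

p_wind = 0.5 * rho * A * v**3      # kinetic power in the air stream
c_p = 3.4 / p_wind                 # using the reported 3.4 W output
print(f"P_wind = {p_wind:.1f} W, c_p = {c_p:.2f}")   # ~22.9 W, ~0.15
```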

  7. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.

  8. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    … enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  9. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three …

  10. Quantitative analysis of crossflow model of the COBRA-IV.1 code

    International Nuclear Information System (INIS)

    Lira, C.A.B.O.

    1983-01-01

    Based on experimental data from a rod bundle test section, the crossflow model of the COBRA-IV.1 code was quantitatively analysed. The analysis showed that it is possible to establish some operational conditions in which the results of the theoretical model are acceptable. (author) [pt]

  11. Development of probabilistic models for quantitative pathway analysis of plant pests introduction for the EU territory

    NARCIS (Netherlands)

    Douma, J.C.; Robinet, C.; Hemerik, L.; Mourits, M.C.M.; Roques, A.; Werf, van der W.

    2015-01-01

    The aim of this report is to provide EFSA with probabilistic models for quantitative pathway analysis of plant pest introduction for the EU territory through non-edible plant products or plants. We first provide a conceptualization of two types of pathway models. The individual-based PM simulates an …

  12. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts of matter and the meaning of analytical chemistry, together with SI units; chemical equilibrium; basic preparation for quantitative analysis; an introduction to volumetric analysis; an outline of acid-base titration with example experiments; chelate titration; oxidation-reduction titration, with an introduction, titration curves, and diazotization titration; precipitation titration; electrometric titration; and quantitative analysis.

  13. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  14. Study on quantitative reliability analysis by multilevel flow models for nuclear power plants

    International Nuclear Information System (INIS)

    Yang Ming; Zhang Zhijian

    2011-01-01

    Multilevel Flow Models (MFM) is a goal-oriented system modeling method. MFM explicitly describes how a system performs the required functions under stated conditions for a stated period of time. This paper presents a novel system reliability analysis method based on MFM (MRA). The proposed method allows describing the system knowledge at different levels of abstraction, which makes the reliability model easy to understand, establish, modify and extend. The success probabilities of all main goals and sub-goals can be obtained by a single quantitative analysis. The proposed method is suitable for system analysis and scheme comparison for complex industrial systems such as nuclear power plants. (authors)
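
The one-pass propagation of goal success probabilities can be sketched as follows; the goal tree, probabilities and independence assumption are illustrative, not from the paper:

```python
# Bottom-up propagation of success probabilities through an AND/OR goal
# tree, yielding every goal and sub-goal probability in one pass.
# Sub-goal independence is assumed.
import math

tree = {
    "maintain_cooling": ("AND", ["primary_flow", "heat_sink"]),
    "primary_flow":     ("OR",  ["pump_A", "pump_B"]),   # redundant pumps
    "heat_sink":        ("AND", ["condenser", "service_water"]),
}
leaf_p = {"pump_A": 0.95, "pump_B": 0.95,
          "condenser": 0.99, "service_water": 0.98}

def success(goal):
    if goal in leaf_p:
        return leaf_p[goal]
    op, subs = tree[goal]
    ps = [success(s) for s in subs]
    if op == "AND":                            # all sub-goals must succeed
        return math.prod(ps)
    return 1 - math.prod(1 - p for p in ps)    # OR: at least one succeeds

for g in tree:
    print(g, round(success(g), 4))
```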

  15. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    …operation will be changed by various parameters of DERs. This article proposes a modelling framework for an overview analysis of the correlation between DERs. Furthermore, to validate the framework, the authors describe the reference models of different categories of DERs with their unique characteristics, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation…

  16. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains. The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.

  17. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Maurice H. ter Beek

    2015-04-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.
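
Records 16 and 17 describe the same study. The statistical-model-checking idea behind it, estimating the probability of a behaviour by sampling runs of a DTMC rather than solving it exactly, can be sketched generically (toy chain and property; this is not PFLan or MultiVeStA):

```python
# Estimate P(product malfunctions within 20 steps) of a small, invented
# discrete-time Markov chain by plain Monte Carlo sampling.
import random

P = {                       # state -> [(next_state, probability), ...]
    "ok":       [("ok", 0.90), ("degraded", 0.09), ("failed", 0.01)],
    "degraded": [("ok", 0.30), ("degraded", 0.60), ("failed", 0.10)],
    "failed":   [("failed", 1.0)],
}

def step(state):
    r, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return state

def fails_within(k):
    s = "ok"
    for _ in range(k):
        s = step(s)
        if s == "failed":
            return True
    return False

n = 100_000
est = sum(fails_within(20) for _ in range(n)) / n
print(f"P(malfunction within 20 steps) ~ {est:.4f}")
```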

  18. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and behaviour converge seamlessly in a semantics based on DTMCs, thus enabling quantitative analyses ranging from the likelihood of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.

  19. [Study on temperature correctional models of quantitative analysis with near infrared spectroscopy].

    Science.gov (United States)

    Zhang, Jun; Chen, Hua-cai; Chen, Xing-dan

    2005-06-01

    The effect of environment temperature on quantitative analysis by near infrared spectroscopy was studied. The temperature correction model was calibrated with 45 wheat samples at different environment temperatures, with the temperature as an external variable. The constant temperature model was calibrated with 45 wheat samples at the same temperature. The predicted results of the two models for the protein contents of wheat samples at different temperatures were compared. The results showed that the mean standard error of prediction (SEP) of the temperature correction model was 0.333, but the SEP of the constant temperature (22 degrees C) model increased as the temperature difference enlarged, up to 0.602 when using this model at 4 degrees C. It was suggested that the temperature correction model improves the analysis precision.
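
The modelling idea, treating temperature as an external variable alongside the spectra, can be sketched with synthetic data (the paper's wheat spectra and exact chemometric method are not reproduced here):

```python
# Compare a temperature-blind calibration with one that includes sample
# temperature as an extra predictor, using SEP on held-out samples.
# All data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n, n_wl = 90, 120
spectra = rng.normal(size=(n, n_wl))
temp = rng.uniform(4, 30, size=n)                 # degrees C
protein = (spectra[:, :10].sum(axis=1)            # "informative" wavelengths
           + 0.05 * (temp - 22)                   # temperature sensitivity
           + rng.normal(scale=0.1, size=n))

def sep(X, y):
    tr, te = slice(0, 60), slice(60, None)
    m = PLSRegression(n_components=8).fit(X[tr], y[tr])
    resid = y[te] - m.predict(X[te]).ravel()
    return np.sqrt(np.mean(resid**2))

print("constant-temperature model SEP:", round(sep(spectra, protein), 3))
X_corr = np.column_stack([spectra, temp])         # temperature as variable
print("temperature-corrected SEP:    ", round(sep(X_corr, protein), 3))
```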

  20. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
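
For context, the sample counts that make parallelization worthwhile follow from standard concentration bounds. A Hoeffding-style bound of the kind used in approximate probabilistic model checking (VeStA's own stopping rules may differ):

```latex
% With N i.i.d. simulation runs, the estimate \hat{p} of a property's
% probability p satisfies
P\bigl(|\hat{p} - p| \ge \varepsilon\bigr) \le 2e^{-2N\varepsilon^{2}}
\quad\Longrightarrow\quad
N \ge \frac{\ln(2/\delta)}{2\varepsilon^{2}}
% for accuracy \varepsilon and confidence 1-\delta.
% Example: \varepsilon = 0.01, \delta = 0.05 gives N \ge 18\,445 runs,
% all independent, hence embarrassingly parallel.
```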

  1. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.

  2. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Not only the nature of these interactions, but more fundamentally also the structure of these heterogeneous polyphenolic molecules are not completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins on the basis of tannin model compounds employing an in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundations for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  3. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    SDN-based controllers, which are responsible for the configuration and management of the network, are the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same controller but also different kinds of controllers, providing a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The tests, whose outcomes are similar to those of traditional qualitative analysis, demonstrate that our approach yields specific security values for different controllers and produces more accurate results.

  4. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    Science.gov (United States)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, the ability to quantitatively communicate this information is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
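
Of the two outline descriptors named above, the Zahn and Roskies tangent-angle function is the simpler to sketch. A minimal implementation, assuming a closed outline sampled as a polygon (synthetic square here, craterform rims in practice):

```python
# Zahn-Roskies tangent-angle shape function for a closed polygonal outline:
# cumulative turning angle minus the linear 2*pi*s/L trend of a circle.
# A circle gives theta ~ 0 everywhere; corners show up as sawteeth.
import numpy as np

def zr_shape_function(pts):
    """pts: (N, 2) vertices of a closed outline (last point != first)."""
    d = np.diff(np.vstack([pts, pts[:1]]), axis=0)   # edge vectors
    seg = np.hypot(d[:, 0], d[:, 1])                 # edge lengths
    s = np.cumsum(seg) - seg                         # arclength at edge starts
    L = seg.sum()
    phi = np.unwrap(np.arctan2(d[:, 1], d[:, 0]))    # tangent directions
    return s / L, phi - phi[0] - 2.0 * np.pi * s / L

# Synthetic square outline (counter-clockwise, no duplicate corner points).
edge = np.linspace(-1, 1, 25, endpoint=False)
one = np.ones_like(edge)
square = np.vstack([
    np.column_stack([one, edge]),        # right side, going up
    np.column_stack([-edge, one]),       # top, going left
    np.column_stack([-one, -edge]),      # left side, going down
    np.column_stack([edge, -one]),       # bottom, going right
])
s_norm, theta = zr_shape_function(square)
print(theta.min(), theta.max())          # nonzero: a square is not a circle
```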

  5. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. In order to present the effectiveness of the methods, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are used for building the component prediction model. Experimental results verify that the proposed method has the ability to predict more accurately and robustly as a practical spectral analysis tool.

  6. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.

  7. Quantitative Analysis of Situation Awareness (QASA): modelling and measuring situation awareness using signal detection theory.

    Science.gov (United States)

    Edgar, Graham K; Catherwood, Di; Baker, Steven; Sallis, Geoff; Bertels, Michael; Edgar, Helen E; Nikolla, Dritan; Buckle, Susanna; Goodwin, Charlotte; Whelan, Allana

    2017-12-29

    This paper presents a model of situation awareness (SA) that emphasises that SA is necessarily built using a subset of available information. A technique (Quantitative Analysis of Situation Awareness - QASA), based around signal detection theory, has been developed from this model that provides separate measures of actual SA (ASA) and perceived SA (PSA), together with a feature unique to QASA, a measure of bias (information acceptance). These measures allow the exploration of the relationship between actual SA, perceived SA and information acceptance. QASA can also be used for the measurement of dynamic ASA, PSA and bias. Example studies are presented and full details of the implementation of the QASA technique are provided. Practitioner Summary: This paper presents a new model of situation awareness (SA) together with an associated tool (Quantitative Analysis of Situation Awareness - QASA) that employs signal detection theory to measure several aspects of SA, including actual and perceived SA and information acceptance. Full details are given of the implementation of the tool.
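
QASA's measures are built on signal detection theory; the standard quantities involved can be computed as below (the exact QASA scoring of ASA, PSA and bias may differ from this textbook form; the counts are invented):

```python
# Textbook signal-detection measures: sensitivity d' (discrimination, in
# the spirit of "actual SA") and criterion c (bias / information acceptance)
# from hit and false-alarm rates against ground truth.
from scipy.stats import norm

hits, misses = 42, 8            # true events accepted / rejected (example)
fas, crs = 12, 38               # false events accepted / rejected

hit_rate = hits / (hits + misses)
fa_rate = fas / (fas + crs)

d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)          # sensitivity
bias_c = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))  # criterion

print(f"d' = {d_prime:.2f}, criterion c = {bias_c:.2f}")
```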

  8. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  9. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and of the behavior of the Mobile Node (MN) is used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can be used to model not only the LTRACK algorithm but other algorithms too. Many discussions and calculations support our mathematical model and show that it is adequate in many cases. The model is valid on various network levels, scalable vertically in the ISO-OSI layers, and also scales well with the number of network elements.
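
The Markov-chain machinery underlying such an analysis can be sketched generically; the three states, transition matrix and cost note below are invented stand-ins, not the LTRACK chain:

```python
# Stationary distribution of a toy 3-state handover model, via the left
# eigenvector of the transition matrix for eigenvalue 1.
import numpy as np

P = np.array([[0.80, 0.15, 0.05],    # hypothetical states: attached,
              [0.30, 0.60, 0.10],    # tracking, updating
              [0.50, 0.10, 0.40]])   # rows sum to 1

vals, vecs = np.linalg.eig(P.T)      # left eigenvectors of P
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()                   # normalize to a distribution
print("stationary distribution:", np.round(pi, 4))
# Expected signalling cost per step = sum_i pi_i * cost_i for given costs.
```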

  10. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    Science.gov (United States)

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and the SNMMI/CTN oncology phantom. The algorithm was designed to utilize only the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at a 9.7:1 activity ratio over background, and CTN phantoms were filled at 4:1 and 2:1 activity ratios over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom …

  11. Economic analysis of light brown apple moth using GIS and quantitative modeling

    Science.gov (United States)

    Glenn Fowler; Lynn Garrett; Alison Neeley; Roger Magarey; Dan Borchert; Brian. Spears

    2011-01-01

    We conducted an economic analysis of the light brown apple moth (LBAM) (Epiphyas postvittana (Walker)), whose presence in California has resulted in a regulatory program. Our objective was to quantitatively characterize the economic costs to apple, grape, orange, and pear crops that would result from LBAM's introduction into the continental …

  12. Quantitative microleakage analysis of endodontic temporary filling materials using a glucose penetration model.

    Science.gov (United States)

    Kim, Sin-Young; Ahn, Jin-Soo; Yi, Young-Ah; Lee, Yoon; Hwang, Ji-Yun; Seo, Deog-Gyu

    2015-02-01

    The purpose of this study was to analyze the sealing ability of different temporary endodontic materials over a 6-week period using a glucose penetration model. Standardized holes were formed in 48 dentin discs from human premolars. The thicknesses of the specimens were distributed evenly at 2 mm, 3 mm and 4 mm. Prepared dentin specimens were randomly assigned to six groups (n = 7) and the holes in the dentin specimens were filled with two kinds of temporary filling materials as per the manufacturers' instructions as follows: Caviton (GC Corporation, Tokyo, Japan) 2 mm, 3 mm, 4 mm and IRM (Dentsply International Inc., Milford, DE) 2 mm, 3 mm, 4 mm. The remaining specimens were used as positive and negative controls and all specimens underwent thermocycling (1,000 cycles; 5-55°C). The sealing ability of all samples was evaluated using the leakage model for glucose. The samples were analyzed by a spectrophotometer in a quantitative glucose microleakage test over a period of 6 weeks. As a statistical inference, a mixed effect analysis was applied to analyze serial measurements over time. The Caviton groups showed less glucose penetration in comparison with the IRM groups. The Caviton 4 mm group demonstrated relatively low glucose leakage over the test period. High glucose leakage was detected throughout the test period in all IRM groups. The glucose leakage level increased after 1 week in the Caviton 2 mm group and after 4 weeks in the Caviton 3 mm and 4 mm groups (p < 0.05). Caviton showed better sealing ability than IRM in the glucose penetration model over the 6 weeks. Temporary filling with Caviton to at least 3 mm in thickness is necessary, and temporary filling periods should not exceed 4 weeks.

  13. Statistical Modeling Approach to Quantitative Analysis of Interobserver Variability in Breast Contouring

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jinzhong, E-mail: jyang4@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A.; Reed, Valerie K.; Strom, Eric A.; Perkins, George H.; Tereffe, Welela; Buchholz, Thomas A. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhang, Lifei; Balter, Peter; Court, Laurence E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dong, Lei [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Scripps Proton Therapy Center, San Diego, California (United States)

    2014-05-01

    Purpose: To develop a new approach for interobserver variability analysis. Methods and Materials: Eight radiation oncologists specializing in breast cancer radiation therapy delineated a patient's left breast “from scratch” and from a template that was generated using deformable image registration. Three of the radiation oncologists had previously received training in the Radiation Therapy Oncology Group (RTOG) consensus contouring atlas for breast cancer. The simultaneous truth and performance level estimation algorithm was applied to the 8 contours delineated “from scratch” to produce a group consensus contour. Individual Jaccard scores were fitted to a beta distribution model. We also applied this analysis to 2 additional patients, which were contoured by 9 breast radiation oncologists from 8 institutions. Results: The beta distribution model had a mean of 86.2%, standard deviation (SD) of ±5.9%, a skewness of −0.7, and excess kurtosis of 0.55, exemplifying broad interobserver variability. The 3 RTOG-trained physicians had higher agreement scores than average, indicating that their contours were close to the group consensus contour. One physician had high sensitivity but lower specificity than the others, which implies that this physician tended to contour a structure larger than those of the others. Two other physicians had low sensitivity but specificity similar to the others, which implies that they tended to contour a structure smaller than the others. With this information, they could adjust their contouring practice to be more consistent with others if desired. When contouring from the template, the beta distribution model had a mean of 92.3%, SD ± 3.4%, skewness of −0.79, and excess kurtosis of 0.83, which indicated a much better consistency among individual contours. Similar results were obtained for the analysis of 2 additional patients. Conclusions: The proposed statistical approach was able to measure interobserver variability quantitatively.
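
The paper's central statistical step, fitting a beta distribution to agreement scores, is easy to reproduce in outline (the scores below are invented):

```python
# Fit a beta distribution to Jaccard agreement scores and report
# mean / SD / skewness / excess kurtosis, as in the abstract above.
import numpy as np
from scipy import stats

jaccard = np.array([0.91, 0.88, 0.79, 0.86, 0.90, 0.83, 0.87, 0.85])

a, b, loc, scale = stats.beta.fit(jaccard, floc=0, fscale=1)  # support [0, 1]
mean, var, skew, kurt = stats.beta.stats(a, b, moments='mvsk')
print(f"alpha={a:.1f}, beta={b:.1f}")
print(f"mean={mean:.3f}, SD={np.sqrt(var):.3f}, "
      f"skew={skew:.2f}, excess kurtosis={kurt:.2f}")
```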

  14. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    Science.gov (United States)

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships. Jie Liu, Richard Judson, Matthew T. Martin, Huixiao Hong, Imran Shah. National Center for Computational Toxicology (NCCT), US EPA, RTP, NC…

  15. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    Science.gov (United States)

    2014-06-01

    …shuttle catastrophe (Rogers 1986). Existing analysis supported that the launch conditions (specifically ambient temperature) should have resulted… …or software package. For example, an MBSE effort towards building a CubeSat (a small satellite) integrated a variety of tools ranging from Microsoft…

  16. A quantitative analysis of instabilities in the linear chiral sigma model

    International Nuclear Information System (INIS)

    Nemes, M.C.; Nielsen, M.; Oliveira, M.M. de; Providencia, J. da

    1990-08-01

    We present a method to construct a complete set of stationary states corresponding to small amplitude motion which naturally includes the continuum solution. The energy-weighted sum rule (EWSR) is shown to provide a quantitative criterion on the importance of instabilities which are known to occur in non-asymptotically free theories. Our results for the linear σ model should be valid for a large class of models. A unified description of baryon and meson properties in terms of the linear σ model is also given. (author)
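
For context, the EWSR referred to above is standard; for an excitation operator F acting on the ground state |0> of a Hamiltonian H it reads:

```latex
% Energy-weighted sum rule (standard form):
S_1 \;=\; \sum_n \,(E_n - E_0)\,\bigl|\langle n|F|0\rangle\bigr|^2
\;=\; \tfrac{1}{2}\,\langle 0|\,[F^{\dagger},[H,F]]\,|0\rangle .
```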

  17. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is a significant improvement in modelling urban land use dynamics at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have provided evident proof regarding the quantification of the urban growth pattern within the urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to road, school, hospital, commercial centers and police station are considered the major factors influencing the Land Use Land Cover (LULC) pattern of the city. These factors have a unidirectional approach to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with a generic model known as an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from a field survey, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3×3 simulation window is used to consider the impact on LULC. Cellular automata model results are examined for the identification of hot-spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation between each factor and LULC and to classify the available area into low-density, medium-density, high-density residential or commercial area. In the modelling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of built-up classes. Significant improvement is observed in the built-up classes, from 84% to 89%. After incorporating the agent-based model with the cellular automata model, the accuracy improved further, from 89% to 94%, in three urban classes, i.e. low density, medium density and commercial.
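
A compact sketch of the cellular-automata ingredient described above, a 3×3 neighbourhood rule modulated by a per-cell suitability score standing in for the logistic agent-based factors (grid, scores and rates are invented):

```python
# One CA transition: a cell's chance of converting to built-up grows with
# neighbouring built-up density, weighted by a suitability score.
import numpy as np

rng = np.random.default_rng(7)
grid = (rng.random((50, 50)) < 0.08).astype(int)   # 1 = built-up
suitability = rng.random((50, 50))                 # assumed logistic output

def step(g):
    nb = sum(np.roll(np.roll(g, i, 0), j, 1)
             for i in (-1, 0, 1) for j in (-1, 0, 1)) - g   # 3x3 neighbours
    p = (nb / 8.0) * suitability                   # transition probability
    return np.maximum(g, (rng.random(g.shape) < p).astype(int))

for _ in range(10):
    grid = step(grid)
print("built-up fraction:", grid.mean())
```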

  18. Quantitative analysis of anaerobic oxidation of methane (AOM) in marine sediments: A modeling perspective

    Science.gov (United States)

    Regnier, P.; Dale, A. W.; Arndt, S.; LaRowe, D. E.; Mogollón, J.; Van Cappellen, P.

    2011-05-01

    Recent developments in the quantitative modeling of methane dynamics and anaerobic oxidation of methane (AOM) in marine sediments are critically reviewed. The first part of the review begins with a comparison of alternative kinetic models for AOM. The roles of bioenergetic limitations, intermediate compounds and biomass growth are highlighted. Next, the key transport mechanisms in multi-phase sedimentary environments affecting AOM and methane fluxes are briefly treated, while attention is also given to additional controls on methane and sulfate turnover, including organic matter mineralization, sulfur cycling and methane phase transitions. In the second part of the review, the structure, forcing functions and parameterization of published models of AOM in sediments are analyzed. The six-orders-of-magnitude range in rate constants reported for the widely used bimolecular rate law for AOM emphasizes the limited transferability of this simple kinetic model and, hence, the need for more comprehensive descriptions of the AOM reaction system. The derivation and implementation of more complete reaction models, however, are limited by the availability of observational data. In this context, we attempt to rank the relative benefits of potential experimental measurements that should help to better constrain AOM models. The last part of the review presents a compilation of reported depth-integrated AOM rates (ΣAOM). These rates reveal the extreme variability of ΣAOM in marine sediments. The model results are further used to derive quantitative relationships between ΣAOM and the magnitude of externally impressed fluid flow, as well as between ΣAOM and the depth of the sulfate-methane transition zone (SMTZ). This review contributes to an improved understanding of the global significance of the AOM process, and helps identify outstanding questions and future directions in the modeling of methane cycling and AOM in marine sediments.
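
For reference, the widely used bimolecular rate law discussed in the review, whose rate constant k_AOM is the one reported to span six orders of magnitude across studies:

```latex
% Bimolecular rate law for anaerobic oxidation of methane:
R_{\mathrm{AOM}} \;=\; k_{\mathrm{AOM}}\,[\mathrm{CH_4}]\,[\mathrm{SO_4^{2-}}]
```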

  19. Quantitative global sensitivity analysis of a biologically based dose-response pregnancy model for the thyroid endocrine system.

    Science.gov (United States)

    Lumen, Annie; McNally, Kevin; George, Nysia; Fisher, Jeffrey W; Loizou, George D

    2015-01-01

    A deterministic biologically based dose-response model for the thyroidal system in a near-term pregnant woman and the fetus was recently developed to evaluate quantitatively thyroid hormone perturbations. The current work focuses on conducting a quantitative global sensitivity analysis on this complex model to identify and characterize the sources and contributions of uncertainties in the predicted model output. The workflow and methodologies suitable for computationally expensive models, such as the Morris screening method and Gaussian Emulation processes, were used for the implementation of the global sensitivity analysis. Sensitivity indices, such as main, total and interaction effects, were computed for a screened set of the total thyroidal system descriptive model input parameters. Furthermore, a narrower sub-set of the most influential parameters affecting the model output of maternal thyroid hormone levels were identified in addition to the characterization of their overall and pair-wise parameter interaction quotients. The characteristic trends of influence in model output for each of these individual model input parameters over their plausible ranges were elucidated using Gaussian Emulation processes. Through global sensitivity analysis we have gained a better understanding of the model behavior and performance beyond the domains of observation by the simultaneous variation in model inputs over their range of plausible uncertainties. The sensitivity analysis helped identify parameters that determine the driving mechanisms of the maternal and fetal iodide kinetics, thyroid function and their interactions, and contributed to an improved understanding of the system modeled. We have thus demonstrated the use and application of global sensitivity analysis for a biologically based dose-response model for sensitive life-stages such as pregnancy that provides richer information on the model and the thyroidal system modeled compared to local sensitivity analysis.
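
The screening stage of such a workflow can be illustrated with a simplified one-factor-at-a-time variant of Morris's elementary effects (the paper used the full Morris method and Gaussian emulation on a BBDR model; here a toy function stands in):

```python
# Simplified elementary-effects screening: mu* (mean absolute elementary
# effect) ranks each input's overall influence on the output.
import numpy as np

rng = np.random.default_rng(1)

def model(x):                       # stand-in for the expensive BBDR model
    return x[0] ** 2 + 0.5 * x[1] + 0.01 * x[2] + x[0] * x[1]

k, n_base, delta = 3, 50, 0.1
effects = [[] for _ in range(k)]
for _ in range(n_base):
    x = rng.uniform(0, 1 - delta, size=k)   # random base point in unit cube
    for i in range(k):                      # perturb one factor at a time
        x2 = x.copy()
        x2[i] += delta
        effects[i].append((model(x2) - model(x)) / delta)

for i, ee in enumerate(effects):
    ee = np.abs(ee)
    print(f"x{i}: mu* = {ee.mean():.3f} (sd of |EE| = {ee.std():.3f})")
```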

  20. Quantitative analysis of 'calanchi'

    Science.gov (United States)

    Agnesi, Valerio; Cappadonia, Chiara; Conoscenti, Christian; Costanzo, Dario; Rotigliano, Edoardo

    2010-05-01

    Three years (2006 - 2009) of monitoring data from two calanchi sites located in the western Sicilian Apennines are analyzed and discussed: the data come from two networks of erosion pins and a rainfall gauge station. The aim of the present research is to quantitatively analyze the effects of erosion by water and to investigate their relationships with rainfall trends and specific properties of the two calanchi fronts. Each of the sites was equipped with a grid of randomly distributed erosion pins, made of 41 nodes for the "Catalfimo" site and 13 nodes for the "Ottosalme" site (in light of the general homogeneity of its geomorphologic conditions); the erosion pins consist of 2 cm graded iron stakes, 100 cm long, with a section having a diameter of 1.6 cm. Repeated readings at the erosion pins allowed us to estimate point topographic height variations; a total number of 21 surveys have been made remotely by acquiring high resolution photographs from a fixed view point. Since the two calanchi sites are very close to each other (some hundred meters), a single rainfall gauge station was installed, assuming strict climatic homogeneity of the investigated area. Rainfall data have been processed to derive the rain erosivity index signal, detecting a total number of 27 erosive events. Despite the close distance between the two sites, because of a different geologic setting, the calanchi fronts are characterized by the outcropping of different levels of the same formation (Terravecchia fm., Middle-Late Miocene); as a consequence, the mineralogical, textural and geotechnical (index) properties, as well as the topographic and geomorphologic characteristics, change. Therefore, in order to define the "framework" in which the two erosion pin grids have been installed, 40 samples of rock have been analyzed, and a detailed geomorphologic survey has been carried out; in particular, plasticity index, liquid limit, carbonate, pH, granulometric fractions and their mineralogic …

  1. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories where the geomorphologic processes analyzed have a high degree of occurrence and the associated risk is high, and they represent a useful tool in the process of spatial planning.
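
The USLE component mentioned above combines multiplicative factor rasters. A minimal sketch with toy 2×2 grids (real inputs would come from rainfall, soil, DEM and land-cover layers; the class breaks are invented):

```python
# USLE per raster cell: soil loss A = R * K * LS * C * P, then binned
# into ordinal vulnerability classes.
import numpy as np

R = np.array([[90., 95.], [100., 105.]])    # rainfall erosivity
K = np.array([[0.30, 0.32], [0.28, 0.35]])  # soil erodibility
LS = np.array([[1.2, 2.5], [0.8, 3.1]])     # slope length-steepness
C = np.array([[0.05, 0.20], [0.01, 0.45]])  # cover management
P = np.ones((2, 2))                         # support practice (none)

A = R * K * LS * C * P                      # soil loss, e.g. t/ha/yr
vulnerability = np.digitize(A, bins=[1.0, 5.0, 10.0])  # 4 ordinal classes
print(A.round(2))
print(vulnerability)
```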

  2. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  3. Comparative Analysis of Single-Species and Polybacterial Wound Biofilms Using a Quantitative, In Vivo, Rabbit Ear Model

    Science.gov (United States)

    2012-08-08

    Comparative Analysis of Single-Species and Polybacterial Wound Biofilms Using a Quantitative, In Vivo, Rabbit Ear Model. Akhil K. Seth, Matthew R… Northwestern University, Chicago, Illinois, United States of America; Microbiology Branch, US Army Dental and Trauma Research Detachment, Institute of Surgical…

  4. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
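
The bootstrap-simulation methodology named in the title can be sketched in a few lines (the data values are invented):

```python
# Quantify uncertainty in a statistic (here the mean of a small sample of
# made-up emission factors) by resampling with replacement.
import numpy as np

rng = np.random.default_rng(42)
data = np.array([0.41, 0.55, 0.38, 0.62, 0.47, 0.51, 0.44, 0.58])

boot_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                       for _ in range(10_000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```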

  5. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    … are useful for the differentiation of various renal diseases; however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  6. Quantitative Analysis of Epithelial Morphogenesis in Drosophila Oogenesis: New Insights Based on Morphometric Analysis and Mechanical Modeling

    Science.gov (United States)

    White, P.F.; Shreter, D.M.; Kolahi, K.S.; Classen, A.; Bilder, D.; Mofrad, M.R.K.

    2011-01-01

    The process of epithelial morphogenesis is ubiquitous in animal development, but much remains to be learned about the mechanisms that shape epithelial tissues. The follicle cell (FC) epithelium encapsulating the growing germline of Drosophila is an excellent system to study fundamental elements of epithelial development. During stages 8 to 10 of oogenesis, the FC epithelium transitions between simple geometries (cuboidal, columnar and squamous) and redistributes cell populations in processes described as posterior migration, squamous cell flattening and main body cell columnarization. Here we have carried out a quantitative morphometric analysis of these poorly understood events in order to establish the parameters of, and delimit the potential processes that regulate, the transitions. Our results compel a striking revision of accepted views of these phenomena, by showing that posterior migration does not involve FC movements, that there is no role for columnar cell apical constriction in FC morphogenesis, and that squamous cell flattening may be a compliant response to germline growth. We utilize mechanical modeling involving finite element computational technologies to demonstrate that time-varying viscoelastic properties and growth are sufficient to account for the bulk of the FC morphogenetic changes. PMID:19409378

  7. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    , the knowledge generated from these studies cannot be easily generalized or transferred to other basins. Here, we present an approach that integrates quantitative and qualitative methods to study water issues and capture the contextual knowledge of water management, by combining the NSSs framework with an area of artificial intelligence called qualitative reasoning. Using the Apalachicola-Chattahoochee-Flint (ACF) River Basin dispute as an example, we demonstrate how quantitative modeling and qualitative reasoning can be integrated to examine the impact of over-abstraction of water from the river on the ecosystem and the role of governance in shaping the evolution of the ACF water dispute.

  8. MetabR: an R script for linear model analysis of quantitative metabolomic data

    Directory of Open Access Journals (Sweden)

    Ernest Ben

    2012-10-01

    Background: Metabolomics is an emerging high-throughput approach to systems biology, but data analysis tools are lacking compared with other systems-level disciplines such as transcriptomics and proteomics. Metabolomic data analysis requires a normalization step to remove systematic effects of confounding variables on metabolite measurements. Current tools may not correctly normalize every metabolite when the relationships between each metabolite quantity and fixed-effect confounding variables differ, or for the effects of random-effect confounding variables. Linear mixed models, an established methodology in the microarray literature, offer a standardized and flexible approach for removing the effects of fixed- and random-effect confounding variables from metabolomic data. Findings: Here we present a simple menu-driven program, “MetabR”, designed to aid researchers with no programming background in the statistical analysis of metabolomic data. Written in the open-source statistical programming language R, MetabR implements linear mixed models to normalize metabolomic data and analysis of variance (ANOVA) to test treatment differences. MetabR exports normalized data, checks statistical model assumptions, identifies differentially abundant metabolites, and produces output files to help with data interpretation. Example data are provided to illustrate normalization for common confounding variables and to demonstrate the utility of the MetabR program. Conclusions: We developed MetabR as a simple and user-friendly tool for implementing linear mixed model-based normalization and statistical analysis of targeted metabolomic data, which helps to fill the lack of available data analysis tools in this field. The program, user guide, example data, and any future news or updates related to the program may be found at http://metabr.r-forge.r-project.org/.
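
    To make the normalization step concrete, here is a minimal Python sketch of the same idea using statsmodels (MetabR itself is an R program; this is not its code). The column names abundance, batch, subject and treatment are hypothetical stand-ins for one metabolite in long format.

    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    def normalize_metabolite(df):
        """Remove a fixed batch effect and a random subject effect from one
        metabolite, returning residuals re-centred on the grand mean."""
        fit = smf.mixedlm("abundance ~ C(batch)", df, groups=df["subject"]).fit(reml=True)
        return fit.resid + df["abundance"].mean()

    def test_treatment(df):
        """One-way ANOVA for treatment differences on the normalized values."""
        return anova_lm(smf.ols("normalized ~ C(treatment)", df).fit())

    # usage, with df holding one metabolite in long format:
    # df["normalized"] = normalize_metabolite(df)
    # print(test_treatment(df))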

  9. Quantitative Analysis of Memristance Defined Exponential Model for Multi-bits Titanium Dioxide Memristor Memory Cell

    Directory of Open Access Journals (Sweden)

    DAOUD, A. A. D.

    2016-05-01

    The ability to store multiple bits in a single memristor-based memory cell is a key feature for high-capacity memory packages. Studying multi-bit memristor circuits requires high accuracy in modelling the memristance change. A memristor model based on a novel definition of memristance is proposed. A design of a single-memristor memory cell using the proposed model for the platinum-electrode titanium dioxide memristor is illustrated. A specific voltage pulse is applied, and its parameters (amplitude or pulse width) are varied to store different numbers of states in a single memristor. New state-variation parameters associated with the model are provided, and their effects on the write and read processes of memristive multi-states are analysed. PSPICE simulations are also performed, and they show good agreement with the data obtained from the analysis.
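
    The paper's memristance definition is not reproduced in this record, so the following hedged sketch instead uses the classic linear ion-drift memristor model to illustrate how repeated, identical voltage pulses step a cell through multiple resistance states. All parameter values are illustrative, not taken from the paper.

    R_ON, R_OFF = 100.0, 16e3            # limiting resistances in ohms (assumed)
    K = 1e6                              # lumped ion-drift constant in 1/(A*s) (assumed)

    def apply_pulse(w, amplitude, width, dt=1e-6):
        """Integrate dw/dt = K * v / R(w) over one rectangular voltage pulse."""
        for _ in range(int(width / dt)):
            r = R_ON * w + R_OFF * (1.0 - w)          # instantaneous memristance
            w = min(max(w + dt * K * amplitude / r, 0.0), 1.0)
        return w

    w = 0.0                                           # start fully OFF
    for level in range(1, 5):                         # write four states with identical pulses
        w = apply_pulse(w, amplitude=1.2, width=1e-3)
        print(f"state {level}: R = {R_ON * w + R_OFF * (1.0 - w):.0f} ohm")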

  10. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards, including both natural disasters and malicious attacks by intelligent adversaries. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from the hazards that can have the greatest impact on the assets of interest. To understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their assets face. Without the ability to measure risk quantitatively, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is therefore needed.

  11. Quantitative analysis in megageomorphology

    Science.gov (United States)

    Mayer, L.

    1985-01-01

    Megageomorphology is the study of regional topographic features and their relations to independent geomorphic variables that operate at the regional scale. These independent variables can be classified as either tectonic or climatic in nature. Quantitative megageomorphology stresses the causal relations between plate tectonic factors and landscape features or correlations between climatic factors and geomorphic processes. In addition, the cumulative effects of tectonics and climate on landscape evolution, operating simultaneously in a complex system of energy transfer, are of interest. Regional topographic differentiation, say between continents and ocean floors, is largely the result of the different densities and density contrasts within the oceanic and continental lithosphere and their isostatic consequences. Regional tectonic processes that alter these lithospheric characteristics include rifting, collision, subduction, transpression and transtension.

  12. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations......, occurrence and ordering of events, reward-based properties, and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in the accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for the parameters of interest.
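
    Reward-based queries of this kind reduce, in the simplest case, to expected-reward computations on a probabilistic transition system. The toy numpy sketch below (not the authors' toolchain, which targets a probabilistic model checker) computes the expected drug consumption of a hypothetical two-state medical workflow before it terminates.

    import numpy as np

    # Transient states: 0 = 'triage', 1 = 'treat'; the absorbing state is 'discharge'.
    Q = np.array([[0.0, 0.9],          # triage -> treat with prob 0.9 (else discharge)
                  [0.2, 0.0]])         # treat -> triage with prob 0.2 (else discharge)
    reward = np.array([0.0, 1.0])      # one drug unit dispensed per visit to 'treat'

    # Expected reward accumulated before absorption solves (I - Q) x = reward.
    x = np.linalg.solve(np.eye(2) - Q, reward)
    print(f"expected drug units for a patient entering at triage: {x[0]:.2f}")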

  13. Automated quantitative analysis to assess motor function in different rat models of impaired coordination and ataxia.

    Science.gov (United States)

    Kyriakou, Elisavet I; van der Kieft, Jan G; de Heer, Raymond C; Spink, Andrew; Nguyen, Huu Phuc; Homberg, Judith R; van der Harst, Johanneke E

    2016-08-01

    An objective and automated method for assessing alterations in gait and motor coordination in different animal models is important for proper gait analysis. The CatWalk system has been used in research on pain, ischemia, arthritis, spinal cord injury and some animal models of neurodegenerative disease. Our goals were to obtain a comprehensive gait analysis of three different rat models and to identify which motor coordination parameters are affected and which are the most suitable and sensitive for describing and detecting ataxia, with a secondary focus on possible training effects. Both static and dynamic parameters showed significant differences in all three models: enriched-housed rats show higher walking and swing speed and longer stride length, ethanol-induced ataxia affects mainly the hind part of the body, and the SCA17 rats show coordination disturbances. Coordination changes were revealed only in the cases of ethanol-induced ataxia and the SCA17 rat model. Although training affected some gait parameters, it did not obscure group differences when those were present. To our knowledge, a comparative gait assessment of rats with enriched housing conditions, ethanol-induced ataxia and SCA17 has not been presented before. There is no gold standard for the use of CatWalk: depending on the specific effects expected, the protocol can be adjusted. By including all sessions in the analysis, any training effect should be detectable, and the development of performance over the sessions can provide insight into effects attributed to intervention, treatment or injury. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Quantitative Hazard and Risk Analysis

    OpenAIRE

    Geza Tarnai; Balazs Saghi; Izabela Krbilova

    2006-01-01

    In this paper a quantitative method for hazard and risk analysis is discussed. The method was developed and introduced for the allocation of safety requirements to the functions of a railway signaling remote control system.

  15. Quantitative evaluation of small breast masses using a compartment model analysis on dynamic MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Ikeda, Osamu; Morishita, Shoji; Kido, Taeko; Kitajima, Mika; Okamura, Kenji; Fukuda, Seiji [Kumamoto Rosai Hospital, Yatsushiro (Japan); Yamashita, Yasuyuki; Takahashi, Mutsumasa

    1998-07-01

    To differentiate between malignant and benign breast masses using compartmental analysis, 55 patients with breast masses (fibroadenoma, n=22; invasive ductal carcinoma, n=29; noninvasive ductal carcinoma, n=8) underwent Gd-DTPA-enhanced dynamic MR imaging. Dynamic MR images were obtained using a two-dimensional fat-saturated fast multiplanar spoiled gradient echo technique over 10 minutes following bolus injection of Gd-DTPA. The triexponential concentration curve of Gd-DTPA was fitted to a theoretical model based on compartmental analysis. Using this method, the transfer constant k (the permeability-surface product per unit volume) and f = f₃/f₁ were measured, where f₁ represents tumor vessel volume and f₃ represents extracellular volume. The k value was significantly greater (p<0.01) for malignant tumors, and the k value in cases of noninvasive ductal carcinoma was less than that for invasive ductal carcinoma. The f value was significantly smaller (p<0.01) for malignant tumors, whereas the f value for noninvasive ductal carcinoma was not significantly different from that for invasive ductal carcinoma. We believe that this type of compartmental analysis may be of value for the evaluation of breast masses. (author)
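
    The exact compartment equations are not given in this record; the sketch below shows only the generic fitting step, a triexponential curve fitted with scipy to synthetic concentration data, from which parameters such as k would then be derived.

    import numpy as np
    from scipy.optimize import curve_fit

    def tri_exp(t, a1, m1, a2, m2, a3, m3):
        """Sum of three exponential modes for the tissue concentration curve."""
        return a1 * np.exp(-m1 * t) + a2 * np.exp(-m2 * t) + a3 * np.exp(-m3 * t)

    rng = np.random.default_rng(0)
    t = np.linspace(0.2, 10.0, 60)                       # minutes after bolus injection
    c = tri_exp(t, 1.0, 3.0, 0.6, 0.5, 0.3, 0.05) + rng.normal(0.0, 0.01, t.size)

    p0 = [1.0, 1.0, 0.5, 0.3, 0.2, 0.02]                 # keep the three modes separated
    params, _ = curve_fit(tri_exp, t, c, p0=p0, maxfev=20000)
    print("fitted amplitudes and rate constants:", np.round(params, 3))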

  16. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  17. Temperature control of fimbriation circuit switch in uropathogenic Escherichia coli: quantitative analysis via automated model abstraction.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Kuwahara

    2010-03-01

    Uropathogenic Escherichia coli (UPEC) represent the predominant cause of urinary tract infections (UTIs). A key UPEC molecular virulence mechanism is type 1 fimbriae, whose expression is controlled by the orientation of an invertible chromosomal DNA element, the fim switch. Temperature has been shown to act as a major regulator of fim switching behavior, and is overall an important indicator as well as a functional feature of many urologic diseases, including UPEC host-pathogen interaction dynamics. Given this panoptic physiological role of temperature during UTI progression and the notable empirical challenges to its direct in vivo study, in silico modeling of the corresponding biochemical and biophysical mechanisms essential to UPEC pathogenicity may significantly aid our understanding of the underlying disease processes. However, rigorous computational analysis of biological systems, such as the fim switch temperature control circuit, has hitherto presented a notoriously demanding problem, due both to the substantial complexity of the gene regulatory networks involved and to their often characteristically discrete and stochastic dynamics. To address these issues, we have developed an approach that enables automated multiscale abstraction of biological system descriptions based on reaction kinetics. Implemented as a computational tool, this method has allowed us to efficiently analyze the modular organization and behavior of the E. coli fimbriation switch circuit at different temperature settings, thus facilitating new insights into this mode of UPEC molecular virulence regulation. In particular, our results suggest that, with respect to its role in shutting down fimbriae expression, the primary function of FimB recombinase may be to effect a controlled down-regulation (rather than an increase) of the ON-to-OFF fim switching rate via temperature-dependent suppression of competing dynamics mediated by recombinase FimE. Our computational analysis further implies
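
    Discrete, stochastic switch dynamics of this kind are commonly simulated with Gillespie's algorithm. The Python sketch below follows one cell's fim switch orientation with purely illustrative rates and temperature factors; it is not the abstraction tool described in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def switching_events(t_end, k_on_to_off):
        """Simulate one cell's fim switch (state 0 = OFF, 1 = ON) and count flips."""
        k_off_to_on = 0.05                     # FimB-mediated OFF->ON rate (illustrative)
        t, state, flips = 0.0, 0, 0
        while True:
            rate = k_off_to_on if state == 0 else k_on_to_off
            t += rng.exponential(1.0 / rate)   # waiting time to the next recombination
            if t >= t_end:
                return flips
            state ^= 1
            flips += 1

    # Illustrative temperature factors scaling the FimE-dominated ON->OFF rate.
    for label, k in [("37C", 0.2), ("25C", 1.0)]:
        print(label, "switch events in 1000 time units:", switching_events(1000.0, k))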

  18. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The qualitative and quantitative combined nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modelling: it allows a qualitative model and a quantitative model to be combined so that each compensates for the weaknesses of the other. The combined model overcomes the limitation that a qualitative model cannot be applied and verified quantitatively, as well as the high cost and long time required to repeatedly construct and verify a quantitative model. The combined model is therefore more practical and efficient, which is of great significance for nonlinear dynamics. The combined modeling and model-analysis method proposed here is applicable not only to nonlinear dynamics but also to the modeling and model analysis of other fields. Additionally, the analytical method for the qualitative and quantitative combined nonlinear dynamics model proposed in this paper satisfactorily resolves the problems with the existing analytical methods for nonlinear dynamics models of the price system. The three-dimensional dynamics model of price, supply–demand ratio and selling rate established in this paper yields estimates of the best commodity prices from the model results, thereby providing a theoretical basis for the government's macro-control of prices. The model also offers theoretical guidance on how to enhance people's purchasing power and consumption levels through price regulation and hence improve living standards.

  19. Energy Tax versus Carbon Tax. A quantitative macro economical analysis with the HERMES/MIDAS models

    International Nuclear Information System (INIS)

    Karadeloglou, P.

    1992-01-01

    The idea of imposing a tax has recently been put forward as a policy instrument to induce substitutions aimed at reducing overall CO₂ emissions. One can distinguish two options: recycle tax revenues for energy system restructuring (supply or demand restructuring); or use the corresponding revenues to reduce the negative impacts of the tax on economic activity. Several papers dealing only with the macroeconomic aspects of environmental problems have been written. These papers more or less neglect the energy sphere and consider the energy feedback effects to be very small. Macroeconomic impacts of the carbon tax have been examined for the United Kingdom and for the four big European countries elsewhere. In this paper a synthesis of both the energy and the macroeconomic approaches is realized. The approach adopted is global and evaluates the impacts on both the economic and the energy system. The main questions examined are the effectiveness and impacts of fiscal policy on CO₂ emissions and the effects of adopting an accommodating policy. Thus, not only the effects of imposing an energy or carbon tax are examined, but also the effects of introducing accommodating measures. The analysis uses the HERMES-MIDAS linked system of models; it is limited to the effects of carbon and energy taxes and the reduction of direct taxes, and covers four countries, namely France, the Federal Republic of Germany, Italy and the United Kingdom. Section 2 describes the policy scenarios, while Sections 3 and 4 present the results of the policy simulations. Section 5 compares the two taxes (energy tax and carbon tax), and Section 6 examines the reduction of direct taxation as an accommodating measure. 27 tabs., 10 refs

  20. Energy Tax versus Carbon Tax. A quantitative macro economical analysis with the HERMES/MIDAS models

    Energy Technology Data Exchange (ETDEWEB)

    Karadeloglou, P. [National Technical University of Athens (Greece)

    1992-03-01

    The idea of imposing a tax has recently been put forward as a policy instrument to induce substitutions aimed at reducing overall CO₂ emissions. One can distinguish two options: recycle tax revenues for energy system restructuring (supply or demand restructuring); or use the corresponding revenues to reduce the negative impacts of the tax on economic activity. Several papers dealing only with the macroeconomic aspects of environmental problems have been written. These papers more or less neglect the energy sphere and consider the energy feedback effects to be very small. Macroeconomic impacts of the carbon tax have been examined for the United Kingdom and for the four big European countries elsewhere. In this paper a synthesis of both the energy and the macroeconomic approaches is realized. The approach adopted is global and evaluates the impacts on both the economic and the energy system. The main questions examined are the effectiveness and impacts of fiscal policy on CO₂ emissions and the effects of adopting an accommodating policy. Thus, not only the effects of imposing an energy or carbon tax are examined, but also the effects of introducing accommodating measures. The analysis uses the HERMES-MIDAS linked system of models; it is limited to the effects of carbon and energy taxes and the reduction of direct taxes, and covers four countries, namely France, the Federal Republic of Germany, Italy and the United Kingdom. Section 2 describes the policy scenarios, while Sections 3 and 4 present the results of the policy simulations. Section 5 compares the two taxes (energy tax and carbon tax), and Section 6 examines the reduction of direct taxation as an accommodating measure. 27 tabs., 10 refs.

  1. The Nucleation and Propagation of Thrust Ramps: Insights from Quantitative Analysis of Frictional Analog (Sandbox) Models

    Science.gov (United States)

    Sen, P.; Haq, S. S.; Marshak, S.

    2012-12-01

    Particle Image Velocimetry (PIV) provides a unique opportunity to analyze deformation in sandbox analog models at a scale that allows documentation of movement within and around individual shear structures. We employed PIV analysis to quantify deformation in sandbox experiments designed to simulate the initiation of thrust ramps developed during crustal shortening (i.e., contractional deformation). Our intent was to answer a long-standing question: Do ramps initiate at the tip of a detachment, or do they initiate in the interior of a deforming layer and propagate up-dip and down-dip until they link to the detachment at a location to the hinterland of the detachment's tip line? Most geometric studies of ramp-flat geometries in fold-thrust belts assume that ramps propagate up-dip from the tip of the detachment and grow in only one direction. Field studies, in contrast, reveal that layer-parallel shortening structures develop to the foreland of the last ramp to form, suggesting that ramps initiate in a thrust sheet that has already undergone displacement above a detachment. Published sandbox models, using color-sand marker layers, support this idea. To test this idea further, we set up a model using a 3 m-long by 0.31 m-wide glass-walled sandbox with a rigid backstop. The sand layer was sifted onto a sheet of mylar that could be pulled beneath the rigid backstop. Sand used in our experiments consisted of grains <250 μm in diameter. We carried out multiple runs using 4 cm, 5 cm and 6 cm-thick layers. Images were acquired over 1 mm displacement intervals using an 18-megapixel camera. By moving the camera at specific steps during the experiment, we sampled the development of several thrust ramps. The images taken during experimental runs were analyzed with a MATLAB-based program called 'PIV LAB' that utilizes an image cross-correlation subroutine to determine displacement fields of the sand particles. Our results demonstrate that: (1) thrust ramps initiate within the
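
    The core PIV step, locating the cross-correlation peak between the same interrogation window in two successive frames, can be sketched in a few lines of Python (the study itself used a MATLAB-based tool); the sub-pixel peak fitting and vector validation of real PIV software are omitted here.

    import numpy as np
    from scipy.signal import fftconvolve

    def window_displacement(frame_a, frame_b, y, x, size=32):
        """Integer-pixel shift of the correlation peak between the same
        interrogation window in two successive frames."""
        a = frame_a[y:y + size, x:x + size].astype(float)
        b = frame_b[y:y + size, x:x + size].astype(float)
        a -= a.mean()
        b -= b.mean()                                       # remove mean intensity bias
        corr = fftconvolve(b, a[::-1, ::-1], mode="full")   # cross-correlation via FFT
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        return dy - (size - 1), dx - (size - 1)

    # usage: dy, dx = window_displacement(img0, img1, y=128, x=256)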

  2. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Science.gov (United States)

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species-specific, biologically and technically complex, and data-hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and guidance is therefore needed. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
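
    As a rough illustration of the simplest model class described here (spatially explicit, presence/absence), the following sketch expands an invaded front over a gridded habitat mask at a fixed number of cells per year; the landscape and parameter values are toys, not those of the paper.

    import numpy as np
    from scipy.ndimage import binary_dilation

    def spread(habitat, start, years, cells_per_year=2):
        """Grow the invaded area by a fixed front speed, restricted to suitable cells."""
        invaded = np.zeros_like(habitat, dtype=bool)
        invaded[start] = True
        kernel = np.ones((3, 3), dtype=bool)             # 8-neighbour dispersal
        for _ in range(years * cells_per_year):
            invaded = binary_dilation(invaded, structure=kernel) & habitat
        return invaded

    habitat = np.random.default_rng(1).random((100, 100)) > 0.3   # toy suitability map
    print("invaded cells after years 1-5:",
          [int(spread(habitat, (50, 50), y).sum()) for y in range(1, 6)])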

  3. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Directory of Open Access Journals (Sweden)

    Christelle Robinet

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species-specific, biologically and technically complex, and data-hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and guidance is therefore needed. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.

  4. Modeling grain boundaries in polycrystals using cohesive elements: Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    El Shawish, Samir; Cizelj, Leon; Simonovski, Igor

    2013-01-01

    Highlights: We estimate the performance of cohesive elements for modeling grain boundaries. We compare the computed stresses in the ABAQUS finite element solver. Tests are performed on analytical and realistic models of polycrystals. The most severe issue is found in the plastic grain response. Other identified issues are related to topological constraints in the modeling space. Abstract: We propose and demonstrate several tests to estimate the performance of the cohesive elements in ABAQUS for modeling grain boundaries in complex spatial structures such as polycrystalline aggregates. The performance of the cohesive elements is checked by comparing the computed stresses with the theoretically predicted values for a homogeneous material under uniaxial tensile loading. Statistical analyses are performed under different loading conditions for two elasto-plastic models of the grains: isotropic elasticity with isotropic hardening plasticity, and anisotropic elasticity with crystal plasticity. Tests are conducted on an analytical finite element model generated from a Voronoi tessellation as well as on a realistic finite element model of a stainless steel wire. The results of the analyses highlight several issues related to the computation of normal and shear stresses. The most severe issue is found within the plastic grain response, where the computed normal stresses on particularly oriented cohesive elements are significantly underestimated. Other issues are related to topological constraints in the modeling space and result in increased scatter of the computed stresses

  5. Quantitative biokinetic analysis of radioactively labelled, inhaled Titanium dioxide Nanoparticles in a rat model

    International Nuclear Information System (INIS)

    Kreyling, Wolfgang G.; Wenk, Alexander; Semmler-Behnke, Manuela

    2010-01-01

    The aim of this project was the determination of the biokinetics of TiO₂ nanoparticles (NP) in the whole body of healthy adult rats after NP administration to the respiratory tract, either via inhalation or instillation. We developed our own methodology to freshly synthesize and aerosolize TiO₂-NP in our lab for use in inhalation studies. These NP underwent a detailed physical and chemical characterization, providing pure polycrystalline anatase TiO₂-NP of about 20 nm (geometric standard deviation 1.6) and a specific surface area of 270 m²/g. In addition, we developed techniques for sufficiently stable radioactive ⁴⁸V labelling of the TiO₂-NP. The kinetics of solubility of ⁴⁸V was thoroughly determined. The methodology of quantitative biokinetics allows for a quantitative balance of the retained and excreted NP against the administered NP dose and provides a much more precise determination of NP fractions and concentrations in organs and tissues of interest compared with spot-check biokinetics studies. Small fractions of TiO₂-NP translocate across the air-blood barrier and accumulate in secondary target organs, soft tissue and skeleton. The amount of translocated TiO₂-NP is approximately 2% of the TiO₂-NP deposited in the lungs. A prominent fraction of these translocated TiO₂-NP was found in the remainder. Smaller amounts of TiO₂-NP accumulate in secondary organs following particular kinetics. TiO₂-NP translocation was grossly accomplished within the first 2-4 hours after inhalation, followed by retention in all organs and tissues studied without any detectable clearance of these biopersistent TiO₂-NP within 28 days. Therefore, our data suggest that crossing of the air-blood barrier of the lungs and subsequent accumulation in secondary organs and tissues depend on the NP material and its physico-chemical properties. Furthermore, we extrapolate that during repeated or chronic exposure to insoluble NP the translocated fraction of NP will

  6. Quantitative biokinetic analysis of radioactively labelled, inhaled Titanium dioxide Nanoparticles in a rat model

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Wolfgang G.; Wenk, Alexander; Semmler-Behnke, Manuela [Helmholtz Zentrum Muenchen, Deutsches Forschungszentrum fuer Gesundheit und Umwelt GmbH (Germany). Inst. fuer Lungenbiologie und Erkrankungen, Netzwerk Nanopartikel und Gesundheit

    2010-09-15

    The aim of this project was the determination of the biokinetics of TiO₂ nanoparticles (NP) in the whole body of healthy adult rats after NP administration to the respiratory tract, either via inhalation or instillation. We developed our own methodology to freshly synthesize and aerosolize TiO₂-NP in our lab for use in inhalation studies. These NP underwent a detailed physical and chemical characterization, providing pure polycrystalline anatase TiO₂-NP of about 20 nm (geometric standard deviation 1.6) and a specific surface area of 270 m²/g. In addition, we developed techniques for sufficiently stable radioactive ⁴⁸V labelling of the TiO₂-NP. The kinetics of solubility of ⁴⁸V was thoroughly determined. The methodology of quantitative biokinetics allows for a quantitative balance of the retained and excreted NP against the administered NP dose and provides a much more precise determination of NP fractions and concentrations in organs and tissues of interest compared with spot-check biokinetics studies. Small fractions of TiO₂-NP translocate across the air-blood barrier and accumulate in secondary target organs, soft tissue and skeleton. The amount of translocated TiO₂-NP is approximately 2% of the TiO₂-NP deposited in the lungs. A prominent fraction of these translocated TiO₂-NP was found in the remainder. Smaller amounts of TiO₂-NP accumulate in secondary organs following particular kinetics. TiO₂-NP translocation was grossly accomplished within the first 2-4 hours after inhalation, followed by retention in all organs and tissues studied without any detectable clearance of these biopersistent TiO₂-NP within 28 days. Therefore, our data suggest that crossing of the air-blood barrier of the lungs and subsequent accumulation in secondary organs and tissues depend on the NP material and its physico-chemical properties. Furthermore, we extrapolate that during repeated or chronic

  7. Mathematical model of gamma-ray spectrometry borehole logging for quantitative analysis

    Science.gov (United States)

    Schimschal, Ulrich

    1981-01-01

    A technique for analyzing gamma-ray spectral-logging data has been developed, in which a digital computer is used to calculate the effects of gamma-ray attenuation in a borehole environment. The computer model allows for the calculation of the effects of lithology, porosity, density, and the thickness of a horizontal layer of uniformly distributed radioactive material surrounding a centralized probe in a cylindrical borehole. The computer program also contains parameters for the calculation of the effects of well casing, drilling fluid, probe housing, and losses through the sodium-iodide crystal. Errors associated with the commonly used mathematical assumption of a point detector are eliminated in this model. (USGS)
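
    The attenuation bookkeeping at the heart of such a model follows the Beer-Lambert law applied to each concentric layer between source and detector. A minimal sketch, with placeholder attenuation coefficients rather than values from the report:

    import math

    def transmitted_fraction(layers):
        """layers: (linear attenuation coefficient in 1/cm, thickness in cm) pairs."""
        return math.exp(-sum(mu * x for mu, x in layers))

    layers = [(0.086, 5.0),    # borehole fluid, roughly water at 0.66 MeV (assumed)
              (0.52, 0.8),     # steel casing (assumed)
              (0.17, 10.0)]    # formation rock (assumed)
    print(f"fraction of photons surviving all layers: {transmitted_fraction(layers):.4f}")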

  8. Quantitative analysis of anaerobic oxidation of methane (AOM) in marine sediments: a modeling perspective

    NARCIS (Netherlands)

    Regnier, P.; Dale, A.W.; Arndt, S.; LaRowe, D.E.; Mogollon, J.M.; Van Cappellen, P.

    2011-01-01

    Recent developments in the quantitative modeling of methane dynamics and anaerobic oxidation of methane (AOM) in marine sediments are critically reviewed. The first part of the review begins with a comparison of alternative kinetic models for AOM. The roles of bioenergetic limitations, intermediate

  9. Quantitative analysis of large amounts of journalistic texts using topic modelling

    NARCIS (Netherlands)

    Jacobi, C.; van Atteveldt, W.H.; Welbers, K.

    2016-01-01

    The huge collections of news content which have become available through digital technologies both enable and warrant scientific inquiry, challenging journalism scholars to analyse unprecedented amounts of texts. We propose Latent Dirichlet Allocation (LDA) topic modelling as a tool to face this
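
    A minimal LDA sketch with scikit-learn, using a four-document stand-in corpus rather than the news collections analysed in the paper:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = ["the election campaign entered its final week",
            "voters went to the polls across the country",
            "the team won the championship final",
            "the striker scored twice in the match"]

    counts = CountVectorizer(stop_words="english").fit(docs)
    dtm = counts.transform(docs)                       # document-term matrix
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(dtm)

    terms = counts.get_feature_names_out()
    for k, topic in enumerate(lda.components_):        # word weights per topic
        top = topic.argsort()[-4:][::-1]
        print(f"topic {k}:", ", ".join(terms[i] for i in top))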

  10. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
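
    The prioritization idea can be illustrated with a toy scoring function in which high severity and likelihood raise a scenario's priority and high modeling difficulty lowers it; the scenarios, scales and weighting below are hypothetical, not the paper's calibration.

    scenarios = [
        # (name, severity 1-5, likelihood 1-5, modeling difficulty 1-5)
        ("wake encounter on parallel approach", 5, 2, 2),
        ("runway incursion",                    5, 1, 4),
        ("missed approach conflict",            3, 3, 3),
    ]

    def priority(severity, likelihood, difficulty):
        """High risk floats to the top; hard-to-model scenarios are deferred."""
        return severity * likelihood / difficulty

    for name, s, l, d in sorted(scenarios, key=lambda r: -priority(*r[1:])):
        print(f"{priority(s, l, d):5.2f}  {name}")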

  11. Chemometric models for the quantitative descriptive sensory analysis of Arabica coffee beverages using near infrared spectroscopy.

    Science.gov (United States)

    Ribeiro, J S; Ferreira, M M C; Salva, T J G

    2011-02-15

    Mathematical models based on chemometric analyses of coffee beverage sensory data and NIR spectra of 51 Arabica roasted coffee samples were generated to predict the scores for acidity, bitterness, flavour, cleanliness, body and overall quality of the coffee beverage. Partial least squares (PLS) regression was used to construct the models. The ordered predictor selection (OPS) algorithm was applied to select the wavelengths for the regression model of each sensory attribute, so that only significant regions were taken into account. The regions of the spectrum identified as important for sensory quality were closely related to the NIR spectra of pure caffeine, trigonelline, 5-caffeoylquinic acid, cellulose, coffee lipids, sucrose and casein. The NIR analyses supported the following relationships between the sensory characteristics of the beverage and the chemical composition of the roasted grain: (1) lipids and proteins were closely related to the attribute body; (2) caffeine and chlorogenic acids were related to bitterness; (3) chlorogenic acids were related to acidity and flavour; (4) cleanliness and overall quality were related to caffeine, trigonelline, chlorogenic acid, polysaccharides, sucrose and protein. Copyright © 2010 Elsevier B.V. All rights reserved.
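
    The modelling step, regressing a sensory score on NIR spectra by PLS, can be sketched with scikit-learn; X and y below are synthetic stand-ins for the 51 spectra and measured scores, and the OPS wavelength selection is omitted.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(51, 300))        # 51 spectra x 300 wavelength channels (toy)
    y = X[:, 40] - 0.5 * X[:, 120] + rng.normal(scale=0.1, size=51)   # toy "acidity" score

    pls = PLSRegression(n_components=5).fit(X, y)
    print("in-sample R^2 of the PLS model:", round(pls.score(X, y), 3))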

  12. Heat strain imposed by personal protective ensembles: quantitative analysis using a thermoregulation model

    Science.gov (United States)

    Xu, Xiaojiang; Gonzalez, Julio A.; Santee, William R.; Blanchard, Laurie A.; Hoyt, Reed W.

    2016-07-01

    The objective of this paper is to study the effects of personal protective equipment (PPE), and of specific PPE layers defined by their thermal/evaporative resistances and mass, on heat strain during physical activity. A stepwise thermal manikin testing and modeling approach was used to analyze a PPE ensemble with four layers: uniform, ballistic protection, chemical protective clothing, and mask and gloves. The PPE was tested on a thermal manikin, starting with the uniform and adding an additional layer in each step. Wearing PPE increases the metabolic rate Ṁ, so Ṁ was adjusted according to the mass of each of the four configurations. A human thermoregulatory model was used to predict endurance time for each configuration at a fixed Ṁ and at its mass-adjusted Ṁ. Reductions in endurance time due to resistances and due to mass were determined separately from the predicted results. Fractional contributions of the PPE's thermal/evaporative resistances by layer show that the ballistic protection and the chemical protective clothing layers each contribute about 20%. Wearing the ballistic protection over the uniform reduced endurance time from 146 to 75 min, with 31 min of the decrement due to the additional resistances of the ballistic protection and 40 min due to the increased Ṁ associated with the additional mass. Effects of mass on heat strain are thus of a similar magnitude to the effects of increased resistances. Reducing resistances and mass can both significantly alleviate heat strain.

  13. The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms

    Science.gov (United States)

    Raghavan, Prabhakar

    By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.

  14. Quantitative analysis of impact-induced seismic signals by numerical modeling

    Science.gov (United States)

    Güldemeister, Nicole; Wünnemann, Kai

    2017-11-01

    We quantify the seismicity of impact events using a combined numerical and experimental approach. The objectives of this work are (1) the calibration of the numerical model by utilizing real-time measurements of the elastic wave velocity and pressure amplitudes in laboratory impact experiments; (2) the determination of seismic parameters, such as the quality factor Q and the seismic efficiency k, for materials of different porosity and water saturation by a systematic parameter study employing the calibrated numerical model. By means of "numerical experiments" we found that the seismic efficiency k decreases slightly with porosity, from k = 3.4 × 10⁻³ for nonporous quartzite to k = 2.6 × 10⁻³ for 25% porous sandstone. If pores are completely or partly filled with water, we determined a seismic efficiency of k = 8.2 × 10⁻⁵, which is approximately two orders of magnitude lower than in the nonporous case. By measuring the attenuation of the seismic wave with distance in our numerical experiments we determined the seismic quality factor Q to range between ∼35 for the solid quartzite and 80 for the porous dry targets. For water-saturated target materials, Q is much lower, <10. The obtained values are in the range of literature values. Translating the seismic efficiency into seismic magnitudes, we show that the seismic magnitude of an impact event is about one order of magnitude smaller for a water-saturated target than for a solid or porous target. The obtained seismic magnitudes decrease linearly with distance from the point of impact and are consistent with empirical data for distances close to the point of impact. The seismic magnitude decreases more rapidly with distance for a water-saturated material than for a dry material.

  15. Improved vertical displacements induced by a refined thermal expansion model and its quantitative analysis in GPS height time series

    Science.gov (United States)

    Wang, Kaihua; Chen, Hua; Jiang, Weiping; Li, Zhao; Ma, Yifang; Deng, Liansheng

    2018-04-01

    There are apparent seasonal variations in GPS height time series, and thermal expansion is considered to be one of the potential geophysical contributors. The displacements introduced by thermal expansion are usually derived without considering the annex height and underground part of the monument (e.g., for monuments located on the roof or top of a building), which may bias the geophysical explanation of the seasonal oscillation. In this paper, improved vertical displacements are derived with a refined thermal expansion model in which the annex height and underground depth of the monument are taken into account, and 560 IGS stations are then used to validate the modeled thermal expansion (MTE) displacements. To evaluate the impact of thermal expansion on GPS heights, the MTE displacements of 80 IGS stations with fewer data discontinuities are selected for comparison with their observed GPS vertical (OGV) displacements, with the modeled surface loading (MSL) displacements removed in advance. Quantitative analysis shows that the maximum annual and semiannual amplitudes of the MTE are 6.65 mm (NOVJ) and 0.51 mm (IISC), respectively, and that the maximum peak-to-peak oscillation of the MTE displacements can reach 19.4 mm. The average annual amplitude reductions are 0.75 mm and 1.05 mm after removing the MTE and MSL displacements from the OGV, respectively, indicating that the seasonal oscillation induced by thermal expansion is equivalent to >75% of the impact of surface loadings. However, there are rarely significant reductions in the semiannual amplitude. Given the result in this study that thermal expansion explains 17.3% of the annual amplitude in GPS heights on average, it must be precisely modeled both in GPS precise data processing and in GPS time series analysis, especially for stations located at middle and high latitudes with larger annual temperature oscillations, or stations with higher monuments.

  16. The TX-model - a quantitative heat loss analysis of district heating pipes by means of IR surface temperature measurements

    Energy Technology Data Exchange (ETDEWEB)

    Zinki, Heimo [ZW Energiteknik, Nykoeping (Sweden)

    1996-11-01

    The aim of this study was to investigate the possibility of analysing the temperature profile at the ground surface above buried district heating pipes in a way that enables the quantitative determination of the heat loss from the pair of pipes. In practical applications, this temperature profile is assumed to be generated by means of advanced IR thermography. For this purpose, the principle of the TX-model has been developed, based on the fact that the heat losses from pipes buried in the ground have a temperature signature at the ground surface. Qualitative analysis of this temperature signature is very well known and in practical use for detecting leaks from pipes; these techniques primarily make use of relative changes of the temperature pattern along the pipe. In the quantitative heat loss analysis, however, it is presumed that the temperature profile across the pipes is related to the pipe heat loss per unit length. The basic idea is that the integral of the temperature profile perpendicular to the pipe, called TX, is a function of the heat loss, but is also affected by other parameters such as burial depth, heat diffusivity, wind, precipitation and so on. In order to analyse the parameters influencing the TX-factor, a simulation model for the energy balance at the ground surface has been developed. This model includes the heat flow from the pipe to the surface and the heat exchange at the surface with the environment due to convection, latent heat change, and solar and long-wave radiation. The simulation gives the surprising result that the TX-factor is by and large unaffected during the course of a day, even when the sun is shining, as long as other climate conditions are relatively stable (low wind, no rain, no shadows). The results from the simulations were verified at different sites in Denmark, Finland, Sweden and the USA through a co-operative research program organised and partially financed by the IEA District Heating Programme, Task III, and

  17. Quantitative analysis of porcine reproductive and respiratory syndrome (PRRS) viremia profiles from experimental infection: a statistical modelling approach.

    Science.gov (United States)

    Islam, Zeenath U; Bishop, Stephen C; Savill, Nicholas J; Rowland, Raymond R R; Lunney, Joan K; Trible, Benjamin; Doeschl-Wilson, Andrea B

    2013-01-01

    Porcine reproductive and respiratory syndrome (PRRS) is one of the most economically significant viral diseases facing the global swine industry. Viremia profiles of PRRS virus challenged pigs reflect the severity and progression of infection within the host and provide crucial information for subsequent control measures. In this study we analyse the largest longitudinal PRRS viremia dataset from an in-vivo experiment. The primary objective was to provide a suitable mathematical description of all viremia profiles, with biologically meaningful parameters, for quantitative analysis of profile characteristics. The Wood's function, a gamma-type function, and a biphasic extended Wood's function were fitted to the individual profiles using Bayesian inference in a likelihood framework. Using maximum likelihood inference and numerous fit criteria, we established that the broad spectrum of viremia trends could be adequately represented by either uni- or biphasic Wood's functions. Three viremic categories emerged: cleared (uni-modal and below detection within 42 days post infection (dpi)), persistent (transient experimental persistence over 42 dpi) and rebound (biphasic within 42 dpi). The convenient biological interpretation of the model parameter estimates allowed us not only to quantify inter-host variation, but also to establish common viremia curve characteristics and their predictability. Statistical analysis of the profile characteristics revealed that persistent profiles were distinguishable already within the first 21 dpi, whereas it is not possible to predict the onset of viremia rebound. Analysis of the neutralizing antibody (nAb) data indicated that there was a ubiquitous strong response to the homologous PRRSV challenge, but high variability in the range of cross-protection of the nAbs. Persistent pigs were found to have a significantly higher nAb cross-protectivity than pigs that either cleared viremia or experienced rebound within 42 dpi. Our study provides
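
    Assuming the standard gamma-type Wood's function V(t) = a·t^b·exp(-c·t), the fitting step can be sketched with an ordinary least-squares fit in Python (the paper itself uses Bayesian inference); the data below are synthetic.

    import numpy as np
    from scipy.optimize import curve_fit

    def woods(t, a, b, c):
        """Wood's gamma-type function; its peak lies at t = b / c."""
        return a * np.power(t, b) * np.exp(-c * t)

    rng = np.random.default_rng(0)
    t = np.arange(1.0, 43.0)                             # days post infection (dpi)
    v = woods(t, 2.0, 1.5, 0.25) + rng.normal(0.0, 0.05, t.size)

    (a, b, c), _ = curve_fit(woods, t, v, p0=[1.0, 1.0, 0.1])
    print(f"a={a:.2f} b={b:.2f} c={c:.2f}; peak viremia near {b / c:.1f} dpi")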

  18. Summarization vs Peptide-Based Models in Label-Free Quantitative Proteomics: Performance, Pitfalls, and Data Analysis Guidelines.

    Science.gov (United States)

    Goeminne, Ludger J E; Argentini, Andrea; Martens, Lennart; Clement, Lieven

    2015-06-05

    Quantitative label-free mass spectrometry is increasingly used to analyze the proteomes of complex biological samples. However, the choice of appropriate data analysis methods remains a major challenge. We therefore provide a rigorous comparison between peptide-based models and peptide-summarization-based pipelines. We show that peptide-based models outperform summarization-based pipelines in terms of sensitivity, specificity, accuracy, and precision. We also demonstrate that the predefined FDR cutoffs for the detection of differentially regulated proteins can become problematic when differentially expressed (DE) proteins are highly abundant in one or more samples. Care should therefore be taken when data are interpreted from samples with spiked-in internal controls and from samples that contain a few very highly abundant proteins. We do, however, show that specific diagnostic plots can be used for assessing differentially expressed proteins and the overall quality of the obtained fold change estimates. Finally, our study also illustrates that imputation under the "missing by low abundance" assumption is beneficial for the detection of differential expression in proteins with low abundance, but it negatively affects moderately to highly abundant proteins. Hence, imputation strategies that are commonly implemented in standard proteomics software should be used with care.

  19. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion and thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminium coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg-system are calculated with the Sauer-Freise method for the first time. To solve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
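
    The numerical core of such a model, Fick's second law with a concentration-dependent interdiffusion coefficient, can be sketched with explicit finite differences; D(c) below is a placeholder, not a coefficient derived in this work.

    import numpy as np

    def step(c, dx, dt, D):
        """One explicit step of dc/dt = d/dx( D(c) * dc/dx ), fixed-end boundaries."""
        d_mid = 0.5 * (D(c[1:]) + D(c[:-1]))             # D evaluated at cell interfaces
        flux = -d_mid * np.diff(c) / dx
        c_new = c.copy()
        c_new[1:-1] -= dt / dx * np.diff(flux)
        return c_new

    n, dx, dt = 200, 1e-6, 1e-4                          # 1 um cells, 0.1 ms steps
    c = np.where(np.arange(n) < n // 2, 1.0, 0.0)        # Al | Mg diffusion couple
    D = lambda conc: 1e-14 * (1.0 + 4.0 * conc)          # assumed D(c) in m^2/s

    for _ in range(20000):                               # stable: max(D)*dt/dx^2 << 0.5
        c = step(c, dx, dt, D)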

  20. Origin of migmatites by deformation-enhanced melt infiltration of orthogneiss: a new model based on quantitative microstructural analysis

    Czech Academy of Sciences Publication Activity Database

    Hasalová, P.; Schulmann, K.; Lexa, O.; Štípská, P.; Hrouda, F.; Ulrich, Stanislav; Haloda, J.; Týcová, P.

    2008-01-01

    Vol. 26, No. 1 (2008), pp. 29-53. ISSN 0263-4929. Grants (other): GA AV ČR(CZ) IAA3111401; GA ČR(CZ) GA205/04/2065. Institutional research plan: CEZ:AV0Z30120515. Keywords: crystal size distribution; melt infiltration; melt topology; migmatites; quantitative textural analysis. Subject RIV: DB - Geology; Mineralogy. Impact factor: 3.340, year: 2008

  1. Defect evolution in cosmology and condensed matter quantitative analysis with the velocity-dependent one-scale model

    CERN Document Server

    Martins, C J A P

    2016-01-01

    This book sheds new light on topological defects in widely differing systems, using the Velocity-Dependent One-Scale Model to better understand their evolution. Topological defects (cosmic strings, monopoles, domain walls or others) necessarily form at cosmological (and condensed-matter) phase transitions. If they are stable and long-lived they will be fossil relics of higher-energy physics. Understanding their behaviour and consequences is a key part of any serious attempt to understand the universe, and this requires modelling their evolution. The velocity-dependent one-scale model is the only fully quantitative model of defect network evolution, and the canonical model in the field. This book provides a review of the model, explaining its physical content and describing its broad range of applicability.

  2. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing to furnish more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  3. Quantitative spectroscopy of extreme helium stars Model atmospheres and a non-LTE abundance analysis of BD+10°2179

    Science.gov (United States)

    Kupfer, T.; Przybilla, N.; Heber, U.; Jeffery, C. S.; Behara, N. T.; Butler, K.

    2017-10-01

    Extreme helium stars (EHe stars) are hydrogen-deficient supergiants of spectral type A and B. They are believed to result from mergers in double degenerate systems. In this paper, we present a detailed quantitative non-LTE spectral analysis for BD+10°2179, a prototype of this rare class of stars, using UV-Visual Echelle Spectrograph and Fiber-fed Extended Range Optical Spectrograph spectra covering the range from ˜3100 to 10 000 Å. Atmosphere model computations were improved in two ways. First, since the UV metal line blanketing has a strong impact on the temperature-density stratification, we used the atlas12 code. We also tested atlas12 against the benchmark code sterne3, and found only small differences in the temperature and density stratifications, and good agreement with the spectral energy distributions. Secondly, 12 chemical species were treated in non-LTE. Pronounced non-LTE effects occur in individual spectral lines but, for the majority, the effects are moderate to small. The spectroscopic parameters give Teff = 17 300 ± 300 K and log g = 2.80 ± 0.10, and an evolutionary mass of 0.55 ± 0.05 M⊙. The star is thus slightly hotter, more compact and less massive than found in previous studies. The kinematic properties imply thick-disc membership, which is consistent with the metallicity [Fe/H] ≈ -1 and α-enhancement. The refined light-element abundances are consistent with the white dwarf merger scenario. We further discuss the observed helium spectrum in an appendix, detecting dipole-allowed transitions from about 150 multiplets plus the most comprehensive set of known/predicted isolated forbidden components to date. Moreover, a so-far-unreported series of pronounced forbidden He I components is detected in the optical-UV.

  4. Stepwise sensitivity analysis from qualitative to quantitative: Application to the terrestrial hydrological modeling of a Conjunctive Surface-Subsurface Process (CSSP) land surface model

    Science.gov (United States)

    Gan, Yanjun; Liang, Xin-Zhong; Duan, Qingyun; Choi, Hyun Il; Dai, Yongjiu; Wu, Huan

    2015-06-01

    An uncertainty quantification framework was employed to examine the sensitivities of 24 model parameters from a newly developed Conjunctive Surface-Subsurface Process (CSSP) land surface model (LSM). The sensitivity analysis (SA) was performed over 18 representative watersheds in the contiguous United States to examine the influence of model parameters in the simulation of terrestrial hydrological processes. Two normalized metrics, relative bias (RB) and Nash-Sutcliffe efficiency (NSE), were adopted to assess the fit between simulated and observed streamflow discharge (SD) and evapotranspiration (ET) for a 14-year period. SA was conducted using a multiobjective two-stage approach, in which the first stage was a qualitative SA using the Latin Hypercube-based One-At-a-Time (LH-OAT) screening, and the second stage was a quantitative SA using the Multivariate Adaptive Regression Splines (MARS)-based Sobol' sensitivity indices. This approach combines the merits of qualitative and quantitative global SA methods, and is effective and efficient for understanding and simplifying large, complex system models. Ten of the 24 parameters were identified as important across different watersheds. The contribution of each parameter to the total response variance was then quantified by Sobol' sensitivity indices. Generally, parameter interactions contribute the most to the response variance of the CSSP, and only 5 out of 24 parameters dominate model behavior. Four photosynthetic and respiratory parameters are shown to be influential to ET, whereas the reference depth for saturated hydraulic conductivity is the most influential parameter for SD in most watersheds. Parameter sensitivity patterns mainly depend on hydroclimatic regime, as well as vegetation type and soil texture.
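
    The two-stage workflow lends itself to a compact illustration. Below is a minimal sketch of the quantitative Sobol' stage, assuming the open-source SALib package and a hypothetical three-parameter toy model standing in for the 24-parameter CSSP; note that the paper estimated Sobol' indices via a MARS emulator, for which SALib's direct Saltelli estimator is used here as a stand-in.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # hypothetical 3-parameter stand-in for the 24 CSSP parameters
        problem = {
            "num_vars": 3,
            "names": ["ksat_ref_depth", "stomatal_slope", "soil_porosity"],
            "bounds": [[0.1, 10.0], [4.0, 12.0], [0.3, 0.6]],
        }

        def toy_model(x):
            # placeholder for a CSSP run returning, e.g., an NSE score
            return np.sin(x[0]) + 0.5 * x[1] * np.cos(x[2])

        X = saltelli.sample(problem, 1024)            # Sobol' sampling design
        Y = np.array([toy_model(row) for row in X])
        Si = sobol.analyze(problem, Y)
        print(Si["S1"])   # first-order indices
        print(Si["ST"])   # total-order indices (capture interactions)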

  5. Quantitative analysis of surface deformation and ductile flow in complex analogue geodynamic models based on PIV method.

    Science.gov (United States)

    Krýza, Ondřej; Lexa, Ondrej; Závada, Prokop; Schulmann, Karel; Gapais, Denis; Cosgrove, John

    2017-04-01

    PIV (particle image velocimetry) is an optical analysis method widely used in many technical branches where the visualization and quantification of material flow is important. Typical examples are studies of liquid flow through complex channel systems, gas spreading, or combustion problems. In our current research we used this method to investigate two types of complex analogue geodynamic and tectonic experiments. The first class of experiments models large-scale oroclinal buckling as an analogue of the late Paleozoic to early Mesozoic evolution of the Central Asian Orogenic Belt (CAOB), resulting from the northward drift of the North China craton towards the Siberian craton. Here we studied the relationship between lower crustal and lithospheric mantle flows and upper crustal deformation, respectively. The second class of experiments focuses on a more general study of lower crustal flow in indentation systems, which represent a major component of some large hot orogens (e.g. the Bohemian Massif). Most of the simulations in both cases show a strong dependency of the shape of brittle structures situated in the upper crust on the folding style of the middle and lower ductile layers, which is influenced by the rheological, geometrical and thermal conditions of different parts across the shortened domain. The purpose of applying PIV is to quantify material redistribution in critical domains of the model. Deriving flow directions and calculating strain-rate and total displacement fields in analogue experiments is generally difficult and time-consuming, and is often performed only on the basis of visual evaluation. The PIV method operates on a set of images in which small tracer particles are seeded within the modelled domain and are assumed to faithfully follow the material flow. From the estimated pixel coordinates, the material displacement field, velocity field, strain rate, vorticity, tortuosity etc. are calculated. In our experiments we used velocity field divergence to
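
    The core displacement-extraction step of PIV can be sketched with a standard single-pass, FFT-based cross-correlation of two interrogation windows. This is a minimal numpy illustration; the window pairing, multi-pass refinement and sub-pixel peak fitting used in practice are omitted.

        import numpy as np

        def window_displacement(win_a, win_b):
            # FFT-based cross-correlation of two interrogation windows taken
            # from consecutive frames of the seeded model surface
            fa = np.fft.rfft2(win_a - win_a.mean())
            fb = np.fft.rfft2(win_b - win_b.mean())
            corr = np.fft.irfft2(np.conj(fa) * fb, s=win_a.shape)
            corr = np.fft.fftshift(corr)
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            cy, cx = np.array(corr.shape) // 2
            return dy - cy, dx - cx   # pixel displacement between frames

    Dividing the displacement field by the inter-frame time gives the velocity field, from which strain rate, vorticity and the divergence mentioned above follow by finite differences.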

  6. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis procedure for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
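
    The derived formulations are not reproduced in the abstract; for the series connection model, however, the overall reliability reduces to a product of the stage reliabilities, which can be sketched with triangular fuzzy numbers (l, m, u). The numbers below are hypothetical expert-graded values.

        def tfn_product(a, b):
            # product of two triangular fuzzy numbers, (l, m, u) approximation
            return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

        def series_reliability(processes):
            # series connection: overall reliability is the product of stages
            r = (1.0, 1.0, 1.0)
            for tfn in processes:
                r = tfn_product(r, tfn)
            return r

        # hypothetical reliabilities for three installation steps
        print(series_reliability([(0.90, 0.95, 0.99),
                                  (0.85, 0.92, 0.97),
                                  (0.88, 0.94, 0.98)]))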

  7. Is the Linear Modeling Technique Good Enough for Optimal Form Design? A Comparison of Quantitative Analysis Models

    Science.gov (United States)

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I (QTTI), grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of the performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process. PMID:23258961

  8. Is the Linear Modeling Technique Good Enough for Optimal Form Design? A Comparison of Quantitative Analysis Models

    Directory of Open Access Journals (Sweden)

    Yang-Cheng Lin

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers’ perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I (QTTI), grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers’ perception of product image and product form elements of personal digital assistants (PDAs). The result of the performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.
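
    As a rough illustration of the comparison reported in both versions of this record, a linear regressor (standing in for QTTI, which amounts to regression on dummy-coded form elements) can be scored against a small neural network. scikit-learn and the synthetic Kansei-style data below are assumptions.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, size=(60, 8))     # hypothetical form-element encodings
        y = X @ rng.uniform(-1, 1, 8) + 0.1 * rng.standard_normal(60)

        models = [("linear", LinearRegression()),
                  ("neural net", MLPRegressor(hidden_layer_sizes=(16,),
                                              max_iter=5000, random_state=0))]
        for name, model in models:
            r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
            print(f"{name}: mean cross-validated R^2 = {r2:.3f}")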

  9. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not aid in linking the Basic Events to the data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
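
    A minimal sketch of the metadata-field linkage described above, using pandas in place of an Excel workbook; the table and column names are hypothetical.

        import pandas as pd

        # Basic Events from the model, each carrying a shared metadata key
        events = pd.DataFrame({
            "event_id": ["BE-001", "BE-002", "BE-003"],
            "data_key": ["SRC-17", "SRC-03", "SRC-17"],
        })
        # failure-data sources with the estimates and their provenance
        sources = pd.DataFrame({
            "data_key": ["SRC-03", "SRC-17"],
            "failure_rate": [1.2e-6, 4.5e-5],
            "reference": ["qualification test report", "flight experience database"],
        })
        # the join recovers, for any Basic Event, its data source and estimate
        linked = events.merge(sources, on="data_key", how="left")
        print(linked)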

  10. Quantitative 2- and 3-dimensional analysis of pharmacokinetic model-derived variables for breast lesions in dynamic, contrast-enhanced MR mammography

    International Nuclear Information System (INIS)

    Hauth, E.A.M.; Jaeger, H.J.; Maderwald, S.; Muehler, A.; Kimmig, R.; Forsting, M.

    2008-01-01

    Purpose: 2- and 3-dimensional evaluation of quantitative pharmacokinetic parameters derived from the Tofts model of dynamic contrast enhancement in lesions in MR mammography. Materials and methods: In 95 patients, MR mammography revealed 127 suspicious lesions. The initial rate of enhancement was coded by color intensity and the post-initial enhancement change was coded by color hue. 2D and 3D analyses of the distribution of color hue and intensity, vascular permeability and extracellular volume were performed. Results: In 2D, malignant lesions showed a significantly higher number of bright red, medium red, dark red, bright green, medium green, dark green and bright blue pixels than benign lesions. In 3D, statistically significant differences between malignant and benign lesions were found for all of these parameters. Vascular permeability was significantly higher in malignant lesions than in benign lesions. A regression model using the 3D data found that the best discriminator between malignant and benign lesions was the combined number of voxels and medium green pixels, with a sensitivity of 79.4% and a specificity of 83.1%. Conclusions: Quantitative analysis of pharmacokinetic variables of contrast kinetics showed significant differences between malignant and benign lesions. 3D analysis showed superior diagnostic differentiation between malignant and benign lesions compared with 2D analysis. The parametric analysis using a pharmacokinetic model allows objective analysis of contrast enhancement in breast lesions
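
    The kinetic equations are not given in the abstract; the standard Tofts model the study builds on expresses the tissue contrast concentration as a convolution of the arterial input function with an exponential kernel, as in this minimal numpy sketch.

        import numpy as np

        def tofts_ct(t, cp, ktrans, ve):
            # standard Tofts model:
            #   Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau
            kep = ktrans / ve              # efflux rate constant
            dt = t[1] - t[0]
            kernel = np.exp(-kep * t)
            return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

    Fitting ktrans (related to vascular permeability) and ve (the extracellular volume fraction) voxel by voxel yields the parameter maps that are colour-coded in studies of this kind.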

  11. Inhibition of bacterial conjugation by phage M13 and its protein g3p: quantitative analysis and model.

    Science.gov (United States)

    Lin, Abraham; Jimenez, Jose; Derr, Julien; Vera, Pedro; Manapat, Michael L; Esvelt, Kevin M; Villanueva, Laura; Liu, David R; Chen, Irene A

    2011-01-01

    Conjugation is the main mode of horizontal gene transfer that spreads antibiotic resistance among bacteria. Strategies for inhibiting conjugation may be useful for preserving the effectiveness of antibiotics and preventing the emergence of bacterial strains with multiple resistances. Filamentous bacteriophages were first observed to inhibit conjugation several decades ago. Here we investigate the mechanism of inhibition and find that the primary effect on conjugation is occlusion of the conjugative pilus by phage particles. This interaction is mediated primarily by phage coat protein g3p, and exogenous addition of the soluble fragment of g3p inhibited conjugation at low nanomolar concentrations. Our data are quantitatively consistent with a simple model in which association between the pili and phage particles or g3p prevents transmission of an F plasmid encoding tetracycline resistance. We also observe a decrease in the donor ability of infected cells, which is quantitatively consistent with a reduction in pili elaboration. Since many antibiotic-resistance factors confer susceptibility to phage infection through expression of conjugative pili (the receptor for filamentous phage), these results suggest that phage may be a source of soluble proteins that slow the spread of antibiotic resistance genes.

  12. Inhibition of bacterial conjugation by phage M13 and its protein g3p: quantitative analysis and model.

    Directory of Open Access Journals (Sweden)

    Abraham Lin

    Conjugation is the main mode of horizontal gene transfer that spreads antibiotic resistance among bacteria. Strategies for inhibiting conjugation may be useful for preserving the effectiveness of antibiotics and preventing the emergence of bacterial strains with multiple resistances. Filamentous bacteriophages were first observed to inhibit conjugation several decades ago. Here we investigate the mechanism of inhibition and find that the primary effect on conjugation is occlusion of the conjugative pilus by phage particles. This interaction is mediated primarily by phage coat protein g3p, and exogenous addition of the soluble fragment of g3p inhibited conjugation at low nanomolar concentrations. Our data are quantitatively consistent with a simple model in which association between the pili and phage particles or g3p prevents transmission of an F plasmid encoding tetracycline resistance. We also observe a decrease in the donor ability of infected cells, which is quantitatively consistent with a reduction in pili elaboration. Since many antibiotic-resistance factors confer susceptibility to phage infection through expression of conjugative pili (the receptor for filamentous phage), these results suggest that phage may be a source of soluble proteins that slow the spread of antibiotic resistance genes.
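
    The "simple model" referred to in both versions of this record is not spelled out in the abstract; a hypothetical equilibrium-binding sketch, in which conjugation scales with the fraction of pili left unoccupied by g3p, conveys the idea. The dissociation constant below is illustrative only.

        import numpy as np

        def relative_conjugation(g3p_nM, kd_nM=5.0):
            # fraction of pili left unbound by g3p at equilibrium
            g3p = np.asarray(g3p_nM, dtype=float)
            return kd_nM / (kd_nM + g3p)

        for c in (0.0, 1.0, 5.0, 50.0):
            print(f"{c:5.1f} nM g3p -> {relative_conjugation(c):.2f} of baseline")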

  13. Qualitative and simultaneous quantitative analysis of cimetidine polymorphs by ultraviolet-visible and shortwave near-infrared diffuse reflectance spectroscopy and multivariate calibration models.

    Science.gov (United States)

    Feng, Yuyan; Li, Xiangling; Xu, Kailin; Zou, Huayu; Li, Hui; Liang, Bing

    2015-02-01

    The object of the present study was to investigate the feasibility of applying ultraviolet-visible and shortwave near-infrared diffuse reflectance spectroscopy (UV-vis-SWNIR DRS) coupled with chemometrics to the qualitative and simultaneous quantitative analysis of drug polymorphs, using cimetidine as a model drug. Three polymorphic forms (A, B and D) and a mixed crystal (M1) of cimetidine, obtained by preparation under different crystallization conditions, were characterized by microscopy, X-ray powder diffraction (XRPD) and infrared spectroscopy (IR). Discriminant models of the four forms (A, B, D and M1) were established by discriminant partial least squares (PLS-DA) using differently pretreated spectra. The R and RMSEP of samples in the prediction set by the discriminant model with original spectra were 0.9959 and 0.1004. Among the quantitative models of binary mixtures (A and D) established by partial least squares (PLS) and least squares-support vector machine (LS-SVM) with differently pretreated spectra, the LS-SVM models based on original and MSC spectra had the better prediction performance, with an R of 1.0000 and an RMSEP of 0.0134 for form A, and an R of 1.0000 and an RMSEP of 0.0024 for form D. For ternary mixtures, the established PLS quantitative models based on normalized spectra had relatively better prediction performance for forms A, B and D, with R of 0.9901, 0.9820 and 0.9794 and RMSEP of 0.0471, 0.0529 and 0.0594, respectively. This research indicated that UV-vis-SWNIR DRS can be used as a simple, rapid, nondestructive qualitative and quantitative method for the analysis of drug polymorphs.
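
    A minimal sketch of the PLS quantitative step, assuming scikit-learn and synthetic spectra in place of the UV-vis-SWNIR data; the component count and calibration/prediction split are illustrative.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        X = rng.standard_normal((40, 200))   # spectra: samples x wavelengths
        y = rng.uniform(0, 1, 40)            # mass fraction of one polymorph

        pls = PLSRegression(n_components=5)
        pls.fit(X[:30], y[:30])              # calibration set
        pred = pls.predict(X[30:]).ravel()   # prediction set
        rmsep = float(np.sqrt(np.mean((pred - y[30:]) ** 2)))
        print(f"RMSEP = {rmsep:.4f}")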

  14. Quantitative study of Portland cement hydration by X-Ray diffraction/Rietveld analysis and geochemical modeling

    Science.gov (United States)

    Coutelot, F.; Seaman, J. C.; Simner, S.

    2017-12-01

    In this study the hydration of Portland cements containing blast-furnace slag and type V fly ash was investigated during cement curing using X-ray diffraction, with geochemical modeling used to calculate the total volume of hydrates. The goal was to evaluate the relationship between the starting component levels and the hydrate assemblages that develop during the curing process. Blast-furnace slag levels of 60, 45 and 30 wt.% were studied in blends containing fly ash and Portland cement. Geochemical modeling described the dissolution of the clinker and quantitatively predicted the amounts of hydrates. In all cases the experiments showed the presence of C-S-H, portlandite and ettringite. The quantities of ettringite, portlandite and the amorphous phases as determined by XRD agreed well with the calculated amounts of these phases after different periods of time. These findings show that changes in the bulk composition of hydrating cements can be described by geochemical models. Such a comparison between experimental and modeled data helps to understand in more detail the active processes occurring during cement hydration.

  15. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  16. Efficacy of the Frame and Hu mathematical model for the quantitative analysis of agents influencing growth of chick embryo fibroblasts

    International Nuclear Information System (INIS)

    Korohoda, K.; Czyz, J.

    1994-01-01

    Experiments on the effect of various sera and substratum surface area upon the growth of chick embryo fibroblast-like cells in secondary cultures are described and discussed on the grounds of a mathematical model for the growth of anchorage-dependent cells proposed by Frame and Hu. The model and the presented results demonstrate the mutual independence of the effects of agents influencing the rate of cell proliferation (i.e. accelerating or retarding growth) and agents that modify the limitation of cell proliferation (i.e. the maximum cell density at confluence). Due to its relative simplicity, the model proposed by Frame and Hu offers an easy mode of description and quantitative evaluation of experiments concerning cell growth regulation. It is shown that various sera added at constant concentration significantly modify the rate of cell proliferation with little effect upon the maximum attainable cell density. The cells grew much more slowly in the presence of calf serum than in the presence of chick serum, and the addition of iron and zinc complexes to calf serum significantly accelerated cell growth. An increase in the substratum surface area by the addition of glass wool to culture vessels significantly increased cell density per constant volume of medium even when retardation of growth was observed. The results presented point to the need for direct cell counting when estimating cell growth curves and discussing the effects of agents influencing the parameters that characterize cell proliferation. (author). 34 refs, 5 figs, 2 tabs
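
    The Frame and Hu model itself is not reproduced in the abstract; a logistic stand-in with separate rate (mu) and saturation (xmax) parameters captures the same separation between growth rate and maximum density, fitted here with scipy on hypothetical cell counts.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, x0, mu, xmax):
            # growth rate mu and maximum density xmax act independently
            return xmax / (1 + (xmax / x0 - 1) * np.exp(-mu * t))

        t = np.array([0, 24, 48, 72, 96, 120], dtype=float)   # hours
        x = np.array([1.0, 2.1, 4.0, 6.8, 8.9, 9.6]) * 1e4    # cells/cm^2
        params, _ = curve_fit(logistic, t, x, p0=(1e4, 0.05, 1e5))
        print(dict(zip(["x0", "mu", "xmax"], params)))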

  17. Reduced fractal model for quantitative analysis of averaged micromotions in mesoscale: Characterization of blow-like signals

    International Nuclear Information System (INIS)

    Nigmatullin, Raoul R.; Toboev, Vyacheslav A.; Lino, Paolo; Maione, Guido

    2015-01-01

    Highlights: • A new approach describes fractal-branched systems with long-range fluctuations. • A reduced fractal model is proposed. • The approach is used to characterize blow-like signals. • The approach is tested on data from different fields. -- Abstract: It has been shown that many micromotions in the mesoscale region are averaged in accordance with their self-similar (geometrical/dynamical) structure. This distinctive feature helps to reduce a wide set of different micromotions describing relaxation/exchange processes to an averaged collective motion, expressed mathematically in a rather general form. This reduction opens new perspectives in the description of different blow-like signals (BLS) in many complex systems. The main characteristic of these signals is a finite duration, also when the generalized reduced function is used for their quantitative fitting. As an example, we quantitatively describe available signals generated by people with bronchial asthma, songs of queen bees, and car engine valves operating in the idling regime. We develop a special treatment procedure based on the eigen-coordinates (ECs) method that allows us to justify the generalized reduced fractal model (RFM) for the description of BLS that can propagate in different complex systems. The obtained describing function is based on the self-similar properties of the different considered micromotions. This kind of cooperative model is proposed here for the first time. In spite of the fact that the nature of the dynamic processes that take place in fractal structures on a mesoscale level is not well understood, the parameters of the RFM fitting function can be used for the construction of calibration curves, affected by various external/random factors. Then, the calculated set of the fitting parameters of these calibration curves can characterize BLS of different complex systems affected by those factors. Though the method to construct and analyze the calibration curves goes beyond the scope

  18. Quantitative wear particle analysis for osteoarthritis assessment.

    Science.gov (United States)

    Guo, Meizhai; Lord, Megan S; Peng, Zhongxiao

    2017-12-01

    Osteoarthritis is a degenerative joint disease that affects millions of people worldwide. The aims of this study were (1) to quantitatively characterise the boundary and surface features of wear particles present in the synovial fluid of patients, (2) to select key numerical parameters that describe distinctive particle features and enable osteoarthritis assessment and (3) to develop a model to assess osteoarthritis conditions using comprehensive wear debris information. Discriminant analysis was used to statistically group particles based on differences in their numerical parameters. The analysis methods agreed with the clinical osteoarthritis grades in 63%, 50% and 61% of particles for no osteoarthritis, mild osteoarthritis and severe osteoarthritis, respectively. This study has revealed particle features specific to different osteoarthritis grades and provided further understanding of the cartilage degradation process through wear particle analysis, a technique that has the potential to be developed into an objective and minimally invasive method for osteoarthritis diagnosis.

  19. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    Science.gov (United States)

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

    Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e. > 10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
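
    The association and agreement statistics reported above are straightforward to compute; a minimal numpy sketch on hypothetical paired excursion measurements:

        import numpy as np

        ultrasound = np.array([2.1, 4.0, 6.2, 8.1, 9.8, 11.5])   # mm, hypothetical
        mocap      = np.array([2.2, 4.1, 6.5, 8.4, 10.4, 12.6])  # mm, hypothetical

        r = np.corrcoef(ultrasound, mocap)[0, 1]                  # association
        mad = np.mean(np.abs(ultrasound - mocap))                 # agreement, mm
        mrd = 100 * np.mean(np.abs(ultrasound - mocap) / mocap)   # relative, %
        print(f"r = {r:.2f}, mean abs diff = {mad:.2f} mm, mean rel diff = {mrd:.1f}%")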

  20. Quantitative Analysis of Mixtures of Monoprotic Acids Applying Modified Model-Based Rank Annihilation Factor Analysis on Variation Matrices of Spectrophotometric Acid-Base Titrations

    Directory of Open Access Journals (Sweden)

    Ebrahim Ghorbani-Kalhor

    2015-04-01

    In the current work, a new version of rank annihilation factor analysis was developed to circumvent the rank deficiency problem in multivariate data measurements. Simultaneous determination of the dissociation constant and concentration of monoprotic acids was performed by applying model-based rank annihilation factor analysis to variation matrices of spectrophotometric acid-base titration data. Variation matrices can be obtained by subtracting the first row of the data matrix from all rows of the main data matrix. This method uses variation matrices instead of multivariate spectrophotometric acid-base titration matrices to circumvent the rank deficiency problem in the rank quantitation step. The applicability of this approach was first evaluated with simulated data; then binary mixtures of ascorbic and sorbic acids were investigated as model compounds by the proposed method. Finally, the proposed method was successfully applied to resolving ascorbic and sorbic acid in a real orange juice sample. Unique results were thus achieved by applying rank annihilation factor analysis to the variation matrix, exploiting the advantage of the hard-soft model combination without any problem or difficulty in rank determination.
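
    The variation-matrix construction is described concretely enough to sketch; the subsequent rank estimation is illustrated here with a simple SVD threshold (the full rank annihilation step is not reproduced), on hypothetical absorbance data.

        import numpy as np

        def variation_matrix(D):
            # subtract the first titration spectrum (row) from every row
            return D - D[0]

        def effective_rank(M, tol=1e-6):
            s = np.linalg.svd(M, compute_uv=False)
            return int(np.sum(s > tol * s[0]))

        # D: (n_titration_points x n_wavelengths) absorbance matrix, hypothetical
        D = np.random.default_rng(2).random((15, 120))
        V = variation_matrix(D)
        print(effective_rank(V))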

  1. Quantitative interactome analysis reveals a chemoresistant edgotype.

    Science.gov (United States)

    Chavez, Juan D; Schweppe, Devin K; Eng, Jimmy K; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E

    2015-08-03

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions, contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived protein interaction network, comprising 1,391 crosslinked peptides, allows for 'edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype.

  2. From qualitative data to quantitative models: analysis of the phage shock protein stress response in Escherichia coli

    Directory of Open Access Journals (Sweden)

    Jovanovic Goran

    2011-05-01

    Background: Bacteria have evolved a rich set of mechanisms for sensing and adapting to adverse conditions in their environment. These are crucial for their survival, which requires them to react to extracellular stresses such as heat shock, ethanol treatment or phage infection. Here we focus on studying the phage shock protein (Psp) stress response in Escherichia coli induced by a phage infection or other damage to the bacterial membrane. This system has not yet been theoretically modelled or analysed in silico. Results: We develop a model of the Psp response system, and illustrate how such models can be constructed and analyzed in light of available sparse and qualitative information in order to generate novel biological hypotheses about their dynamical behaviour. We analyze this model using tools from Petri-net theory and study its dynamical range that is consistent with currently available knowledge by conditioning model parameters on the available data in an approximate Bayesian computation (ABC) framework. Within this ABC approach we analyze stochastic and deterministic dynamics. This analysis allows us to identify different types of behaviour and these mechanistic insights can in turn be used to design new, more detailed and time-resolved experiments. Conclusions: We have developed the first mechanistic model of the Psp response in E. coli. This model allows us to predict the possible qualitative stochastic and deterministic dynamic behaviours of key molecular players in the stress response. Our inferential approach can be applied to stress response and signalling systems more generally: in the ABC framework we can condition mathematical models on qualitative data in order to delimit e.g. parameter ranges or the qualitative system dynamics in light of available end-point or qualitative information.
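
    The ABC conditioning step generalises well beyond the Psp system; a minimal rejection-ABC sketch, with the simulator, distance and prior as placeholders, looks like this.

        import numpy as np

        def abc_rejection(simulate, distance, observed, sample_prior,
                          n_draws=100_000, eps=0.1):
            # keep parameter draws whose simulated output lies within eps of the data
            accepted = []
            for _ in range(n_draws):
                theta = sample_prior()
                if distance(simulate(theta), observed) < eps:
                    accepted.append(theta)
            return np.array(accepted)

        # toy usage: infer the rate of an exponential decay observed at t = 1
        observed = 0.37
        posterior = abc_rejection(simulate=lambda k: np.exp(-k),
                                  distance=lambda a, b: abs(a - b),
                                  observed=observed,
                                  sample_prior=lambda: np.random.uniform(0, 5),
                                  eps=0.01)
        print(posterior.mean(), posterior.std())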

  3. Automatic quantitative morphological analysis of interacting galaxies

    Science.gov (United States)

    Shamir, Lior; Holincheck, Anthony; Wallin, John

    2013-08-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, the morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the galaxy mergers are visualized using phylogenies to provide a graph that reflects the morphological similarities between the different galaxy mergers, and thus quantitatively profile the morphology of galaxy mergers.

  4. Quantitative structure-activity relationship and classification analysis of diaryl ureas against vascular endothelial growth factor receptor-2 kinase using linear and non-linear models.

    Science.gov (United States)

    Sun, Min; Chen, Junqing; Wei, Hongtao; Yin, Shuangqing; Yang, Yan; Ji, Min

    2009-06-01

    Quantitative structure-activity relationship analysis has been carried out for 74 diaryl ureas, including aminobenzoisoxazole ureas, aminoindazole ureas and aminopyrazolopyridine ureas, against vascular endothelial growth factor receptor-2 kinase using both linear and non-linear models. Considering simplicity and predictivity, multivariate linear regression was first employed in combination with various variable selection methods, including forward selection, genetic algorithm and enhanced replacement method based on descriptors generated by e-dragon software. Another model using support vector regression has also been constructed and compared. Performances of these models are rigorously validated by leave-one-out cross-validation, fivefold cross-validation and external validation. The enhanced replacement method model significantly outperforms the others with R² = 0.813 and R²pred = 0.809. Robustness and predictive ability of this model is prudently evaluated. Moreover, to find out the most significant features associated with the difference between highly active compounds and moderate ones, two classification models using linear discriminant analysis and support vector machine were further developed. The performance of support vector machine significantly outperforms linear discriminant analysis, with leave-one-out cross-validation and external validation prediction accuracy reaching 0.838 and 0.857, respectively. The resulting models could act as an efficient strategy for estimating the vascular endothelial growth factor receptor-2 inhibiting activity of novel diaryl ureas and provide some insights into the structural features related to the biological activity of these compounds.

  5. Development of a Model Protein Interaction Pair as a Benchmarking Tool for the Quantitative Analysis of 2-Site Protein-Protein Interactions.

    Science.gov (United States)

    Yamniuk, Aaron P; Newitt, John A; Doyle, Michael L; Arisaka, Fumio; Giannetti, Anthony M; Hensley, Preston; Myszka, David G; Schwarz, Fred P; Thomson, James A; Eisenstein, Edward

    2015-12-01

    A significant challenge in the molecular interaction field is to accurately determine the stoichiometry and stepwise binding affinity constants for macromolecules having >1 binding site. The mission of the Molecular Interactions Research Group (MIRG) of the Association of Biomolecular Resource Facilities (ABRF) is to show how biophysical technologies are used to quantitatively characterize molecular interactions, and to educate the ABRF members and scientific community on the utility and limitations of core technologies [such as biosensor, microcalorimetry, or analytic ultracentrifugation (AUC)]. In the present work, the MIRG has developed a robust model protein interaction pair consisting of a bivalent variant of the Bacillus amyloliquefaciens extracellular RNase barnase and a variant of its natural monovalent intracellular inhibitor protein barstar. It is demonstrated that this system can serve as a benchmarking tool for the quantitative analysis of 2-site protein-protein interactions. The protein interaction pair enables determination of precise binding constants for the barstar protein binding to 2 distinct sites on the bivalent barnase binding partner (termed binase), where the 2 binding sites were engineered to possess affinities that differed by 2 orders of magnitude. Multiple MIRG laboratories characterized the interaction using isothermal titration calorimetry (ITC), AUC, and surface plasmon resonance (SPR) methods to evaluate the feasibility of the system as a benchmarking model. Although general agreement was seen for the binding constants measured using solution-based ITC and AUC approaches, weaker affinity was seen for the surface-based SPR method, with protein immobilization likely affecting affinity. An analysis of the results from multiple MIRG laboratories suggests that the bivalent barnase-barstar system is a suitable model for benchmarking new approaches for the quantitative characterization of complex biomolecular interactions.

  6. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid leached data; (2) sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike's Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios 80 µm. Regar

  7. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard (Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  8. Histological and MR quantitative analysis of repaired tissue following microfracture treatment for knee joint osteochondritis dissecans in rabbit models

    International Nuclear Information System (INIS)

    Tao Hongyue; Chen Shuang; Feng Xiaoyuan; Wang Zhan; Li Hong; Hua Yinghui; Chen Zhongqing

    2013-01-01

    Objective: To quantitatively analyze the histological and MR images of repaired tissue (RT) following microfracture for knee joint osteochondritis dissecans (OCD) in rabbit models at different time points, compare them with the RT following joint debridement, and explore the efficiency of microfracture treatment for OCD. Methods: Twenty-seven New Zealand rabbits were randomly assigned into 3 groups (sacrificed at the end of 3, 5 and 7 weeks post-operation, respectively), with 9 in each group. For each rabbit, one knee joint was made into an OCD model. In each group, 6 received microfracture treatment and the other 3 received joint debridement as control. MR scans, which mainly included a 3D double echo steady state sequence (3D-DESS) and T2-mapping, were taken at 3, 5 and 7 weeks post-operation. The thickness index and T2 value index of the RT were calculated and a T2 map of the repaired region was drafted. Then the operation sites were removed to make histological sections with HE and Masson staining. The modified O'Driscoll scoring system was employed to make a semi-quantitative evaluation of the histological performance of the RT. Comparisons were made with respect to MR and histological findings between the two treatments at each time point using the unpaired Student t test. The effects of the two treatments were evaluated longitudinally by comparing the results at the three time points using one-way ANOVA. Results: The post-operation thickness indexes of the two groups increased gradually (F = 33.940, 28.841, P < 0.05), T2 value indexes decreased (F = 80.183, 206.206, P < 0.05), and O'Driscoll scores increased gradually (F = 29.867, 17.167, P < 0.05). At each time point, the thickness index of the microfracture group was higher than that of the debridement group (3-week: 0.743 ± 0.048 vs 0.624 ± 0.013, t = 4.077; 5-week: 0.813 ± 0.031 vs 0.734 ± 0.015, t = 4.107; 7-week: 0.972 ± 0.064 vs 0.777 ± 0.039, t = 4.782; P < 0.05), and the defects of microfracture in the 7-week group

  9. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs whose minor allele frequency (MAF) is not too small.

  10. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    on the existence of a quotient construction, allowing a property φ of a parallel system A | B to be transformed into a sufficient and necessary quotient-property φ/A to be satisfied by the component B. Given a model checking problem involving a network P1 | ... | Pn and a property φ, the method gradually moves (by quotienting) components Pi from the network into the formula φ. Crucial to the success of the method is the ability to manage the size of the intermediate quotient-properties by a suitable collection of efficient minimization heuristics.

  11. Possibilities of 3-D modelling and quantitative morphometric analysis of decimeter-sized Echinoids using photogrammetric approach

    Science.gov (United States)

    Polonkai, Bálint; Görög, Ágnes; Raveloson, Andrea; Bodor, Emese; Székely, Balázs

    2017-04-01

    Echinoids (sea urchins) are useful fossils in palaeoenvironmental reconstruction, e.g. for palaeobiogeographical, palaeoclimatological or sedimentological research. In the Hungarian Badenian stage (Langhian, Middle Miocene) the species Parascutella gibbercula (DE SERRES 1829) is a common taxon and indicates a shallow marine environment. The specimens of this extinct species show high morphological variability within relatively small geographical areas, even within one given stratum. These differences can carry relevant palaeontological and/or palaeoenvironmental information, and interpreting the morphological parameters requires quantifying them. Among the possible quantification methods, 3D photogrammetric reconstruction was found to be suitable; recent years have seen its increasing palaeontological application to both invertebrates and vertebrates. In order to generate proper 3D models of the specimens with the required details, a great number of digital images have to be shot. With proper data acquisition and precise model generation it is possible to outperform traditional 2D morphometric studies of echinoids, which are often inaccurate when spatial characters such as the ambulacral system and the conically shaped apex (top of the test) are measured. An average P. gibbercula specimen is about 10 cm in diameter. Therefore, desktop image acquisition is possible if appropriate lighting conditions are provided. For better results we designed an elaborate target background pattern that enhances the chances of finding homologous points in the imagery. The Agisoft Photoscan software was used for model generation. The generated models typically show high-resolution details and reproduce the original colours. However, various problems may occur: improper focusing and/or poor lighting conditions may leave the aboral and oral sides patchy and hard to reconstruct, and shallow surface undulations cannot be modelled appropriately. Another

  12. Quantitative volumetric analysis of a retinoic acid induced hypoplastic model of chick thymus, using Image-J.

    Science.gov (United States)

    Haque, Ayesha; Khan, Muhammad Yunus

    2017-09-01

    To assess the total volume change in a retinoic acid-induced, hypoplastic model of the chick thymus using Image-J. This experimental study was carried out at the anatomy department of the College of Physicians and Surgeons, Islamabad, Pakistan, from February 2009 to February 2010, and comprised fertilised chicken eggs. The eggs were divided into experimental group A and control group C. Group A was injected with 0.3 µg of retinoic acid via the yolk sac to induce a defective, hypoplastic model of the thymus. The chicks were sacrificed on embryonic day 15 and at hatching. The thymus of each animal was processed, serially sectioned and stained. The total area of each section of the thymus was calculated using Image-J. These areas were summed and multiplied by the section thickness to obtain the volume. Of the 120 eggs, there were 60 (50%) in each group. Image analysis revealed a highly significant decrease in the volume of the chick thymus in the experimental group A compared with its matched control at the time of hatching (p=0.001). Moreover, the volumetric depletion progressed with time, being substantially more pronounced at hatching than at the embryonic stage. The volume changes were significant and were effectively quantified using Image-J.
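
    The volume estimate described is a Cavalieri-style summation over serial sections; a minimal sketch, with hypothetical per-section areas as exported from Image-J:

        def section_volume(areas_mm2, thickness_mm):
            # sum of section areas times the (uniform) section thickness
            return sum(areas_mm2) * thickness_mm

        areas = [0.82, 0.95, 1.10, 1.03, 0.88, 0.61]  # mm^2, hypothetical
        print(f"thymus volume ~ {section_volume(areas, 0.04):.3f} mm^3")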

  13. Effect of human movement on airborne disease transmission in an airplane cabin: study using numerical modeling and quantitative risk analysis.

    Science.gov (United States)

    Han, Zhuyang; To, Gin Nam Sze; Fu, Sau Chung; Chao, Christopher Yu-Hang; Weng, Wenguo; Huang, Quanyi

    2014-08-06

    Airborne transmission of respiratory infectious disease in indoor environments (e.g. airplane cabins, conference rooms, hospitals, isolation rooms and inpatient wards) may cause outbreaks of infectious diseases, which can lead to many infection cases and significantly influence public health. This issue has received increasing attention from academics. This work investigates the influence of human movement on the airborne transmission of respiratory infectious diseases in an airplane cabin by using an accurate human model in numerical simulation and comparing the influences of different human movement behaviors on disease transmission. The Eulerian-Lagrangian approach is adopted to simulate the dispersion and deposition of the expiratory aerosols. The dose-response model is used to assess the infection risks of the occupants. A likelihood analysis is performed as a hypothesis test on the input parameters and the different human movement pattern assumptions. An in-flight SARS outbreak case is used for investigation. A moving person with different moving speeds is simulated to represent the movement behaviors. A digital human model was used to represent the detailed profile of the occupants, obtained by scanning a real thermal manikin with a 3D laser scanning system. The analysis results indicate that human movement can strengthen the downward transport of the aerosols, significantly reduce the overall deposition and removal rate of the suspended aerosols and increase the average infection risk in the cabin. The likelihood estimation result shows that the risk assessment results better fit the outcome of the outbreak case when the movements of the seated passengers are considered. The intake fraction of the moving person is significantly higher than that of most of the seated passengers. The infection risk distribution in the airplane cabin depends strongly on the movement behaviors of the passengers and the index patient. The walking activities of the crew
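
    The dose-response step can be illustrated with the commonly used exponential model, in which infection probability follows from the inhaled pathogen dose; the value of r is pathogen-specific and the one below is assumed for illustration.

        import numpy as np

        def infection_probability(dose_quanta, r=0.08):
            # exponential dose-response: P(infection) = 1 - exp(-r * dose)
            return 1.0 - np.exp(-r * np.asarray(dose_quanta, dtype=float))

        print(infection_probability([0.5, 2.0, 10.0]))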

  14. Quantitative analysis of emphysema and airway measurements according to iterative reconstruction algorithms: comparison of filtered back projection, adaptive statistical iterative reconstruction and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Ji Yung [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Korea University Ansan Hospital, Ansan-si, Department of Radiology, Gyeonggi-do (Korea, Republic of); Goo, Jin Mo; Park, Chang Min; Park, Sang Joon [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University, Cancer Research Institute, Seoul (Korea, Republic of); Lee, Chang Hyun; Shim, Mi-Suk [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate filtered back projection (FBP) and two iterative reconstruction (IR) algorithms and their effects on the quantitative analysis of lung parenchyma and airway measurements on computed tomography (CT) images. Low-dose chest CT scans obtained in 281 adult patients were reconstructed using three algorithms: FBP, adaptive statistical IR (ASIR) and model-based IR (MBIR). Measurements from each dataset were compared: total lung volume, emphysema index (EI), and airway measurements of the lumen and wall area as well as average wall thickness. The accuracy of the airway measurements for each algorithm was also evaluated using an airway phantom. The EI using a threshold of -950 HU was significantly different among the three algorithms, in decreasing order of FBP (2.30 %), ASIR (1.49 %) and MBIR (1.20 %) (P < 0.01). Wall thickness was also significantly different among the three algorithms, with FBP (2.09 mm) demonstrating thicker walls than ASIR (2.00 mm) and MBIR (1.88 mm) (P < 0.01). The airway phantom analysis revealed that MBIR showed the most accurate values for the airway measurements. The three algorithms presented different EIs and wall thicknesses, decreasing in the order of FBP, ASIR and MBIR. Thus, care should be taken in selecting the appropriate IR algorithm for quantitative analysis of the lung. (orig.)
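
    The emphysema index used here is directly computable from a reconstructed volume; a minimal sketch, assuming an array of HU values and a lung mask:

        import numpy as np

        def emphysema_index(hu, lung_mask, threshold=-950):
            # percentage of lung voxels below the emphysema threshold (HU)
            lung = hu[lung_mask]
            return 100.0 * np.count_nonzero(lung < threshold) / lung.size

        # toy volume: mostly -850 HU parenchyma with a pocket of -980 HU
        hu = np.full((10, 10, 10), -850.0)
        hu[:2, :2, :2] = -980.0
        mask = np.ones_like(hu, dtype=bool)
        print(f"EI = {emphysema_index(hu, mask):.2f} %")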

  15. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  16. Determining shell thicknesses in stabilised CdSe@ZnS core-shell nanoparticles by quantitative XPS analysis using an Infinitesimal Columns model

    Energy Technology Data Exchange (ETDEWEB)

    Kalbe, H., E-mail: henryk.kalbe@gmail.com; Rades, S.; Unger, W.E.S.

    2016-10-15

    Highlights: • A novel method to calculate shell thicknesses of core-shell nanoparticles from XPS data is presented. • The approach is widely applicable and combines advantages of existing models. • CdSe@ZnS quantum dots with additional organic stabiliser shell are analysed by XPS. • ZnS and organic shell thicknesses were calculated. • Potential as well as challenges of this and similar approaches are demonstrated. - Abstract: A novel Infinitesimal Columns (IC) simulation model is introduced in this study for the quantitative analysis of core-shell nanoparticles (CSNP) by means of XPS, which combines the advantages of existing approaches. The IC model is applied to stabilised Lumidot™ CdSe/ZnS 610 CSNP for an extensive investigation of their internal structure, i.e. calculation of the two shell thicknesses (ZnS and stabiliser) and exploration of deviations from the idealised CSNP composition. The observed discrepancies between different model calculations can be attributed to the presence of excess stabiliser as well as synthesis residues, demonstrating the necessity of sophisticated purification methods. An excellent agreement is found in the comparison of the IC model with established models from the existing literature, the Shard model and the software SESSA.

  17. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capture, image storage (compression), correction of imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analogue signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, called a pixel. The information is stored in bits. Eight bits are summarised in one byte, so grey values can range between 0 and 255 (2^8 = 256 levels). The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel by pixel, or division of the original image by the background image]. The brightness of an image, represented by its grey values, can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
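
    The shading correction described above (background subtraction or flat-field division) is simple to sketch in numpy; the rescaling convention used here is an assumption.

        import numpy as np

        def shading_correct(image, background, mode="divide"):
            img = image.astype(float)
            bg = np.maximum(background.astype(float), 1.0)  # avoid division by zero
            if mode == "subtract":
                out = img - bg + bg.mean()   # recentre after subtraction
            else:
                out = img / bg * bg.mean()   # flat-field division
            return np.clip(out, 0, 255).astype(np.uint8)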

  18. Quantitative RNA-Seq analysis in non-model species: assessing transcriptome assemblies as a scaffold and the utility of evolutionary divergent genomic reference species

    Directory of Open Access Journals (Sweden)

    Hornett Emily A

    2012-08-01

    Background: How well does RNA-Seq data perform for quantitative whole gene expression analysis in the absence of a genome? This is one unanswered question facing the rapidly growing number of researchers studying non-model species. Using Homo sapiens data and resources, we compared the direct mapping of sequencing reads to predicted genes from the genome with mapping to de novo transcriptomes assembled from RNA-Seq data. Gene coverage and expression analysis was further investigated in the non-model context by using increasingly divergent genomic reference species to group assembled contigs by unique genes. Results: Eight transcriptome sets, composed of varying amounts of Illumina and 454 data, were assembled and assessed. Hybrid 454/Illumina assemblies had the highest transcriptome and individual gene coverage. Quantitative whole gene expression levels were highly similar between using a de novo hybrid assembly and the predicted genes as a scaffold, although mapping to the de novo transcriptome assembly provided data on fewer genes. Using non-target species as reference scaffolds does result in some loss of sequence and expression data, and bias and error increase with evolutionary distance. However, within a 100 million year window these effect sizes are relatively small. Conclusions: Predicted gene sets from sequenced genomes of related species can provide a powerful method for grouping RNA-Seq reads and annotating contigs. Gene expression results can be produced that are similar to results obtained using gene models derived from a high quality genome, though biased towards conserved genes. Our results demonstrate the power and limitations of conducting RNA-Seq in non-model species.
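
    The grouping step, pooling counts from assembled contigs onto the genes of a (possibly divergent) reference species, is essentially a table join. A hypothetical sketch, with made-up two-column TSV formats for the count and best-hit files:

        import csv
        from collections import defaultdict

        def aggregate_counts(contig_counts_tsv, contig2gene_tsv):
            """Sum contig-level read counts into gene-level counts, using a
            best-hit mapping of assembled contigs to reference-species genes."""
            contig2gene = {}
            with open(contig2gene_tsv) as fh:
                for contig, gene in csv.reader(fh, delimiter="\t"):
                    contig2gene[contig] = gene
            gene_counts = defaultdict(int)
            with open(contig_counts_tsv) as fh:
                for contig, count in csv.reader(fh, delimiter="\t"):
                    gene = contig2gene.get(contig)
                    if gene is not None:        # contigs without a hit are dropped
                        gene_counts[gene] += int(count)
            return gene_counts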

  19. Quantitative analysis of planetary reflectance spectra with principal components analysis

    Science.gov (United States)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing than for macroscopic mixing.
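
    For the macroscopic (checkerboard) case, mixing is linear in reflectance, so abundances can be estimated by constrained least squares against laboratory endmember spectra. The sketch below is a hypothetical illustration of that linear case only (the paper treats intimate mixing by working in single-scattering albedo, which is not shown here); names and data are assumptions.

        import numpy as np
        from scipy.optimize import nnls

        def unmix(E, y):
            """Non-negative least-squares abundance estimate for a spectrum y
            (n_wavelengths,) against endmember matrix E (n_wavelengths,
            n_endmembers), normalised so the abundances sum to one."""
            a, rnorm = nnls(E, y)
            rms = rnorm / np.sqrt(len(y))       # rms misfit, cf. the 2% level
            return a / a.sum(), rms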

  20. Exhumation of the North Alpine Foreland Basin- Quantitative insights from structural analysis, thermochronology and a new thermal history model

    Science.gov (United States)

    Luijendijk, Elco; von Hagke, Christoph; Hindle, David

    2016-04-01

    Due to a wealth of geological and thermochronological data, the northern foreland basin of the European Alps is an ideal natural laboratory for understanding the dynamics of foreland basins and their interaction with surface and geodynamic processes. We present an unprecedented compilation of thermochronological data from the basin and quantify cooling and exhumation rates by combining published and new vitrinite reflectance, apatite fission track and (U-Th)/He data with a new inverse burial and thermal history model. No correlation is obvious between inferred cooling and exhumation rates and elevation, relief or tectonics. We compare the derived temperature histories to exhumation estimates based on the retro-deformation of the Molasse basin and the Jura mountains, and to exhumation caused by drainage reorganization and incision. Drainage reorganization can explain at most 25% of the observed cooling rates in the basin. Tectonic transport of the basin's sediments over the inclined basement of the Alpine foreland as the Jura mountains shortened can explain part of the cooling signal in the western part of the basin. However, overall a substantial amount of cooling and exhumation remains unexplained by known tectonic and surface processes. Our results document basin-wide exhumation that may be related to slab roll-back or other lithospheric processes. Uncertainty analysis shows that the thermochronometers can be explained by cooling and exhumation starting as early as the Miocene or as late as the Pleistocene. New (U-Th)/He data from key areas close to the Alpine front may provide better constraints on the timing of exhumation.

  1. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys.
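
    To make the diffuse-interface idea concrete, here is a minimal sketch of one explicit time step of an Allen-Cahn (phase-field) evolution with an artificial order parameter and a double-well free energy. Parameter values are illustrative and are not taken from the paper's CALPHAD databases.

        import numpy as np

        def allen_cahn_step(phi, dt=1e-4, dx=0.1, W=1.0, kappa=0.01, L=1.0):
            """One explicit Euler step of d(phi)/dt = -L * dF/dphi, where
            F = W*phi^2*(1-phi)^2 + (kappa/2)*|grad phi|^2 (periodic BCs)."""
            lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
            dfdphi = 2 * W * phi * (1 - phi) * (1 - 2 * phi)   # bulk double well
            return phi - dt * L * (dfdphi - kappa * lap)

        # a 1-D "precipitate" next to "matrix": the sharp step relaxes into a
        # smooth diffuse interface of finite width
        phi = np.where(np.arange(256) < 128, 1.0, 0.0)
        for _ in range(1000):
            phi = allen_cahn_step(phi)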

  2. Genomewide rapid association using mixed model and regression: a fast and simple method for genomewide pedigree-based quantitative trait loci association analysis.

    Science.gov (United States)

    Aulchenko, Yurii S; de Koning, Dirk-Jan; Haley, Chris

    2007-09-01

    For pedigree-based quantitative trait loci (QTL) association analysis, a range of methods utilizing within-family variation such as transmission-disequilibrium test (TDT)-based methods have been developed. In scenarios where stratification is not a concern, methods exploiting between-family variation in addition to within-family variation, such as the measured genotype (MG) approach, have greater power. Application of MG methods can be computationally demanding (especially for large pedigrees), making genomewide scans practically infeasible. Here we suggest a novel approach for genomewide pedigree-based quantitative trait loci (QTL) association analysis: genomewide rapid association using mixed model and regression (GRAMMAR). The method first obtains residuals adjusted for family effects and subsequently analyzes the association between these residuals and genetic polymorphisms using rapid least-squares methods. At the final step, the selected polymorphisms may be followed up with the full measured genotype (MG) analysis. In a simulation study, we compared type I error, power, and operational characteristics of the proposed method with those of MG and TDT-based approaches. For moderately heritable (30%) traits in human pedigrees the power of the GRAMMAR and the MG approaches is similar and is much higher than that of TDT-based approaches. When using tabulated thresholds, the proposed method is less powerful than MG for very high heritabilities and pedigrees including large sibships like those observed in livestock pedigrees. However, there is little or no difference in empirical power of MG and the proposed method. In any scenario, GRAMMAR is much faster than MG and enables rapid analysis of hundreds of thousands of markers.
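
    The two-step logic can be sketched as follows. This is a simplified, hypothetical illustration: the published method obtains residuals from a polygenic mixed model fitted with the full pedigree, whereas here family structure is crudely removed by within-family centring.

        import numpy as np

        def grammar_scan(y, family_ids, genotypes):
            """Two-step GRAMMAR-style scan: (1) residualise the phenotype for
            family structure, (2) rapid per-SNP least-squares association.
            genotypes has shape (n_individuals, n_snps)."""
            y = np.asarray(y, float)
            family_ids = np.asarray(family_ids)
            res = y.copy()
            for fam in np.unique(family_ids):        # crude step 1: centring
                idx = family_ids == fam
                res[idx] -= y[idx].mean()
            stats = []
            for g in genotypes.T:                    # step 2: fast least squares
                g = g - g.mean()
                denom = (g ** 2).sum() + 1e-12       # guard monomorphic SNPs
                beta = (g @ res) / denom
                sigma2 = ((res - beta * g) ** 2).sum() / (len(res) - 2)
                se = np.sqrt(sigma2 / denom)
                stats.append(beta / se)              # t-statistic per SNP
            return np.array(stats)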

  3. Verification of surface polarity of O-face ZnO(0 0 0 1̄) by quantitative modeling analysis of Auger electron spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Su, C.W., E-mail: cwsu@mail.ncyu.edu.tw [Department of Electrophysics, National Chiayi University, 300 Syuefu Rd., Chiayi 60004, Taiwan (China); Huang, M.S.; Tsai, T.H.; Chang, S.C. [Department of Electrophysics, National Chiayi University, 300 Syuefu Rd., Chiayi 60004, Taiwan (China)

    2012-12-15

    Highlights: • Quantitative Auger intensity ratios that predict whether a macroscopic surface is Zn-face or O-face can be obtained using a hard-sphere model and electron mean free paths. • Integrating electron signals over a depth of 6 layers is the best condition for estimating Auger intensity ratios. • Ratios deviating from the estimated reference after surface treatment by annealing or sputtering are classified as Zn-rich or O-rich. • A Zn-rich surface may exist on an O-face surface. • The surface type of a composite material can be quickly obtained by quantitative analysis of the Auger intensity ratio. - Abstract: Is the crystalline ZnO(0 0 0 1̄) O-face surface enriched by Zn atoms? This study addresses that question. We propose a simplified model to simulate the surface concentration ratio on the (0 0 0 1̄)-O or (0 0 0 1)-Zn surface based on the hard-sphere model. The simulated ratio was obtained by integrating electron signals from the assumed Auger emission, in which the electron mean free path and the relative atomic layer arrangements inside ZnO crystal surfaces of different polarity were considered as relevant parameters. After counting more than 100 experimental observations of Zn/O ratios, the high-frequency peak ratio was found at around 0.428, near the value predicted by the proposed model using the IMFP database. Ratios larger than the peak value correspond to those observed in the annealed samples. A downward trend of the ratio evaluated on the post-sputtering sample indicates the possibility of a Zn-enriched phase appearing on the annealed O-face surface. This phenomenon can further elucidate the O-deficiency debate on most ZnO materials.
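
    The underlying estimate, layer-resolved signals attenuated exponentially with depth and summed over roughly six layers, fits in a few lines. The layer spacings and IMFP values below are illustrative only, not the paper's:

        import numpy as np

        def auger_ratio(layers, lam, n_layers=6):
            """Estimate the Zn/O Auger intensity ratio from a stack of atomic
            layers.  `layers` is a sequence of ("Zn" or "O", depth_nm) tuples,
            top layer first; `lam` maps element -> inelastic mean free path (nm).
            Only the top ~6 layers contribute appreciably."""
            I = {"Zn": 0.0, "O": 0.0}
            for element, depth in layers[:n_layers]:
                I[element] += np.exp(-depth / lam[element])
            return I["Zn"] / I["O"]

        # hypothetical O-face stacking: alternating O and Zn planes
        layers = [("O", 0.00), ("Zn", 0.06), ("O", 0.26), ("Zn", 0.32),
                  ("O", 0.52), ("Zn", 0.58)]
        print(auger_ratio(layers, lam={"Zn": 0.45, "O": 0.65}))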

  4. Quantitative analysis of a reinforcing fabric structure as a prerequisite for modelling the mechanical properties of composites

    Czech Academy of Sciences Publication Activity Database

    Košková, B.; Glogar, Petr; Černý, M.

    11(128) (2003), s. 11-17 ISSN 1212-1576 R&D Projects: GA ČR GA106/99/0096 Institutional research plan: CEZ:AV0Z3046908 Keywords : fabric reinforced composite * image analysis * spectral analysis Subject RIV: JI - Composite Materials

  5. A quantitative analysis of the F18 flight control system

    Science.gov (United States)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
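
    To see why coverage matters quantitatively, consider a textbook duplex (two-component redundant) system rather than the paper's F18 digraph: a component fault is handled successfully with probability c, and an unhandled fault fails the system immediately. A hypothetical sketch of the resulting unreliability:

        import numpy as np

        def duplex_unreliability(lam, c, t):
            """Unreliability of a duplex system with imperfect coverage: a
            handled first fault (probability c) leaves a simplex survivor
            with failure rate lam; an unhandled one fails the system."""
            p_no_fault = np.exp(-2 * lam * t)
            # P(one covered fault, survivor alive at t) =
            # integral over s of 2*lam*exp(-2*lam*s) * c * exp(-lam*(t-s))
            p_one_covered = 2 * c * (np.exp(-lam * t) - np.exp(-2 * lam * t))
            return 1.0 - (p_no_fault + p_one_covered)

        # coverage dominates reliability even at identical failure rates
        for c in (1.0, 0.99, 0.9):
            print(c, duplex_unreliability(lam=1e-4, c=c, t=1000.0))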

  6. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomical Concepts: A Quantitative Analysis

    Science.gov (United States)

    Hansen, John A.; Barnett, Michael; Makinster, James G.; Keating, Thomas

    2004-01-01

    The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…

  7. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, from X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results demonstrate the usefulness of these methods, although their accuracy still needs improvement. (Author) [pt
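
    For context, the matrix-flushing method (due to Chung) reduces to a simple relation between the integrated line intensity I_i of each phase and its reference intensity ratio k_i (I/I_corundum): X_i = (I_i/k_i) / sum_j (I_j/k_j). A hypothetical sketch with made-up intensities and k values:

        def matrix_flushing(intensities, rirs):
            """Chung matrix-flushing: weight fraction of phase i is
            X_i = (I_i/k_i) / sum_j (I_j/k_j), with k the reference intensity
            ratio of each phase's strongest line.  Assumes all phases are
            crystalline with known k values."""
            scaled = {ph: intensities[ph] / rirs[ph] for ph in intensities}
            total = sum(scaled.values())
            return {ph: v / total for ph, v in scaled.items()}

        # illustrative two-phase paint mixture (numbers are invented)
        print(matrix_flushing({"rutile": 1200, "talc": 300},
                              {"rutile": 3.4, "talc": 1.0}))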

  8. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made

  9. Augmented multivariate image analysis applied to quantitative structure-activity relationship modeling of the phytotoxicities of benzoxazinone herbicides and related compounds on problematic weeds.

    Science.gov (United States)

    Freitas, Mirlaine R; Matias, Stella V B G; Macedo, Renato L G; Freitas, Matheus P; Venturin, Nelson

    2013-09-11

    Two of the major weeds affecting cereal crops worldwide are Avena fatua L. (wild oat) and Lolium rigidum Gaud. (rigid ryegrass). Thus, the development of new herbicides against these weeds is required; in line with this, benzoxazinones, their degradation products, and analogues have been shown to be important allelochemicals and natural herbicides. Although earlier structure-activity studies demonstrated that the hydrophobicity (log P) of aminophenoxazines correlates with phytotoxicity, our findings for a series of benzoxazinone derivatives show no relationship between phytotoxicity and log P, nor with two other commonly used molecular descriptors. On the other hand, a quantitative structure-activity relationship (QSAR) analysis based on molecular graphs representing structural shape, atomic sizes, and colors to encode other atomic properties performed very accurately for the prediction of the phytotoxicities of these compounds against wild oat and rigid ryegrass. Therefore, these QSAR models can be used to estimate the phytotoxicity of new congeners of benzoxazinone herbicides toward A. fatua L. and L. rigidum Gaud.

  10. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere and recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  11. Three-Dimensional Quantitative Morphometric Analysis (QMA) for In Situ Joint and Tissue Assessment of Osteoarthritis in a Preclinical Rabbit Disease Model.

    Directory of Open Access Journals (Sweden)

    Kathryn S Stok

    This work utilises advances in multi-tissue imaging, and incorporates new metrics which define in situ joint changes and individual tissue changes in osteoarthritis (OA). The aims are to (1) demonstrate a protocol for processing intact animal joints for microCT to visualise relevant joint, bone and cartilage structures for understanding OA in a preclinical rabbit model, and (2) introduce a comprehensive three-dimensional (3D) quantitative morphometric analysis (QMA), including an assessment of reproducibility. Sixteen rabbit joints with and without transection of the anterior cruciate ligament were scanned with microCT and contrast agents, and processed for histology. Semi-quantitative evaluation was performed on matching two-dimensional (2D) histology and microCT images. Subsequently, 3D QMA was performed, including measures of cartilage, subchondral cortical and epiphyseal bone, and novel tibio-femoral joint metrics. Reproducibility of the QMA was tested on seven additional joints. A significant correlation was observed in cartilage thickness from matching histology-microCT pairs. The lateral compartment of operated joints had larger joint space width, thicker femoral cartilage and reduced bone volume, while osteophytes could be detected quantitatively. Measures between the in situ tibia and femur indicated an altered loading scenario. High measurement reproducibility was observed for all new parameters, with ICC ranging from 0.754 to 0.998. In conclusion, this study provides a novel 3D QMA to quantify macro and micro tissue measures in the joint of a rabbit OA model. New metrics were established consisting of: an angle to quantitatively measure osteophytes (σ), an angle to indicate erosion between the lateral and medial femoral condyles (ρ), a vector defining altered angulation (λ, α, β, γ), a twist angle (τ) measuring instability and tissue degeneration between the femur and tibia, a length measure of joint space width (JSW), and a slope and

  12. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that, although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, developed largely in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper is to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300) indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  13. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  14. Quantitative ADF STEM: acquisition, analysis and interpretation

    International Nuclear Information System (INIS)

    Jones, L

    2016-01-01

    Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in the post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation and limitations. (paper)

  15. Database design and implementation for quantitative image analysis research.

    Science.gov (United States)

    Brown, Matthew S; Shah, Sumit K; Pais, Richard C; Lee, Yeng-Zhong; McNitt-Gray, Michael F; Goldin, Jonathan G; Cardenas, Alfonso F; Aberle, Denise R

    2005-03-01

    Quantitative image analysis (QIA) goes beyond subjective visual assessment to provide computer measurements of the image content, typically following image segmentation to identify anatomical regions of interest (ROIs). Commercially available picture archiving and communication systems focus on storage of image data. They are not well suited to efficient storage and mining of new types of quantitative data. In this paper, we present a system that integrates image segmentation, quantitation, and characterization with database and data mining facilities. The paper includes generic process and data models for QIA in medicine and describes their practical use. The data model is based upon the Digital Imaging and Communications in Medicine (DICOM) data hierarchy, which is augmented with tables to store segmentation results (ROIs) and quantitative data from multiple experiments. Data mining for statistical analysis of the quantitative data is described along with example queries. The database is implemented in PostgreSQL on a UNIX server. Database requirements and capabilities are illustrated through two quantitative imaging experiments related to lung cancer screening and assessment of emphysema lung disease. The system can manage the large amounts of quantitative data necessary for research, development, and deployment of computer-aided diagnosis tools.
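
    The described augmentation (a DICOM-like hierarchy extended with tables for segmentation results and per-experiment quantitative features) can be sketched in a few DDL statements. This is a hypothetical simplification of the paper's PostgreSQL schema; SQLite is used only to keep the sketch self-contained.

        import sqlite3

        ddl = """
        CREATE TABLE series  (series_uid TEXT PRIMARY KEY, study_uid TEXT,
                              modality TEXT);
        CREATE TABLE roi     (roi_id INTEGER PRIMARY KEY,
                              series_uid TEXT REFERENCES series,
                              anatomy TEXT, mask_path TEXT);
        CREATE TABLE feature (roi_id INTEGER REFERENCES roi, experiment TEXT,
                              name TEXT, value REAL, unit TEXT);
        """
        con = sqlite3.connect(":memory:")
        con.executescript(ddl)

        # example mining query: mean score per anatomical region across experiments
        q = """SELECT anatomy, AVG(value) FROM roi JOIN feature USING (roi_id)
               WHERE name = 'emphysema_index' GROUP BY anatomy;"""
        print(con.execute(q).fetchall())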

  16. Quantitative analysis and biophysically realistic neural modeling of the MEG mu rhythm: rhythmogenesis and modulation of sensory-evoked responses.

    Science.gov (United States)

    Jones, Stephanie R; Pritchett, Dominique L; Sikora, Michael A; Stufflebeam, Steven M; Hämäläinen, Matti; Moore, Christopher I

    2009-12-01

    Variations in cortical oscillations in the alpha (7-14 Hz) and beta (15-29 Hz) range have been correlated with attention, working memory, and stimulus detection. The mu rhythm recorded with magnetoencephalography (MEG) is a prominent oscillation generated by Rolandic cortex containing alpha and beta bands. Despite its prominence, the neural mechanisms regulating mu are unknown. We characterized the ongoing MEG mu rhythm from a localized source in the finger representation of primary somatosensory (SI) cortex. Subjects showed variation in the relative expression of mu-alpha or mu-beta, which were nonoverlapping for roughly 50% of their respective durations on single trials. To delineate the origins of this rhythm, a biophysically principled computational neural model of SI was developed, with distinct laminae, inhibitory and excitatory neurons, and feedforward (FF, representative of lemniscal thalamic drive) and feedback (FB, representative of higher-order cortical drive or input from nonlemniscal thalamic nuclei) inputs defined by the laminar location of their postsynaptic effects. The mu-alpha component was accurately modeled by rhythmic FF input at approximately 10 Hz. The mu-beta component was accurately modeled by the addition of approximately 10-Hz FB input that was nearly synchronous with the FF input. The relative dominance of these two frequencies depended on the delay between FF and FB drives, their relative input strengths, and stochastic changes in these variables. The model also reproduced key features of the impact of high prestimulus mu power on peaks in SI-evoked activity. For stimuli presented during high mu power, the model predicted enhancement in an initial evoked peak and decreased subsequent deflections. In agreement, the MEG-evoked responses showed an enhanced initial peak and a trend to smaller subsequent peaks. These data provide new information on the dynamics of the mu rhythm in humans and the model provides a novel mechanistic

  17. Applications of advanced kinetic collisional radiative modeling and Bremsstrahlung emission to quantitative impurity analysis on the National Spherical Torus Experiment

    Science.gov (United States)

    Muñoz Burgos, J. M.; Tritz, K.; Stutman, D.; Bell, R. E.; LeBlanc, B. P.; Sabbagh, S. A.

    2015-12-01

    An advanced kinetic collisional radiative model is used to predict beam-into-plasma charge-exchange visible and extreme-UV (XUV, ~50-700 Å) light emission to quantify impurity density profiles on NSTX. This kinetic model is first benchmarked by predicting line-of-sight integrated emission for the visible λ = 5292.0 Å line of carbon (C VI n = 8 → 7), and comparing these predictions to absolutely calibrated measurements from the active CHarge-Exchange Recombination Spectroscopy (CHERS) diagnostic on NSTX. Once benchmarked, the model is used to predict charge-exchange emission for the 182.1 Å line of carbon (C VI n = 3 → 2), which is used to scale the Bremsstrahlung continuum emission in the UV/XUV region. The scaled Bremsstrahlung emission is used as a base to estimate an absolute intensity calibration curve for the XUV Transmission Grating-based Imaging Spectrometer (TGIS) diagnostic installed on the National Spherical Torus Experiment (NSTX and its upgrade, NSTX-U). The TGIS diagnostic operates in the wavelength region ~50-700 Å and is used to measure impurity spectra from charge-exchange emission. Impurity densities are estimated by fitting synthetic emission from the kinetic charge-exchange model to TGIS spectral measurements.

  18. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    Science.gov (United States)

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in the model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the input data, and its effect on model results has been quantified by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high-tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
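
    As an illustration of the screening step, the sketch below runs a Morris analysis on a toy one-compartment body-burden model standing in for MERLIN-Expo. It assumes the SALib Python library's Morris sampler and analyzer (not used in the paper); the model, parameter names and bounds are invented.

        import numpy as np
        from SALib.sample.morris import sample as morris_sample
        from SALib.analyze import morris

        # toy model: steady-state blood concentration =
        # daily intake / (elimination rate * distribution volume)
        problem = {
            "num_vars": 3,
            "names": ["intake_ug_per_d", "half_life_d", "volume_L"],
            "bounds": [[0.1, 2.0], [100.0, 4000.0], [40.0, 80.0]],
        }
        X = morris_sample(problem, N=200, num_levels=4)
        k = np.log(2) / X[:, 1]                 # elimination rate (1/d)
        Y = X[:, 0] / (k * X[:, 2])             # concentration (ug/L)
        Si = morris.analyze(problem, X, Y, num_levels=4)
        print(dict(zip(problem["names"], Si["mu_star"])))   # screening ranking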

  19. Simplification of complex kinetic models used for the quantitative analysis of nuclear magnetic resonance or radioactive tracer studies

    International Nuclear Information System (INIS)

    Schuster, R.; Schuster, S.; Holzhuetter, H.-G.

    1992-01-01

    A method for simplifying the mathematical models describing the dynamics of tracers (e.g. ¹³C, ³¹P, ¹⁴C, as used in NMR studies or radioactive tracer experiments) in (bio-)chemical reaction systems is presented. This method is appropriate in the cases where the system includes reactions, the rates of which differ by several orders of magnitude. The basic idea is to adapt the rapid-equilibrium approximation to tracer systems. It is shown with the aid of the Perron-Frobenius theorem that for tracer systems, the conditions for applicability of this approximation are satisfied whenever some reactions are near equilibrium. It turns out that the specific enrichments of all of the labelled atoms that are connected by fast reversible reactions can be grouped together as 'pool variables'. The reduced system contains fewer parameters and can, thus, be fitted more easily to experimental data. Moreover, the method can be employed for identifying non-equilibrium and near-equilibrium reactions from experimentally measured specific enrichments of tracer. The reduction algorithm is illustrated by studying a model of the distribution of ¹³C-tracers in the pentose phosphate pathway. (author)

  20. Quantitative genetic analysis of brain size variation in sticklebacks: support for the mosaic model of brain evolution.

    Science.gov (United States)

    Noreikiene, Kristina; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Husby, Arild; Merilä, Juha

    2015-07-07

    The mosaic model of brain evolution postulates that different brain regions are relatively free to evolve independently from each other. Such independent evolution is possible only if genetic correlations among the different brain regions are less than unity. We estimated heritabilities, evolvabilities and genetic correlations of relative size of the brain, and its different regions in the three-spined stickleback (Gasterosteus aculeatus). We found that heritabilities were low (average h² = 0.24), suggesting a large plastic component to brain architecture. However, evolvabilities of different brain parts were moderate, suggesting the presence of additive genetic variance to sustain a response to selection in the long term. Genetic correlations among different brain regions were low (average rG = 0.40) and significantly less than unity. These results, along with those from analyses of phenotypic and genetic integration, indicate a high degree of independence between different brain regions, suggesting that responses to selection are unlikely to be severely constrained by genetic and phenotypic correlations. Hence, the results give strong support for the mosaic model of brain evolution. However, the genetic correlation between brain and body size was high (rG = 0.89), suggesting a constraint for independent evolution of brain and body size in sticklebacks. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. A Quantitative Genomic Approach for Analysis of Fitness and Stress Related Traits in a Drosophila melanogaster Model Population

    Directory of Open Access Journals (Sweden)

    Palle Duun Rohde

    2016-01-01

    The ability of natural populations to withstand environmental stresses relies partly on their adaptive ability. In this study, we used a subset of the Drosophila Genetic Reference Panel, a population of inbred, genome-sequenced lines derived from a natural population of Drosophila melanogaster, to investigate whether this population harbors genetic variation for a set of stress resistance and life history traits. Using a genomic approach, we found substantial genetic variation for metabolic rate, heat stress resistance, expression of a major heat shock protein, and egg-to-adult viability investigated at a benign and a higher stressful temperature. This suggests that these traits will be able to evolve. In addition, we outline an approach to conduct pathway associations based on genomic linear models, which has potential to identify adaptive genes and pathways, and therefore can be a valuable tool in conservation genomics.

  2. A Quantitative Genomic Approach for Analysis of Fitness and Stress Related Traits in a Drosophila melanogaster Model Population

    DEFF Research Database (Denmark)

    Rohde, Palle Duun; Krag, Kristian; Loeschcke, Volker

    2016-01-01

    The ability of natural populations to withstand environmental stresses relies partly on their adaptive ability. In this study, we used a subset of the Drosophila Genetic Reference Panel, a population of inbred, genome-sequenced lines derived from a natural population of Drosophila melanogaster, to investigate whether this population harbors genetic variation for a set of stress resistance and life history traits. Using a genomic approach, we found substantial genetic variation for metabolic rate, heat stress resistance, expression of a major heat shock protein, and egg-to-adult viability investigated at a benign and a higher stressful temperature. This suggests that these traits will be able to evolve. In addition, we outline an approach to conduct pathway associations based on genomic linear models, which has potential to identify adaptive genes and pathways, and therefore can be a valuable tool in conservation genomics.

  3. Label-free quantitative analysis of the casein kinase 2-responsive phosphoproteome of the marine minimal model species Ostreococcus tauri.

    Science.gov (United States)

    Le Bihan, Thierry; Hindle, Matthew; Martin, Sarah F; Barrios-Llerena, Martin E; Krahmer, Johanna; Kis, Katalin; Millar, Andrew J; van Ooijen, Gerben

    2015-12-01

    Casein kinase 2 (CK2) is a protein kinase that phosphorylates a plethora of cellular target proteins involved in processes including DNA repair, cell cycle control, and circadian timekeeping. CK2 is functionally conserved across eukaryotes, although the substrate proteins identified in a range of complex tissues are often different. The marine alga Ostreococcus tauri is a unicellular eukaryotic model organism ideally suited to efficiently study generic roles of CK2 in the cellular circadian clock. Overexpression of CK2 leads to a slow circadian rhythm, verifying functional conservation of CK2 in timekeeping. The proteome was analysed in wild-type and CK2-overexpressing algae at dawn and dusk, revealing that differential abundance of the global proteome across the day is largely unaffected by overexpression. However, CK2 activity contributed more strongly to timekeeping at dusk than at dawn. The phosphoproteome of a CK2 overexpression line and cells treated with CK2 inhibitor was therefore analysed and compared to control cells at dusk. We report an extensive catalogue of 447 unique CK2-responsive differential phosphopeptide motifs to inform future studies into CK2 activity in the circadian clock of more complex tissues. All MS data have been deposited in the ProteomeXchange with identifier PXD000975 (http://proteomecentral.proteomexchange.org/dataset/PXD000975). © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Thrombin effectuates therapeutic arteriogenesis in the rabbit hindlimb ischemia model: A quantitative analysis by computerized in vivo imaging

    International Nuclear Information System (INIS)

    Kagadis, George C.; Karnabatidis, Dimitrios; Katsanos, Konstantinos; Diamantopoulos, Athanassios; Samaras, Nikolaos; Maroulis, John; Siablis, Dimitrios; Nikiforidis, George C.

    2006-01-01

    We report on an experimental mammalian controlled study that documents the arteriogenic capacity of thrombin and utilizes computerized algorithms to quantify the newly formed vessels. Hindlimb ischemia was surgically invoked in 10 New Zealand white rabbits. After quiescence of endogenous angiogenesis, heterologous bovine thrombin was intramuscularly injected (1500 units) in one hindlimb per rabbit (Group T). Contralateral limbs were infused with normal saline (Group C). Digital subtraction angiography (DSA) of both limbs was performed after thrombin infusion by selective cannulation of the abdominal aorta, and digital images were post-processed with computerized algorithms in order to enhance newly formed vessels. Total vessel area and total vessel length were quantified. In vivo functional evaluation included measurements of blood flow volume at the level of the external iliac artery by Doppler ultrasonography both at baseline and at 20 days after thrombin infusion. Total vessel area and length (in pixels) were 14,713 ± 1023 and 5466 ± 1327 in group T versus 12,015 ± 2557 and 4598 ± 1269 in group C (p = 0.0062 and 0.1526, respectively). Blood flow volumes (ml/min) at baseline and at 20 days after thrombin infusion were 25.87 ± 11.09 and 38.06 ± 11.72 in group T versus 26.57 ± 11.19 and 20.35 ± 7.20 in group C (p = 0.8898 and 0.0007, respectively). Intramuscular thrombin effectuates an arteriogenic response in the rabbit hindlimb ischemia model. Computerized algorithms may enable accurate quantification of the neovascularization outcome

  5. Spherical blurred shape model for 3-D object and pose recognition: quantitative analysis and HCI applications in smart environments.

    Science.gov (United States)

    Lopes, Oscar; Reyes, Miguel; Escalera, Sergio; Gonzàlez, Jordi

    2014-12-01

    The use of depth maps is of increasing interest after the advent of cheap multisensor devices based on structured light, such as Kinect. In this context, there is a strong need for powerful 3-D shape descriptors able to generate rich object representations. Although several 3-D descriptors have already been proposed in the literature, the search for discriminative and computationally efficient descriptors is still an open issue. In this paper, we propose a novel point cloud descriptor called the spherical blurred shape model (SBSM) that successfully encodes the structure density and local variabilities of an object based on shape voxel distances and a neighborhood propagation strategy. The proposed SBSM is proven to be rotation and scale invariant, robust to noise and occlusions, highly discriminative for multiple categories of complex objects like the human hand, and computationally efficient, since the SBSM complexity is linear in the number of object voxels. Experimental evaluation on public multiclass depth object data, 3-D facial expression data, and a novel hand pose data set shows significant performance improvements in relation to state-of-the-art approaches. Moreover, the effectiveness of the proposal is also proved for object spotting in 3-D scenes and for real-time automatic hand pose recognition in human-computer interaction scenarios.

  6. Kinetic Modeling of ABCG2 Transporter Heterogeneity: A Quantitative, Single-Cell Analysis of the Side Population Assay.

    Directory of Open Access Journals (Sweden)

    Adam F Prasanphanich

    2016-11-01

    The side population (SP) assay, a technique used in cancer and stem cell research, assesses the activity of ABC transporters on Hoechst staining in the presence and absence of transporter inhibition, identifying SP and non-SP cell (NSP) subpopulations by differential staining intensity. The interpretation of the assay is complicated because the transporter-mediated mechanisms fail to account for cell-to-cell variability within a population or adequately control the direct role of transporter activity on staining intensity. We hypothesized that differences in dye kinetics at the single-cell level, such as ABCG2 transporter-mediated efflux and DNA binding, are responsible for the differential cell staining that demarcates SP/NSP identity. We report changes in A549 phenotype during time in culture and with TGFβ treatment that correlate with SP size. Clonal expansion of individually sorted cells re-established both SP and NSPs, indicating that SP membership is dynamic. To assess the validity of a purely kinetics-based interpretation of SP/NSP identity, we developed a computational approach that simulated cell staining within a heterogeneous cell population; this exercise allowed for the direct inference of the role of transporter activity and inhibition on cell staining. Our simulated SP assay yielded appropriate SP responses for kinetic scenarios in which high transporter activity existed in a portion of the cells and little differential staining occurred in the majority of the population. With our approach for single-cell analysis, we observed SP and NSP cells at both ends of a transporter activity continuum, demonstrating that features of transporter activity as well as DNA content are determinants of SP/NSP identity.

  7. Kinetic Modeling of ABCG2 Transporter Heterogeneity: A Quantitative, Single-Cell Analysis of the Side Population Assay

    Science.gov (United States)

    Prasanphanich, Adam F.; White, Douglas E.; Gran, Margaret A.

    2016-01-01

    The side population (SP) assay, a technique used in cancer and stem cell research, assesses the activity of ABC transporters on Hoechst staining in the presence and absence of transporter inhibition, identifying SP and non-SP cell (NSP) subpopulations by differential staining intensity. The interpretation of the assay is complicated because the transporter-mediated mechanisms fail to account for cell-to-cell variability within a population or adequately control the direct role of transporter activity on staining intensity. We hypothesized that differences in dye kinetics at the single-cell level, such as ABCG2 transporter-mediated efflux and DNA binding, are responsible for the differential cell staining that demarcates SP/NSP identity. We report changes in A549 phenotype during time in culture and with TGFβ treatment that correlate with SP size. Clonal expansion of individually sorted cells re-established both SP and NSPs, indicating that SP membership is dynamic. To assess the validity of a purely kinetics-based interpretation of SP/NSP identity, we developed a computational approach that simulated cell staining within a heterogeneous cell population; this exercise allowed for the direct inference of the role of transporter activity and inhibition on cell staining. Our simulated SP assay yielded appropriate SP responses for kinetic scenarios in which high transporter activity existed in a portion of the cells and little differential staining occurred in the majority of the population. With our approach for single-cell analysis, we observed SP and NSP cells at both ends of a transporter activity continuum, demonstrating that features of transporter activity as well as DNA content are determinants of SP/NSP identity. PMID:27851764

  8. Sleep/Wake Physiology and Quantitative Electroencephalogram Analysis of the Neuroligin-3 Knockout Rat Model of Autism Spectrum Disorder.

    Science.gov (United States)

    Thomas, Alexia M; Schwartz, Michael D; Saxe, Michael D; Kilduff, Thomas S

    2017-10-01

    Neuroligin-3 (NLGN3) is one of the many genes associated with autism spectrum disorder (ASD). Sleep dysfunction is highly prevalent in ASD, but has not been rigorously examined in ASD models. Here, we evaluated sleep/wake physiology and behavioral phenotypes of rats with genetic ablation of Nlgn3. Male Nlgn3 knockout (KO) and wild-type (WT) rats were assessed using a test battery for ASD-related behaviors and also implanted with telemeters to record the electroencephalogram (EEG), electromyogram, body temperature, and locomotor activity. 24-h EEG recordings were analyzed for sleep/wake states and spectral composition. Nlgn3 KO rats were hyperactive, exhibited excessive chewing behavior, and had impaired prepulse inhibition to an auditory startle stimulus. KO rats also spent less time in non-rapid eye movement (NREM) sleep, more time in rapid eye movement (REM) sleep, exhibited elevated theta power (4-9 Hz) during wakefulness and REM, and elevated delta power (0.5-4 Hz) during NREM. Beta (12-30 Hz) power and gamma (30-50 Hz) power were suppressed across all vigilance states. The sleep disruptions in Nlgn3 KO rats are consistent with observations of sleep disturbances in ASD patients. The EEG provides objective measures of brain function to complement rodent behavioral analyses and therefore may be a useful tool to study ASD. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  9. Host inflammatory response to polypropylene implants: insights from a quantitative immunohistochemical and birefringence analysis in a rat subcutaneous model

    Science.gov (United States)

    Prudente, Alessandro; Fávaro, Wágner José; Latuf, Paulo; Riccetto, Cássio Luis Zanettini

    2016-01-01

    ABSTRACT Objectives: To describe acute and subacute aspects of the histological and immunohistochemical response to a PP implant in a rat subcutaneous model based on objective methods. Materials and Methods: Thirty rats had a PP mesh subcutaneously implanted, with the same dissection performed on the other side of the abdomen but without mesh (sham). The animals were euthanized after 4 and 30 days. Six slides were prepared using the tissue removed: one stained with hematoxylin-eosin (inflammation assessment); one unstained (birefringence evaluation); and four slides for immunohistochemical processing: IL-1 and TNF-α (pro-inflammatory cytokines), MMP-2 (collagen metabolism) and CD-31 (angiogenesis). The area of inflammation, the birefringence index, the area of immunoreactivity and the number of vessels were objectively measured. Results: A larger area of inflammatory reaction was observed in PP compared to sham on the 4th and on the 30th day (p=0.0002). After 4 days, PP presented higher TNF (p=0.0001) immunoreactivity than sham, and no differences were observed in MMP-2 (p=0.06) and IL-1 (p=0.08). After 30 days, a reduction of IL-1 (p=0.010) and TNF (p=0.016) for PP and of IL-1 (p=0.010) for sham was observed. Moreover, the area of MMP-2 immunoreactivity decreased over time for the PP group (p=0.018). The birefringence index and vessel counting showed no differences between PP and sham (p=0.27 and p=0.58, respectively). Conclusions: The implantation of monofilament and macroporous polypropylene in the subcutaneous tissue of rats resulted in increased inflammatory activity and higher TNF production in the early post-implant phase. After 30 days, PP showed similar cytokine immunoreactivity, vessel density and extracellular matrix organization. PMID:27286125

  10. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for the quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
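
    As an illustration of one such characteristic, without reproducing Quantiprot's actual API, the plain-Python sketch below estimates the Zipf's law coefficient from the rank-frequency distribution of n-grams in a sequence; the example sequence is arbitrary.

        from collections import Counter
        import numpy as np

        def zipf_coefficient(sequence, n=2):
            """Rank n-gram frequencies and fit log(freq) ~ -s*log(rank);
            the slope s is the Zipf's law coefficient of the sequence."""
            grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
            freqs = np.array(sorted(Counter(grams).values(), reverse=True), float)
            ranks = np.arange(1, len(freqs) + 1)
            slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
            return -slope

        print(zipf_coefficient("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", n=2))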

  11. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Semiautomated quantitative image analysis of glomerular immunohistochemistry markers desmin, vimentin, podocin, synaptopodin and WT-1 in acute and chronic rat kidney disease models.

    Science.gov (United States)

    Funk, J; Ott, V; Herrmann, A; Rapp, W; Raab, S; Riboulet, W; Vandjour, A; Hainaut, E; Benardeau, A; Singer, T; Jacobsen, B

    2016-03-01

    Five different glomerular immunohistochemistry markers were evaluated and compared in four different acute and chronic rat kidney disease models. Progression of glomerular or podocyte damage was shown in the puromycin aminonucleoside nephrosis (PAN) and Zucker fatty/spontaneously hypertensive heart failure F1 hybrid (ZSF1) rat models. Progression and prevention of glomerular damage were demonstrated in the Zucker diabetic fatty (ZDF) and Dahl salt-sensitive (Dahl SS) rats. Immunohistochemistry was performed for desmin, vimentin, podocin, synaptopodin and Wilms tumor protein-1 (WT-1), and evaluation of glomerular immunohistochemistry markers was done by semiautomated quantitative image analysis. We found desmin and WT-1 to be the most sensitive markers for podocyte damage in both acute and chronic glomerular damage, followed by vimentin, podocin and synaptopodin. We were able to demonstrate that early podocyte damage, as shown by increased desmin and vimentin staining together with either a phenotypic podocyte change or podocyte loss (reduced numbers of WT-1-stained podocytes), drives the progression of glomerular damage. This is followed by a reduction in podocyte-specific proteins such as podocin and synaptopodin. Our report describes the different sensitivities of glomerular or podocyte markers and gives future guidance for the selection of the most sensitive markers for efficacy testing of new drugs as well as for the selection of tissue-based toxicity markers for glomerular or podocyte injury. In addition to functional clinical chemistry markers, desmin and WT-1 immunohistochemistry offers reliable and valuable data on the morphologic state of podocytes.

  13. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain time after exercise to those immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial images, correcting for Tl-201 washout from the myocardium under the assumption that washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial images after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
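
    The three indices are simple count ratios and can be sketched directly; the variable names and exact definitions here are illustrative, and the paper's precise formulas may differ.

        def washout_factor(counts_delayed, counts_initial):
            """Fractional Tl-201 washout between the initial (post-exercise)
            and delayed images, assumed uniform across the myocardium."""
            return counts_delayed / counts_initial

        def vitality_index(roi_uptake, max_uptake):
            """Uptake in a region of interest relative to the maximum uptake."""
            return roi_uptake / max_uptake

        def redistribution_factor(roi_delayed, roi_initial, washout):
            """Redistribution in an ROI on serial images, corrected for the
            global washout factor."""
            return (roi_delayed / roi_initial) / washout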

  14. Genomewide rapid association using mixed model and regression: A fast and simple method for genomewide pedigree-based quantitative trait loci association analysis

    NARCIS (Netherlands)

    Y.S. Aulchenko (Yurii); D.-J. de Koning; C. Haley (Chris)

    2007-01-01

    For pedigree-based quantitative trait loci (QTL) association analysis, a range of methods utilizing within-family variation such as transmission-disequilibrium test (TDT)-based methods have been developed. In scenarios where stratification is not a concern, methods exploiting

  15. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    The aim of this study is to apply quantitative phase analysis (QPA) by neutron diffraction to the round-robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples with several different weight percentages and their unique characteristic features. The neutron diffraction method is known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for particular samples. Neutron diffraction is especially capable for oxides, owing to the scattering cross-section of oxygen, and with quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. Through this study, we hope not only to carry out a performance test of our HRPD instrument but also to improve our ability to analyse neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
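
    For reference, Rietveld-based QPA commonly obtains weight fractions from the refined phase scale factors via the Hill-Howard relation (a standard result quoted here for context, not taken from this report):

        w_i = \frac{S_i \, (Z M V)_i}{\sum_j S_j \, (Z M V)_j}

    where, for phase i, S_i is the Rietveld scale factor, Z the number of formula units per unit cell, M the mass of a formula unit and V the unit-cell volume.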

  16. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product-specific (iron ore, bauxite, alumina, mineral sands, cement, etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat samples, metal drillings, thin layers on substrates, etc.) which may also contain elements that are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  17. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to measure the darkness (density) of spectral lines. By analyzing these lines, the contents of a sample and their concentrations can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating the photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration, a least-squares fit to the measured data is applied to obtain a graph, from which the density of a dark spectral line can be related to the incident light intensity registered by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, a working curve must be found for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. Automatic data acquisition, calculation and output of results are handled by a computer (PC) and a computer program. Signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible.
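
    The three steps map naturally onto a short numerical routine. A minimal sketch follows, assuming simple linear fits in log space; the data points, the linear calibration model and the log-log working curve are illustrative assumptions rather than details taken from the record.

      import numpy as np

      # Step 1: emulsion calibration -- least-squares fit relating measured
      # line density to log relative intensity (illustrative linear model).
      density = np.array([0.20, 0.45, 0.80, 1.10, 1.38])
      log_intensity = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
      cal = np.polyfit(density, log_intensity, 1)

      # Step 2: working curve -- log intensity versus log concentration,
      # built from standards of the element of interest.
      log_conc_std = np.array([-1.0, -0.5, 0.0, 0.5])
      log_int_std = np.array([0.3, 0.8, 1.3, 1.8])
      work = np.polyfit(log_int_std, log_conc_std, 1)

      # Step 3: analytical result -- convert a measured line density to
      # intensity via the calibration curve, then to concentration.
      log_i = np.polyval(cal, 0.95)
      conc = 10 ** np.polyval(work, log_i)
      print(f"estimated concentration: {conc:.3f} (arbitrary units)")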

  18. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  19. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed
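
    The report's classical and Bayesian treatments are built on lognormal degree-of-belief distributions. As a stand-in for the classical side, the sketch below pools several experts' lognormal judgements of an initiator frequency by a precision-weighted geometric mean; the medians, error factors and weighting scheme are illustrative assumptions, not the report's calibrated procedure.

      import numpy as np

      # Each expert states a median and a 90% error factor for the frequency;
      # a lognormal distribution is fitted to each judgement.
      medians = np.array([1e-4, 3e-4, 5e-5])        # events per year
      error_factors = np.array([3.0, 10.0, 5.0])
      mu = np.log(medians)                          # lognormal location
      sigma = np.log(error_factors) / 1.645         # 90% bound -> sigma

      # Classical pooling: precision-weighted geometric mean of the medians.
      w = 1.0 / sigma**2
      w /= w.sum()
      mu_pool = np.sum(w * mu)
      sigma_pool = np.sqrt(1.0 / np.sum(1.0 / sigma**2))
      print(f"pooled median = {np.exp(mu_pool):.2e} per year, "
            f"pooled EF = {np.exp(1.645 * sigma_pool):.1f}")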

  20. An in vitro biofilm model associated to dental implants: structural and quantitative analysis of in vitro biofilm formation on different dental implant surfaces.

    Science.gov (United States)

    Sánchez, M C; Llama-Palacios, A; Fernández, E; Figuero, E; Marín, M J; León, R; Blanc, V; Herrera, D; Sanz, M

    2014-10-01

    The impact of implant surfaces on dental biofilm development is presently unknown. The aim of this investigation was to assess in vitro the development of a complex biofilm model on titanium and zirconium implant surfaces, and to compare it with the same biofilm formed on a hydroxyapatite surface. Six standard reference strains were used to develop an in vitro biofilm over sterile titanium, zirconium and hydroxyapatite discs, coated with saliva, within the wells of pre-sterilized polystyrene tissue culture plates. The selected species represent initial (Streptococcus oralis and Actinomyces naeslundii), early (Veillonella parvula), secondary (Fusobacterium nucleatum) and late colonizers (Porphyromonas gingivalis and Aggregatibacter actinomycetemcomitans). The developed biofilms (growth times of 1 to 120 h) were studied with confocal laser scanning microscopy using a vital fluorescence technique and with low-temperature scanning electron microscopy. The number (colony-forming units/biofilm) and kinetics of the bacteria within the biofilm were studied with quantitative PCR (qPCR). As outcome variables, the biofilm thickness, the percentage of cell vitality and the number of bacteria were compared using analysis of variance. The bacteria adhered and matured within the biofilm over the three surfaces with similar dynamics. Different surfaces, however, demonstrated differences in thickness, in the deposition of the extracellular polysaccharide matrix, and in the organization of the bacterial cells. While the formation and dynamics of the in vitro biofilm model were similar irrespective of the surface of inoculation (hydroxyapatite, titanium or zirconium), there were significant differences with regard to biofilm thickness and three-dimensional structure. Copyright © 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  1. Quantitative analysis of retinal changes in hypertension

    Science.gov (United States)

    Giansanti, Roberto; Boemi, Massimo; Fumelli, Paolo; Passerini, Giorgio; Zingaretti, Primo

    1995-05-01

    Arterial hypertension is a high-prevalence disease in Western countries and is associated with increased risk for cardiovascular accidents. Retinal vessel changes are common findings in patients suffering from long-standing hypertensive disease. Morphological evaluation of the fundus oculi represents a fundamental tool in the clinical approach to the patient with hypertension. A qualitative analysis of the retinal lesions is usually performed, and this implies severe limitations both in the classification of the different degrees of the pathology and in the follow-up of the disease. A diagnostic system based on a quantitative analysis of the retinal changes could overcome these problems. Our computerized approach was developed for this purpose. The paper concentrates on the results and the implications of a computerized approach to the automatic extraction of numerical indexes describing morphological details of the fundus oculi. A previously developed image processing and recognition system, documented elsewhere and briefly described here, was successfully tested in pre-clinical experiments and applied in the evaluation of normal as well as pathological fundi. The software system was developed to extract indexes such as the caliber and path of vessels, the local tortuosity of arteries and arterioles, and the positions and angles of crossings between two vessels. The reliability of the results, supported by their low variability, makes feasible the standardization of quantitative parameters to be used both in the diagnosis and in the prognosis of hypertension, and also allows prospective studies based upon them.

  2. Impact of a Novel, Anti-microbial Dressing on In Vivo, Pseudomonas aeruginosa Wound Biofilm: Quantitative Comparative Analysis using a Rabbit Ear Model

    Science.gov (United States)

    2014-12-01

    Impact of a novel, antimicrobial dressing on in vivo Pseudomonas aeruginosa wound biofilm: quantitative comparative analysis using a rabbit ear model. Manuscript received: April 18, 2014; accepted in final form: September 4, 2014. DOI: 10.1111/wrr.12232. ABSTRACT: The importance of bacterial biofilms to

  3. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles: we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896
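
    The model class named here, maximum entropy with regularization-based structure learning, can be approximated compactly by pseudolikelihood: each binarized factor profile is regressed on all the others with an L1 penalty, and the sparse coefficients estimate the pairwise couplings. The sketch below is a toy illustration under that assumption; the simulated data and the use of scikit-learn are mine, not the authors' implementation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      # Toy data: rows = genomic bins, columns = binarized factor occupancy.
      X = (rng.random((5000, 8)) < 0.3).astype(int)
      X[:, 1] = X[:, 0] ^ (rng.random(5000) < 0.05)  # factor 1 tracks factor 0

      n_factors = X.shape[1]
      couplings = np.zeros((n_factors, n_factors))
      # Pseudolikelihood with L1 regularization: predict each factor from all
      # the others; the sparse coefficients approximate pairwise couplings of
      # an Ising-like maximum entropy model.
      for j in range(n_factors):
          others = np.delete(np.arange(n_factors), j)
          model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
          model.fit(X[:, others], X[:, j])
          couplings[j, others] = model.coef_[0]

      couplings = (couplings + couplings.T) / 2.0    # symmetrize estimates
      print(np.round(couplings, 2))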

  4. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-reviewed journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis-of-variance models. Additionally, though a vast set of resources exists on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  5. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  6. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
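
    As a simple illustration of how sensitivity and specificity are pooled from extracted 2x2 tables, the sketch below aggregates toy study counts. Real diagnostic-accuracy meta-analyses, including this one's AUC estimates, normally use bivariate random-effects models, so this fixed-effect aggregation is only a sketch with invented numbers.

      import numpy as np

      # Toy 2x2 counts per study: columns are TP, FP, FN, TN.
      studies = np.array([[45, 10,  5, 40],
                          [60, 20, 12, 70],
                          [30,  5,  8, 55]])

      tp, fp, fn, tn = studies.sum(axis=0)
      sensitivity = tp / (tp + fn)
      specificity = tn / (tn + fp)
      print(f"pooled sensitivity = {sensitivity:.2f}, "
            f"pooled specificity = {specificity:.2f}")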

  7. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations.... We present an algorithm for the translation of such models into Markov decision processes expressed in the syntax of the PRISM model checker. This enables analysis of business processes for the following properties: transient and steady-state probabilities, the timing, occurrence and ordering...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
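
    Full MDP analysis is PRISM's job, but the purely probabilistic fragment of such a workflow reduces to a Markov chain whose steady-state distribution can be computed directly. A minimal sketch, with invented workflow states and transition probabilities (the real translation targets PRISM's language, not numpy):

      import numpy as np

      # States: 0 = triage, 1 = treat, 2 = re-examine, 3 = discharge
      # (discharge loops back to triage to keep the chain ergodic).
      P = np.array([[0.0, 0.9, 0.1, 0.0],
                    [0.0, 0.0, 0.3, 0.7],
                    [0.5, 0.5, 0.0, 0.0],
                    [1.0, 0.0, 0.0, 0.0]])

      # Steady state: left eigenvector of P for eigenvalue 1, normalized.
      vals, vecs = np.linalg.eig(P.T)
      pi = np.real(vecs[:, np.argmax(np.real(vals))])
      pi /= pi.sum()
      print(np.round(pi, 3))   # long-run fraction of time in each state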

  8. Physiologically based quantitative modeling of unihemispheric sleep.

    Science.gov (United States)

    Kedziora, D J; Abeysuriya, R G; Phillips, A J K; Robinson, P A

    2012-12-07

    Unihemispheric sleep has been observed in numerous species, including birds and aquatic mammals. While knowledge of its functional role has been improved in recent years, the physiological mechanisms that generate this behavior remain poorly understood. Here, unihemispheric sleep is simulated using a physiologically based quantitative model of the mammalian ascending arousal system. The model includes mutual inhibition between wake-promoting monoaminergic nuclei (MA) and sleep-promoting ventrolateral preoptic nuclei (VLPO), driven by circadian and homeostatic drives as well as cholinergic and orexinergic input to MA. The model is extended here to incorporate two distinct hemispheres and their interconnections. It is postulated that inhibitory connections between VLPO nuclei in opposite hemispheres are responsible for unihemispheric sleep, and it is shown that contralateral inhibitory connections promote unihemispheric sleep while ipsilateral inhibitory connections promote bihemispheric sleep. The frequency of alternating unihemispheric sleep bouts is chiefly determined by sleep homeostasis and its corresponding time constant. It is shown that the model reproduces dolphin sleep, and that the sleep regimes of humans, cetaceans, and fur seals, the latter both terrestrially and in a marine environment, require only modest changes in contralateral connection strength and homeostatic time constant. It is further demonstrated that fur seals can potentially switch between their terrestrial bihemispheric and aquatic unihemispheric sleep patterns by varying just the contralateral connection strength. These results provide experimentally testable predictions regarding the differences between species that sleep bihemispherically and unihemispherically. Copyright © 2012 Elsevier Ltd. All rights reserved.
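
    The mechanism described, contralateral inhibition between sleep-promoting VLPO populations plus homeostatic drive, can be caricatured in a few ODEs. The sketch below is a deliberately simplified toy, not the published physiologically calibrated model; all parameter values are invented, and whether sleep bouts alternate between hemispheres depends on those choices.

      import numpy as np
      from scipy.integrate import solve_ivp

      def sig(x):
          """Sharp sigmoidal activation function."""
          return 1.0 / (1.0 + np.exp(-x / 0.1))

      def rhs(t, y, c=1.5, tau=0.2, theta=0.5):
          # v1, v2: sleep-active (VLPO-like) activity of each hemisphere;
          # h1, h2: per-hemisphere homeostatic sleep pressure.
          v1, v2, h1, h2 = y
          dv1 = (-v1 + sig(h1 - c * v2 - theta)) / tau   # contralateral
          dv2 = (-v2 + sig(h2 - c * v1 - theta)) / tau   # inhibition, gain c
          # Pressure builds while a hemisphere is awake, discharges in sleep.
          dh1 = 0.2 * (1.0 - v1) - 0.4 * v1
          dh2 = 0.2 * (1.0 - v2) - 0.4 * v2
          return [dv1, dv2, dh1, dh2]

      sol = solve_ivp(rhs, (0.0, 100.0), [0.0, 0.0, 0.6, 0.3], max_step=0.05)
      dominant = sol.y[0] > sol.y[1]   # which side is more sleep-active
      print(f"hemisphere 1 sleep-dominant fraction: {dominant.mean():.2f}")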

  9. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  10. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  11. Quantitative MR Image Analysis for Brain Tumor.

    Science.gov (United States)

    Shboul, Zeina A; Reza, Sayed M S; Iftekharuddin, Khan M

    2018-01-01

    This paper presents an integrated quantitative MR image analysis framework that includes all necessary steps: MRI inhomogeneity correction, feature extraction, multiclass feature selection and multimodality abnormal brain tissue segmentation. We first obtain a mathematical algorithm to compute a novel generalized multifractional Brownian motion (GmBm) texture feature. We then demonstrate the efficacy of multiple multiresolution texture features, including the regular fractal dimension (FD) texture and stochastic textures such as the multifractional Brownian motion (mBm) and GmBm features, for robust tumor and other abnormal tissue segmentation in brain MRI. We evaluate these texture and associated intensity features to effectively delineate multiple abnormal tissues within and around the tumor core, and stroke lesions, using large-scale public and private datasets.
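
    Of the texture features listed, the regular fractal dimension (FD) is the simplest to sketch; the box-counting estimator below is a generic illustration, not the authors' mBm/GmBm algorithms, and the binary mask it runs on is simulated.

      import numpy as np

      def box_counting_fd(mask, sizes=(2, 4, 8, 16, 32)):
          """Estimate the fractal dimension of a binary 2-D mask: count the
          boxes occupied at several scales and fit the slope of log(count)
          against log(1/size)."""
          counts = []
          h, w = mask.shape
          for s in sizes:
              trimmed = mask[: h - h % s, : w - w % s]
              boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
              counts.append(boxes.sum())
          slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
          return slope

      rng = np.random.default_rng(1)
      lesion = rng.random((128, 128)) > 0.7   # stand-in for segmented texture
      print(f"box-counting FD ~ {box_counting_fd(lesion):.2f}")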

  12. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbance or human perturbation acts upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitudes. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitudes explicit so that they can be used in qualitative analysis is described in this paper, taking advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to its qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are by no means ambiguous, solving in an elegant way the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.
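
    In loop-analysis terms, once the community (Jacobian) matrix is quantified, press-perturbation predictions follow from its negative inverse: entry (i, k) gives the response of species i's equilibrium abundance to a sustained input to species k. A sketch with an invented three-species matrix (in the paper the link magnitudes would come from the flow network):

      import numpy as np

      # Community matrix A: a_ij is the per-capita effect of species j on
      # species i. Rows/cols: 0 = plant, 1 = herbivore, 2 = predator.
      A = np.array([[-0.5, -0.3,  0.0],
                    [ 0.4, -0.2, -0.5],
                    [ 0.0,  0.3, -0.1]])

      # Press-perturbation responses: -inv(A); the signs give the qualitative
      # predictions, the magnitudes require the quantified link strengths.
      response = -np.linalg.inv(A)
      print(np.sign(response).astype(int))
      print(np.round(response, 2))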

  13. Quantitative Analysis of Tremors in Welders

    Directory of Open Access Journals (Sweden)

    Paul A. Nausieda

    2011-05-01

    Full Text Available Background: Workers chronically exposed to manganese in welding fumes may develop an extra-pyramidal syndrome with postural and action tremors. Objectives: To determine the utility of tremor analysis in distinguishing tremors among workers exposed to welding fumes, patients with Idiopathic Parkinson’s Disease (IPD) and patients with Essential Tremor (ET). Methods: Retrospective study of recorded tremor in subjects from academic Movement Disorders Clinics and welders. Quantitative tremor analysis was performed and associated with clinical status. Results: Postural tremor intensity was increased in welders and ET patients and was associated with visibly greater amplitude of tremor with arms extended. Mean center frequencies (Cf) of welders and patients with ET were significantly higher than the mean Cf of PD subjects. Although both the welders and the ET group exhibited a higher Cf with arms extended, welders could be distinguished from the ET subjects by a significantly lower Cf of the rest tremor than that measured in ET subjects. Conclusions: In the context of an appropriate exposure history and neurological examination, tremor analysis may be useful in the diagnosis of manganese-related extra-pyramidal manifestations.

  14. Longitudinal Association Analysis of Quantitative Traits

    Science.gov (United States)

    Fan, Ruzong; Zhang, Yiwei; Albert, Paul S.; Liu, Aiyi; Wang, Yuanjia; Xiong, Momiao

    2015-01-01

    Longitudinal genetic studies provide a valuable resource for exploring key genetic and environmental factors that affect complex traits over time. Genetic analysis of longitudinal data that incorporate temporal variations is important for understanding genetic architecture and biological variations of common complex diseases. Although they are important, there is a paucity of statistical methods to analyze longitudinal human genetic data. In this article, longitudinal methods are developed for temporal association mapping to analyze population longitudinal data. Both parametric and nonparametric models are proposed. The models can be applied to multiple diallelic genetic markers such as single-nucleotide polymorphisms and multiallelic markers such as microsatellites. By analytical formulae, we show that the models take both the linkage disequilibrium and temporal trends into account simultaneously. Variance-covariance structure is constructed to model the single measurement variation and multiple measurement correlations of an individual based on the theory of stochastic processes. Novel penalized spline models are used to estimate the time-dependent mean functions and regression coefficients. The methods were applied to analyze Framingham Heart Study data of Genetic Analysis Workshop (GAW) 13 and GAW 16. The temporal trends and genetic effects of the systolic blood pressure are successfully detected by the proposed approaches. Simulation studies were performed to find out that the nonparametric penalized linear model is the best choice in fitting real data. The research sheds light on the important area of longitudinal genetic analysis, and it provides a basis for future methodological investigations and practical applications. PMID:22965819

  15. Quantitative analysis of infrared contrast enhancement algorithms

    Science.gov (United States)

    Weith-Glushko, Seth; Salvaggio, Carl

    2007-04-01

    Dynamic range reduction and contrast enhancement are two image-processing methods that are required when developing thermal camera systems. The two methods must be performed in such a way that the high-dynamic-range imagery output from current sensors is compressed in a pleasing way for display on lower-dynamic-range monitors. This research examines a quantitative analysis of infrared contrast enhancement algorithms found in the literature and developed by the author. Four algorithms were studied, three of which were found in the literature and one developed by the author: tail-less plateau equalization (TPE), adaptive plateau equalization (APE), the method according to Aare Mällo (MEAM), and infrared multi-scale retinex (IMSR). TPE and APE are histogram-based methods, requiring the calculation of the probability density of digital counts within an image. MEAM and IMSR are frequency-domain methods, which operate on input imagery that has been split into components containing differing spatial frequency content. After a rate-of-growth analysis and a psychophysical trial were performed, MEAM was found to be the best algorithm.
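
    Plateau equalization, the family to which TPE and APE belong, clips the image histogram at a plateau value before building the equalization look-up table, which stops large uniform backgrounds from consuming the output range. A generic sketch, with an invented plateau value and simulated 14-bit imagery (the TPE/APE variants differ in how the plateau is chosen):

      import numpy as np

      def plateau_equalize(img, plateau=500, out_levels=256):
          """Histogram equalization with counts clipped at a plateau, a
          common dynamic-range-reduction step for high-bit-depth IR imagery."""
          hist = np.bincount(img.ravel(), minlength=int(img.max()) + 1)
          hist = np.minimum(hist, plateau)      # clip dominant background bins
          cdf = np.cumsum(hist).astype(float)
          cdf /= cdf[-1]
          lut = np.round(cdf * (out_levels - 1)).astype(np.uint8)
          return lut[img]

      rng = np.random.default_rng(2)
      raw = rng.normal(9000, 300, (240, 320)).clip(0, 16383).astype(np.int32)
      display = plateau_equalize(raw)
      print(display.dtype, display.min(), display.max())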

  16. Qualitative and quantitative analysis of lignocellulosic biomass using infrared spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Adapa, P.K.; Tabil, L.G. [Saskatchewan Univ., Saskatoon, SK (Canada). Dept. of Agricultural and Bioresource Engineering; Karunakaran, C. [Saskatchewan Univ., Saskatoon, SK (Canada). Canadian Light Source; Schoenau, G.J. [Saskatchewan Univ., Saskatoon, SK (Canada). Dept. of Mechanical Engineering

    2009-07-01

    Agricultural biomass or lignocellulosic residues such as barley, canola, oat and wheat straw have the potential to be used as feedstock for the biofuel industry. Low bulk density straw has to be processed and densified after harvest in order to facilitate efficient handling and transportation, and to realize any potential economic benefits. Preliminary predictive models were developed to calculate the quantities of the lignocellulosic components (cellulose, hemicelluloses and lignin) of agricultural biomass (barley, canola, oat and wheat straw) by using Fourier transform infrared spectroscopy (FT-IR). It was important to estimate critical parameters through analytical specification of lignocellulosic biomass, and consequently to develop and validate a procedure for the qualitative and quantitative analysis of cellulose-hemicellulose-lignin composition. The paper discussed the materials and methods, including sample material preparation; reference material preparation; measured data; FT-IR equipment; and quantitative analysis. 54 refs., 5 tabs., 10 figs.

  17. Analysis of correlation and causes for discrepancy between quantitative and semi-quantitative Doppler scores in synovitis in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Rezaei, Hamed; Af Klint, Erik; Hammer, Hilde Berner

    2017-01-01

    OBJECTIVES: The aim of this study was to evaluate the association between two semi-quantitative Doppler US scoring systems (SQS), and the quantitative scoring (QS) of Doppler pixel count. METHODS: Adult patients with RA and inadequate clinical response to anti-rheumatic therapy were examined...... correlations and multilevel models taking into account the clustering of ratings at the rater, patient and joint levels. RESULTS: Analysis of the 1190 ratings revealed a strong correlation (ρ = 0.89, P

  18. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    Science.gov (United States)

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems, including speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  19. Toward quantitative modeling of silicon phononic thermocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Lacatena, V. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France); IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Haras, M.; Robillard, J.-F., E-mail: jean-francois.robillard@isen.iemn.univ-lille1.fr; Dubois, E. [IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Monfray, S.; Skotnicki, T. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France)

    2015-03-16

    The wealth of patterning technologies with deca-nanometer resolution brings opportunities to artificially modulate thermal transport properties. A promising example is given by the recent concepts of 'thermocrystals' or 'nanophononic crystals' that introduce regular nano-scale inclusions using a pitch scale in between the thermal phonon mean free path and the electron mean free path. In such structures, the lattice thermal conductivity is reduced by up to two orders of magnitude with respect to its bulk value. Beyond the promise held by these materials to overcome the well-known 'electron crystal-phonon glass' dilemma faced in thermoelectrics, the quantitative prediction of their thermal conductivity poses a challenge. This work paves the way toward understanding and designing silicon nanophononic membranes by means of molecular dynamics simulation. Several systems are studied in order to distinguish the shape contribution from bulk: ultra-thin membranes (8 to 15 nm), 2D phononic crystals, and finally 2D phononic membranes. After discussing the equilibrium properties of these structures from 300 K to 400 K, the Green-Kubo methodology is used to quantify the thermal conductivity. The results account for several experimental trends and models. It is confirmed that the thin-film geometry as well as the phononic structure act to reduce the thermal conductivity. The further decrease in the phononic engineered membrane clearly demonstrates that both phenomena are cumulative. Finally, limitations of the model and further perspectives are discussed.
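
    The Green-Kubo methodology the record cites extracts the lattice thermal conductivity from equilibrium heat-flux fluctuations. In its standard isotropic form (a textbook relation, not quoted from the record), in LaTeX:

      \kappa = \frac{1}{3 V k_B T^2} \int_0^{\infty} \left\langle \mathbf{J}(0) \cdot \mathbf{J}(t) \right\rangle \, dt

    where V is the simulation cell volume, T the temperature, k_B Boltzmann's constant, J the instantaneous heat flux, and the angle brackets denote the equilibrium ensemble average of the heat-flux autocorrelation function.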

  20. Quantitative analysis of tyrosinase transcripts in blood.

    Science.gov (United States)

    Johansson, M; Pisa, E K; Törmänen, V; Arstrand, K; Kågedal, B

    2000-07-01

    Tyrosinase is an enzyme unique to pigment-forming cells. Methods using this transcript for detection of melanoma cells in blood have given divergent results. Quantitative analytical procedures are therefore needed to study the analytical performance of the methods. Mononucleated cells were isolated by Percoll centrifugation. RNA was isolated by each of three methods: Ultraspec(TM)-II RNA isolation system, FastRNA(TM) GREEN Kit, and QIAamp RNA Blood Mini Kit. cDNA was synthesized using random hexamer primers. A tyrosinase-specific product of 207 bp was amplified by PCR. As an internal standard (and competitor) we used a 207-bp cDNA with a base sequence identical to the tyrosinase target except for a 20-bp probe-binding region. The PCR products were identified by 2, 4-dinitrophenol (DNP)-labeled probes specific for tyrosinase (5'DNP-GGGGAGCCTTGGGGTTCTGG-3') and internal standard (5'DNP-CGGAGCCCCGAAACCACATC-3') and quantified by ELISA. The calibration curves were linear and had a broad dynamic measuring range. A detection limit (2 SD above zero) of 48 transcripts/mL of blood was obtained from a low control. The analytical imprecision was 50% and 48% at concentrations of 1775 and 17 929 transcripts/mL (n = 12 and 14, respectively). With the cell line SK-Mel 28 added to blood and RNA extracted with the Ultraspec, Fast RNA, and QIAamp RNA methods, we found (mean +/- SD) 1716+/-1341, 2670+/-3174, and 24 320+/-5332 transcripts/mL of blood. Corresponding values were 527+/-497, 2497+/-1033, 14 930+/-1927 transcripts/mL of blood when the cell line JKM86-4 was added. One high-risk patient was followed by repeated analysis of tyrosinase transcripts in blood. The melanoma marker 5-S-cysteinyldopa in serum and urine was within reference values, but tyrosinase mRNA was slightly increased (120-168 transcripts/mL of blood). The tyrosinase mRNA increased to 1860 transcripts/mL concomitant with the increase in 5-S-cysteinyldopa; later a spleen metastasis was found. The results

  1. A Quantitative Model of Expert Transcription Typing

    Science.gov (United States)

    1993-03-08

    The report covers, among other phenomena (1-3), how degradation of the text away from normal prose affects the rate of typing (phenomena 4-6) and patterns of interkey intervals (phenomena 7-11). A more detailed analysis of this phenomenon is based on the work of West and Sabban (1932), who used progressively degraded copy to test the effect of copy quality on typing.

  2. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....

  3. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  4. Quantitative analysis of professionally trained versus untrained voices.

    Science.gov (United States)

    Siupsinskiene, Nora

    2003-01-01

    The aim of this study was to compare healthy trained and untrained voices, as well as healthy and dysphonic trained voices, in adults using combined voice range profile and aerodynamic tests; to define the normal-range limiting values of quantitative voice parameters; and to select the most informative quantitative voice parameters for separating healthy from dysphonic trained voices. Three groups of persons were evaluated. One hundred eighty-six healthy volunteers were divided into two groups according to voice training: the non-professional speakers group consisted of 106 persons with untrained voices (36 males and 70 females) and the professional speakers group of 80 persons with trained voices (21 males and 59 females). The clinical group consisted of 103 dysphonic professional speakers (23 males and 80 females) with various voice disorders. Eighteen quantitative voice parameters from the combined voice range profile (VRP) test were analyzed: 8 voice range profile parameters, 8 speaking voice parameters, the overall vocal dysfunction degree and coefficient of sound, plus the aerodynamic maximum phonation time. Analysis showed that healthy professional speakers demonstrated expanded vocal abilities in comparison to healthy non-professional speakers. The quantitative voice range profile parameters pitch range, high frequency limit, area of high frequencies and coefficient of sound differed significantly between healthy professional and non-professional voices, and were more informative than speaking voice or aerodynamic parameters in reflecting voice training. Logistic stepwise regression revealed that the VRP area in high frequencies was sufficient to discriminate between healthy and dysphonic professional speakers for male subjects (overall discrimination accuracy 81.8%), and a combination of three quantitative parameters (VRP high frequency limit, maximum voice intensity and slope of the speaking curve) for female subjects (overall model discrimination accuracy 75.4%). We concluded that quantitative voice assessment

  5. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Full Text Available Abstract Background Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection, which uses markers of the entire genome to predict the genomic values of individual plants or animals, can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is influenced by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions This study provides an excellent example of the application of genome selection to plant breeding.
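
    A compact way to reproduce the main-plus-epistasis design is to augment the marker matrix with all pairwise marker products and fit a shrinkage regression. The sketch below uses dual-form ridge regression on simulated data as a stand-in for the paper's estimator; the dimensions (126 lines, 80 markers) follow the record, everything else is invented. In practice the fit would be assessed by cross-validation, as in the paper, rather than on the training data.

      import numpy as np
      from itertools import combinations

      rng = np.random.default_rng(3)
      n_lines, n_markers = 126, 80
      X = rng.choice([-1.0, 1.0], (n_lines, n_markers))   # inbred genotypes

      # Main effects plus all pairwise (epistatic) marker products.
      pairs = list(combinations(range(n_markers), 2))
      X_epi = np.column_stack([X] + [X[:, i] * X[:, j] for i, j in pairs])

      # Simulated trait with additive and epistatic QTL plus noise.
      y = X[:, 3] - 0.8 * X[:, 17] + 1.2 * X[:, 5] * X[:, 40]
      y += rng.normal(0.0, 0.5, n_lines)

      # Ridge regression in dual (kernel) form, efficient when predictors
      # (3,240 columns) far outnumber lines (126 rows).
      lam = 10.0
      K = X_epi @ X_epi.T
      alpha = np.linalg.solve(K + lam * np.eye(n_lines), y)
      genomic_value = K @ alpha
      r2 = np.corrcoef(genomic_value, y)[0, 1] ** 2
      print(f"squared correlation (training fit): {r2:.2f}")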

  6. A suite of simple models to support quantitative assessment of spread and impact in pest risk analysis – concepts and applications

    OpenAIRE

    Robinet, Christelle; Kehlenbeck, Hella; van der Werf, Wopke

    2011-01-01

    An assessment of the likelihood and extent of spread is an integral part of a pest risk analysis for quarantine measures. However, few tools - if any - are available to risk assessors to make an assessment of the spread as a dynamic process in space at the continental scale. Within the frame of the EU project PRATIQUE, we explored avenues for spread modelling and link models of spread to maps of host distribution, climate, and potential economic impacts. Five models for spread were conside...

  7. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 2. Performance, Emissions, and Cost of Combustion-Based NOx Controls for Wall and Tangential Furnace Coal-Fired Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Tran, Loan K. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 2 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  8. Quantitative analysis of protein phosphorylations and interactions by multi-colour IP-FCM as an input for kinetic modelling of signalling networks.

    Directory of Open Access Journals (Sweden)

    Sumit Deswal

    Full Text Available BACKGROUND: To understand complex biological signalling mechanisms, mathematical modelling of signal transduction pathways has been applied successfully in last few years. However, precise quantitative measurements of signal transduction events such as activation-dependent phosphorylation of proteins, remains one bottleneck to this success. METHODOLOGY/PRINCIPAL FINDINGS: We use multi-colour immunoprecipitation measured by flow cytometry (IP-FCM for studying signal transduction events to unrivalled precision. In this method, antibody-coupled latex beads capture the protein of interest from cellular lysates and are then stained with differently fluorescent-labelled antibodies to quantify the amount of the immunoprecipitated protein, of an interaction partner and of phosphorylation sites. The fluorescence signals are measured by FCM. Combining this procedure with beads containing defined amounts of a fluorophore allows retrieving absolute numbers of stained proteins, and not only relative values. Using IP-FCM we derived multidimensional data on the membrane-proximal T-cell antigen receptor (TCR-CD3 signalling network, including the recruitment of the kinase ZAP70 to the TCR-CD3 and subsequent ZAP70 activation by phosphorylation in the murine T-cell hybridoma and primary murine T cells. Counter-intuitively, these data showed that cell stimulation by pervanadate led to a transient decrease of the phospho-ZAP70/ZAP70 ratio at the TCR. A mechanistic mathematical model of the underlying processes demonstrated that an initial massive recruitment of non-phosphorylated ZAP70 was responsible for this behaviour. Further, the model predicted a temporal order of multisite phosphorylation of ZAP70 (with Y319 phosphorylation preceding phosphorylation at Y493 that we subsequently verified experimentally. CONCLUSIONS/SIGNIFICANCE: The quantitative data sets generated by IP-FCM are one order of magnitude more precise than Western blot data. This accuracy allowed

  9. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Gel- and LC-MS/MS-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim of intensively characterizing the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...... proteins in beef. The majority of the identified phosphoproteins were glycometabolism-related enzymes in the sarcoplasmic fraction and contraction-related proteins in the myofibrillar fraction. Subsequently, the quantitative LC-MS/MS-based phosphoproteomic strategy was used to identify and quantify

  10. A quantitative analysis of hip capsular thickness.

    Science.gov (United States)

    Philippon, Marc J; Michalski, Max P; Campbell, Kevin J; Rasmussen, Matthew T; Goldsmith, Mary T; Devitt, Brian M; Wijdicks, Coen A; LaPrade, Robert F

    2015-09-01

    The purpose of this study was to provide a comprehensive quantitative analysis of capsular thickness adjacent to the acetabular rim in clinically relevant locations. Dissections were performed and hip capsular measurements were recorded on 13 non-paired, fresh-frozen cadaveric hemi-pelvises using a coordinate measuring device. Measurements were taken at each clock-face position at 0, 5, 10 and 15 mm distances from the labral edge. The capsule was consistently thickest at 2 o'clock for each interval from the labrum, with a maximum thickness of 8.3 mm at 10 mm [95 % CI 6.8, 9.8] and 15 mm [95 % CI 6.8, 9.7]. The capsule was noticeably thinner between 4 and 11 o'clock, with a minimum thickness of 4.1 mm [95 % CI 3.3, 4.9] at 10 o'clock at the labral edge. Direct comparison between the 0 and 5 mm intervals from 9 to 3 o'clock showed that the hip capsule was significantly thicker at 5 mm from the labrum at 9 o'clock (p = 0.027), 10 o'clock (p = 0.032), 1 o'clock (p = 0.003), 2 o'clock (p = 0.001) and 3 o'clock (p = 0.001). The hip capsule was thickest between the 1 and 2 o'clock positions for all measured distances from the acetabular labrum and reached its maximum thickness at 2 o'clock, which corresponds to the location of the iliofemoral ligament.

  11. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow some genetic mechanisms related to the length of the productive life to be understood.

  12. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    Directory of Open Access Journals (Sweden)

    Peterhansel Christoph

    2007-09-01

    Full Text Available Abstract Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishing the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data are often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We found that the use of quantitative PCR (QPCR) is crucial for obtaining high-quality ChIP data, and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and a strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and we present comprehensive insights into data normalization.
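
    Among the normalization strategies such a paper compares, percent-of-input is one of the most common for ChIP-QPCR and is easy to sketch. The Ct values, the 1% input fraction and the assumption of perfect primer efficiency below are illustrative, not taken from the paper.

      import numpy as np

      def percent_input(ct_ip, ct_input, input_fraction=0.01):
          """Express ChIP-QPCR recovery as percent of input: adjust the input
          Ct for the fraction of chromatin saved, then compare with the IP Ct
          (assumes 100% primer efficiency, i.e. one doubling per cycle)."""
          ct_input_adj = ct_input - np.log2(1.0 / input_fraction)
          return 100.0 * 2.0 ** (ct_input_adj - ct_ip)

      target = percent_input(ct_ip=24.0, ct_input=20.0)    # active locus
      control = percent_input(ct_ip=28.5, ct_input=20.5)   # negative region
      print(f"target: {target:.3f}% input, "
            f"fold enrichment over control: {target / control:.1f}x")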

  13. Qualitative and quantitative analysis of the students’ perceptions to the use of 3D electronic models in problem-based learning

    Directory of Open Access Journals (Sweden)

    Hai Ming Wong

    2017-06-01

    Full Text Available The Faculty of Dentistry of the University of Hong Kong has introduced innovative blended problem-based learning (PBL) with the aid of 3D electronic models (e-models) into the Bachelor of Dental Surgery (BDS) curriculum. Statistical results of pre- and post-semester questionnaire surveys illustrated the compatibility of e-models in PBL settings. The students' importance ratings of two objectives ("Complete assigned tasks on time" and "Active listener") and of twenty-two facilitator evaluation items, including critical thinking and group problem-solving skills, increased significantly. The students' PBL preparation behavior and their attention to problem understanding, problem analysis, and learning resource quality were also found to be related to the online support of e-models and their software. Qualitative analysis of open-ended questions with the visual text-analytics software "Leximancer" improved the validity of the statistical results. Using e-model functions in treatment planning, problem analysis and giving instructions provided a method of informative communication. Therefore, it is critical for the faculty to continuously provide facilitator training and quality online e-model resources to the students.

  14. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  15. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  16. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code for coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  17. Quantitative Analysis of Radar Returns from Insects

    Science.gov (United States)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of the heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are also presented.

  18. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  19. Application of Fuzzy Set Theory to Quantitative Analysis of Correctness of the Mathematical Model Based on the ADI Method during Solidification

    Directory of Open Access Journals (Sweden)

    Xiaofeng Niu

    2013-01-01

    Full Text Available The explicit finite difference (EFD) method is used to calculate the casting temperature field during the solidification process. Because of its limited time step, the computational efficiency of the EFD method is lower than that of the alternating direction implicit (ADI) method. A model based on the equivalent specific heat method and the ADI method that improves computational efficiency is established. The error of the temperature field simulation comes from model simplification, the accepted hypotheses, and the calculation errors caused by the different time steps and mesh numbers involved in the numerical simulation. This paper quantitatively analyzes the degree of similarity between simulated and experimental results by the Hamming distance (HD). For a thick-walled position, the time step influences the simulation results of the temperature field while the number of casting meshes has little influence on them. For a thin-walled position, the time step has minimal influence on the simulation results of the temperature field while the number of casting meshes has a larger influence on them.
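
    For reference, one ADI sweep pair (Peaceman-Rachford) for plain 2-D heat conduction is sketched below; each half step is implicit along one axis and solved as a set of tridiagonal systems. This is a generic illustration with invented grid and material values, and it omits the latent-heat handling that the paper's equivalent specific heat method provides.

      import numpy as np
      from scipy.linalg import solve_banded

      def adi_step(T, alpha, dt, dx):
          """One Peaceman-Rachford ADI step for the 2-D heat equation with
          fixed (Dirichlet) boundary values: implicit in x, then in y."""
          r = alpha * dt / (2.0 * dx**2)
          n = T.shape[0] - 2                    # interior points (square grid)
          ab = np.zeros((3, n))                 # banded tridiagonal matrix
          ab[0, 1:] = -r
          ab[1, :] = 1.0 + 2.0 * r
          ab[2, :-1] = -r
          half = T.copy()
          for j in range(1, T.shape[1] - 1):    # sweep 1: implicit in x
              rhs = T[1:-1, j] + r * (T[1:-1, j+1] - 2*T[1:-1, j] + T[1:-1, j-1])
              rhs[0] += r * T[0, j]
              rhs[-1] += r * T[-1, j]
              half[1:-1, j] = solve_banded((1, 1), ab, rhs)
          out = half.copy()
          for i in range(1, T.shape[0] - 1):    # sweep 2: implicit in y
              rhs = half[i, 1:-1] + r * (half[i+1, 1:-1] - 2*half[i, 1:-1] + half[i-1, 1:-1])
              rhs[0] += r * out[i, 0]
              rhs[-1] += r * out[i, -1]
              out[i, 1:-1] = solve_banded((1, 1), ab, rhs)
          return out

      T = np.full((41, 41), 1500.0)             # casting interior, deg C
      T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 20.0   # mould boundary
      for _ in range(100):
          T = adi_step(T, alpha=1e-5, dt=0.5, dx=0.01)
      print(f"centre temperature after 50 s: {T[20, 20]:.1f} C")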

  20. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories containing ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for the design of rapid methods for the quality control of suppositories with different components.
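
    The quantitative basis of 1H qNMR, not detailed in the record, is that the integrated signal area is proportional to the number of contributing nuclei; with a weighed internal standard, the analyte content follows from the textbook relation (in LaTeX):

      P_x = \frac{I_x}{I_{std}} \cdot \frac{N_{std}}{N_x} \cdot \frac{M_x}{M_{std}} \cdot \frac{m_{std}}{m_x} \cdot P_{std}

    where I are the signal integrals, N the numbers of protons giving rise to the integrated signals, M the molar masses, m the weighed masses, and P the purities of the analyte x and the internal standard.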

  1. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  2. Single-cell-type quantitative proteomic and ionomic analysis of epidermal bladder cells from the halophyte model plant Mesembryanthemum crystallinum to identify salt-responsive proteins.

    Science.gov (United States)

    Barkla, Bronwyn J; Vera-Estrella, Rosario; Raymond, Carolyn

    2016-05-10

    Epidermal bladder cells (EBC) are large single-celled, specialized, and modified trichomes found on the aerial parts of the halophyte Mesembryanthemum crystallinum. Recent development of a simple but high throughput technique to extract the contents from these cells has provided an opportunity to conduct detailed single-cell-type analyses of their molecular characteristics at high resolution to gain insight into the role of these cells in the salt tolerance of the plant. In this study, we carry out large-scale complementary quantitative proteomic studies using both a label-based (DIGE) and a label-free (GeLC-MS) approach to identify salt-responsive proteins in the EBC extract. Additionally, we perform an ionomics analysis (ICP-MS) to follow changes in the amounts of 27 different elements. Using these methods, we were able to identify 54 proteins and nine elements that showed statistically significant changes in the EBC from salt-treated plants. GO enrichment analysis identified a large number of transport proteins but also proteins involved in photosynthesis, primary metabolism and Crassulacean acid metabolism (CAM). Validation of results by western blot, confocal microscopy and enzyme analysis helped to strengthen the findings and further our understanding of the role of these specialized cells. As expected, EBC accumulated large quantities of sodium; however, the most abundant element was chloride, suggesting that sequestration of this ion into the EBC vacuole is just as important for salt tolerance. This single-cell-type omics approach shows that epidermal bladder cells of M. crystallinum are metabolically active modified trichomes, with primary metabolism supporting cell growth, ion accumulation, compatible solute synthesis and CAM. Data are available via ProteomeXchange with identifier PXD004045.

  3. Quantitative analysis of comparative genomic hybridization

    Energy Technology Data Exchange (ETDEWEB)

    Manoir, S. du; Bentz, M.; Joos, S. (Abteilung Organisation komplexer Genome, Heidelberg, Germany; Institut fuer Humangenetik, Heidelberg, Germany); and others

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.

  4. Quantitative option analysis for implementation and management of landfills.

    Science.gov (United States)

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options of landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of the public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented with five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of the ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  5. Qualitative and quantitative high performance thin layer chromatography analysis of Calendula officinalis using high resolution plate imaging and artificial neural network data modelling.

    Science.gov (United States)

    Agatonovic-Kustrin, S; Loescher, Christine M

    2013-10-10

    Calendula officinalis, commonly known as Marigold, has been traditionally used for its anti-inflammatory effects. The aim of this study was to investigate the capacity of an artificial neural network (ANN) to analyse thin layer chromatography (TLC) chromatograms as fingerprint patterns for quantitative estimation of chlorogenic acid, caffeic acid and rutin in Calendula plant extracts. By applying samples with different weight ratios of marker compounds to the system, a database of chromatograms was constructed. A hundred and one signal intensities in each of the HPTLC chromatograms were correlated to the amounts of applied chlorogenic acid, caffeic acid, and rutin using an ANN. The developed ANN correlation was used to quantify the amounts of the 3 marker compounds in calendula plant extracts. Minimum quantifiable levels (MQL) of 610, 190 and 940 ng and limits of detection (LD) of 183, 57 and 282 ng were established for chlorogenic acid, caffeic acid and rutin, respectively. A novel method for quality control of herbal products, based on HPTLC separation, high resolution digital plate imaging and ANN data analysis, has been developed. The proposed method can be adopted for routine evaluation of the phytochemical variability in calendula extracts. Copyright © 2013 Elsevier B.V. All rights reserved.
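
    As an illustration only (the study's exact network architecture and data handling are not reproduced here, and all array shapes and names below are assumptions), a multioutput feed-forward network mapping the 101 chromatogram intensities to the three marker amounts could be set up along these lines:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.random((60, 101))  # 60 calibration chromatograms, 101 intensities each
        y = rng.random((60, 3))    # applied amounts of chlorogenic acid, caffeic acid, rutin

        scaler = StandardScaler().fit(X)
        ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                           random_state=0).fit(scaler.transform(X), y)

        unknown = rng.random((1, 101))                 # chromatogram of a plant extract
        print(ann.predict(scaler.transform(unknown)))  # estimated marker amounts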

  6. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity.

  7. Quantitative Analysis on the Energy and Environmental Impact of the Korean National Energy R&D Roadmap Using a Bottom-Up Energy System Model

    Directory of Open Access Journals (Sweden)

    Sang Jin Choi

    2017-03-01

    Full Text Available According to the Paris Agreement at the 21st Conference of the Parties, 196 member states are obliged to submit their Intended Nationally Determined Contributions (INDC) every 5 years. As a member, South Korea has already proposed its reduction target and needs to report its achievements resulting from policies and endeavors in the near future. In this paper, a Korean bottom-up energy system model to support the low-carbon national energy R&D roadmap is introduced and, through the modeling of various scenarios, the mid- to long-term impact on energy consumption and CO2 emissions is analyzed as well. The results of the analysis showed that, assuming R&D investments for the 11 types of technologies, savings of 13.7% with regard to final energy consumption compared to the baseline scenario would be feasible by 2050. Furthermore, in the field of power generation, the generation share of new and renewable energy is expected to increase from 3.0% as of 2011 to 19.4% by 2050. This research also suggests that analysis of the Energy Technology R&D Roadmap based on the model can be used not only for overall impact analysis and R&D portfolio establishment, but also for the development of detailed R&D strategies.

  8. A route to quantitative 13C NMR analysis of multicomponent polyesters

    DEFF Research Database (Denmark)

    Hvilsted, S.

    1991-01-01

    A protocol for quantitative sequential 13C NMR analysis is developed for polyesters composed of trimethylol propane (TMP), neopentyl glycol (NPG), and adipic and isophthalic acids. TMP-centred structural models with methyl adipate and isophthalate branches in all possible combinations are synthesized…

  9. Quantitative analysis of intermolecular interactions in 2,2'-((4 ...

    Indian Academy of Sciences (India)

    SUBBIAH THAMOTHARAN

    2018-02-07

    Quantitative analysis of intermolecular interactions in … The quantitative molecular electrostatic potential surface diagram depicts the potential binding sites, which are in good agreement with the crystal … A root mean squared deviation (rmsd: 0.19 Å) is observed for compound I …

  10. A Quantitative Software Risk Assessment Model

    Science.gov (United States)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  11. Quantitative analysis of solid samples using modified specular reflectance accessory.

    Science.gov (United States)

    Czaja, Tomasz; Mazurek, Sylwester; Szostak, Roman

    2016-12-01

    Diffuse reflectance Fourier transform infrared spectroscopy (DRIFTS) is a fast, reliable and cost-effective analytical method, requiring minimal or no sample preparation. It is commonly used in the course of qualitative and quantitative analysis of pharmaceutical ingredients and food. We demonstrate that a simpler and cheaper specular reflectance (SR) accessory working in a DRIFTS-like mode (SR-DL) can be an alternative to the DRIFTS attachment. An application of a modified SR accessory for quantitative analysis of solid samples is presented. As a case study, the concentration of cinnarizine in commercial tablets has been determined from DRIFTS and SR-DL infrared (IR) and near-infrared (NIR) spectra recorded using a DTGS (deuterated triglycine sulphate) detector in the IR and NIR regions and an InGaAs (indium-gallium arsenide) detector in the NIR range. Based on these spectra, Partial Least Squares (PLS) models were constructed and relative standard errors of prediction (RSEP) were calculated for the calibration, validation and analysed data sets. They amounted to 2.4-2.5%, 2.1-2.7% and 2.0-2.6% for the DRIFTS attachment and 2.1-2.2%, 2.0-2.3% and 1.9-2.6%, respectively, for the modified SR accessory. The obtained error values indicate that the modified SR accessory can be effectively used for quantification of solid pharmaceutical samples in the mid- and near-infrared regions. Copyright © 2016 Elsevier B.V. All rights reserved.
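
    The relative standard error of prediction quoted above is commonly computed as shown in this sketch (the authors' exact formula may differ; the tablet contents are invented):

        import numpy as np

        def rsep(y_pred, y_ref):
            """Relative standard error of prediction, in percent."""
            y_pred, y_ref = np.asarray(y_pred, float), np.asarray(y_ref, float)
            return 100.0 * np.sqrt(np.sum((y_pred - y_ref) ** 2) / np.sum(y_ref ** 2))

        # Hypothetical cinnarizine contents (mg per tablet): PLS-predicted vs. reference
        print(f"RSEP = {rsep([24.7, 25.3, 24.9], [25.0, 25.0, 25.0]):.1f}%")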

  12. Porosity determination on pyrocarbon using automatic quantitative image analysis

    International Nuclear Information System (INIS)

    Koizlik, K.; Uhlenbruck, U.; Delle, W.; Nickel, H.

    Methods of porosity determination are reviewed and applied to the measurement of the porosity of pyrocarbon. Specifically, the mathematical basis of stereology and the procedures involved in quantitative image analysis are detailed

  13. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  14. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    Science.gov (United States)

    Cobbs, Gary

    2012-08-16

    Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions and reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single and double stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the literature. They also give better estimates of…
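
    The shared-rate-constant fitting strategy can be sketched as follows. This is a generic illustration, not the paper's Model 1 or Model 2: a simple saturating-efficiency recursion stands in for the annealing-equilibrium solution, and all parameter names and values are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        def qpcr_curve(f0, k, cycles=40):
            """Stepwise qPCR: per-cycle efficiency decays as product accumulates."""
            f, x = np.empty(cycles), f0
            for n in range(cycles):
                eff = 1.0 / (1.0 + k * x)   # stand-in for the annealing equilibrium
                x = x * (1.0 + eff)
                f[n] = x
            return f

        def residuals(params, curves):
            k, *log_f0s = params            # one shared rate constant, one F0 per curve
            return np.concatenate([qpcr_curve(np.exp(lf0), k) - obs
                                   for lf0, obs in zip(log_f0s, curves)])

        # Two synthetic dilutions sharing the same rate constant k:
        curves = [qpcr_curve(1e-4, 2e-3), qpcr_curve(1e-5, 2e-3)]
        fit = least_squares(residuals, x0=[1e-3, np.log(3e-4), np.log(3e-5)],
                            args=(curves,))
        print("fitted k:", fit.x[0], "fitted F0s:", np.exp(fit.x[1:]))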

  15. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions and reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single and double stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the…

  16. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  17. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM...... and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments...

  18. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    Science.gov (United States)

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all statistically significant). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were significantly greater in type 2 PRCC. The combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
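
    First-order texture metrics such as the entropy used above are computed from the intensity histogram of the tumor volume of interest; a minimal sketch (bin count and data are assumptions):

        import numpy as np

        def first_order_entropy(voxels, bins=64):
            """Shannon entropy (bits) of a volume-of-interest intensity histogram."""
            hist, _ = np.histogram(voxels, bins=bins)
            p = hist[hist > 0] / hist.sum()
            return -np.sum(p * np.log2(p))

        voi = np.random.default_rng(1).normal(100, 20, 5000)  # hypothetical ADC values
        print(f"entropy = {first_order_entropy(voi):.2f} bits")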

  19. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Full Text Available Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  20. Quantitative image analysis of WE43-T6 cracking behavior

    International Nuclear Information System (INIS)

    Ahmad, A; Yahya, Z

    2013-01-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Yt, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  1. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  2. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and different injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis was used for CO, NO, C2H2, C6H6 and so on, and quantitative analysis was used for C3H5N, C10H10, C8H8N2 and so on. The method used in this article is feasible. The results show that the detonation composition in this study has a negative oxygen balance, and there were many pollutants in the detonation products. (authors)

  3. Quantitative gas analysis with FT-IR

    DEFF Research Database (Denmark)

    Bak, J.; Larsen, A.

    1995-01-01

    Partial least squares (PLS) regression was used to model the CO calibration spectra in order to improve the sensitivity and to flag possible outliers in the prediction step. The relation between the absorbance values and concentrations was strongly nonlinear; this result was caused mainly by the low spectral resolution of the instrument. To improve the model predictions, we have linearized the data prior to making the model calculations. The linearization scheme presented here simplified the data pretreatment, because the function needed to linearize the data might be approximated by CO absorbance peak areas representing the concentrations. The integrated absorbance areas, rather than the concentration values, were used as input to the PLS algorithm. A fifth-order polynomial was used to calculate the concentrations from the predicted absorbance areas. The PLS algorithm used on the linearized data reduced the number of factors in the calibration…
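
    The final calibration step described above, mapping predicted absorbance areas back to concentrations with a fifth-order polynomial, can be sketched as follows (the calibration values are invented):

        import numpy as np

        # Hypothetical calibration: CO concentration (ppm) vs. integrated absorbance area
        conc = np.array([10, 25, 50, 100, 200, 400, 800], dtype=float)
        area = 1200.0 * (1.0 - np.exp(-conc / 300.0))   # strongly nonlinear response

        coeffs = np.polyfit(area, conc, deg=5)          # fifth-order polynomial: area -> conc
        predicted_area = 900.0                          # e.g. output of the PLS step
        print(f"estimated CO: {np.polyval(coeffs, predicted_area):.0f} ppm")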

  4. 5-aminolevulinic acid induced protoporphyrin IX as a fluorescence marker for quantitative image analysis of high-grade dysplasia in Barrett's esophagus cellular models

    Science.gov (United States)

    Yeh, Shu-Chi Allison; Sahli, Samir; Andrews, David W.; Patterson, Michael S.; Armstrong, David; Provias, John; Fang, Qiyin

    2015-03-01

    Early detection and treatment of high-grade dysplasia (HGD) in Barrett's esophagus may reduce the risk of developing esophageal adenocarcinoma. Confocal endomicroscopy (CLE) has shown advantages over routine white-light endoscopic surveillance with biopsy for histological examination; however, CLE is compromised by insufficient contrast and by intra- and interobserver variation. An FDA-approved PDT photosensitizer was used here to reveal morphological and textural features similar to those found in histological analysis. Support vector machines were trained using the aforementioned features to obtain an automatic and robust detection of HGD. Our results showed 95% sensitivity and 87% specificity using the optimal feature combination and demonstrated the potential for extension to a three-dimensional cell model.

  5. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  6. Quantitative analysis of wellbeing and personal goals

    NARCIS (Netherlands)

    van der Laan, Jorien

    2016-01-01

    In this paper we present data on 407 homeless adults who had just entered the Dutch social relief system. We examined the personal goals of these homeless adults and the association between their perceived goal-related self-efficacy and their quality of life. Based on a hierarchical regression analysis…

  7. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exhaust the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  8. Quantitative analysis of thermal insulation coatings

    DEFF Research Database (Denmark)

    Kiil, Søren

    2014-01-01

    This work concerns the development of simulation tools for mapping of insulation properties of thermal insulation coatings based on selected functional filler materials. A mathematical model, which includes the underlying physics (i.e. thermal conductivity of a heterogeneous two-component coating...

  9. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    OpenAIRE

    Cobbs, Gary

    2012-01-01

    Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most pote...

  10. A Quantitative Analysis of Countries’ Research Strengths

    Directory of Open Access Journals (Sweden)

    Anurag Saxena

    2009-05-01

    Full Text Available This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding. Research output from 34 countries is examined using a conceptual framework that emphasizes the ratio of research resources devoted to a particular field to research output measured by publications in peer-reviewed journals. Using cluster analysis and k-means analysis, we conclude that countries' research output (as measured by the number of published peer-reviewed articles) and their efficiency (as measured by a ratio of research output to dollars allocated to research) together indicate a comparative advantage within any given country's own menu of research choices and an absolute advantage relative to other countries. This study implies that the more countries engage in publication in areas of relative strength and consume research in areas of relative weakness, the stronger their entire research agenda will become.

  11. A Quantitative Risk Analysis Method for the High Hazard Mechanical System in Petroleum and Petrochemical Industry

    Directory of Open Access Journals (Sweden)

    Yang Tang

    2017-12-01

    Full Text Available The high hazard mechanical system (HHMS) has three characteristics in the petroleum and petrochemical industry (PPI): high risk, high cost, and high technology requirements. For a HHMS, part, component, and subsystem failures will result in varying degrees and various types of risk consequences, including unexpected downtime, production losses, economic costs, safety accidents, and environmental pollution. Thus, obtaining the quantitative risk level and distribution in a HHMS to control major risk accidents and ensure safe production is of vital importance. However, the structure of the HHMS is more complex than that of some other systems, making the quantitative risk analysis process more difficult. Additionally, a variety of uncertain risk data hinder the realization of quantitative risk analysis. Few quantitative risk analysis techniques and studies for HHMS exist, especially in the PPI. Therefore, a study on the quantitative risk analysis method for HHMS was completed to obtain the risk level and distribution of high-risk objects. Firstly, Fuzzy Set Theory (FST) was applied to address the uncertain risk data for the occurrence probability (OP) and consequence severity (CS) in the risk analysis process. Secondly, a fuzzy fault tree analysis (FFTA) and a fuzzy event tree analysis (FETA) were used to achieve quantitative risk analysis and calculation. Thirdly, a fuzzy bow-tie model (FBTM) was established to obtain a quantitative risk assessment result according to the analysis results of the FFTA and FETA. Finally, the feasibility and practicability of the method were verified with a case study on the quantitative risk analysis of one reciprocating pump system (RPS). The quantitative risk analysis method for HHMS can provide more accurate and scientific data support for the development of Asset Integrity Management (AIM) systems in the PPI.
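
    To make the FFTA step concrete, here is a minimal sketch (my own illustration, not the paper's model) that propagates triangular fuzzy probabilities through AND/OR gates vertex by vertex:

        import numpy as np

        def f_and(*events):
            """AND gate: probabilities multiply, applied to each (low, mode, high) vertex."""
            return tuple(np.prod([e[i] for e in events]) for i in range(3))

        def f_or(*events):
            """OR gate: P = 1 - prod(1 - p), applied to each vertex."""
            return tuple(1.0 - np.prod([1.0 - e[i] for e in events]) for i in range(3))

        # Hypothetical basic-event fuzzy probabilities for a pump failure tree:
        seal_leak   = (0.010, 0.020, 0.040)
        valve_stuck = (0.005, 0.010, 0.020)
        no_alarm    = (0.050, 0.100, 0.200)

        top_event = f_and(f_or(seal_leak, valve_stuck), no_alarm)
        print("top event (low, mode, high):", tuple(round(p, 5) for p in top_event))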

  12. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    Full Text Available There is a need for fit-for-purpose maps accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefit of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data are analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables is explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
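
    A minimal sketch of the modelling chain described above, under assumed variable names and synthetic data: mud/sand/gravel fractions are mapped to two additive log-ratio variables, regressed on environmental predictors with a random forest, and back-transformed to fractions that sum to one:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        def alr(comp, eps=1e-6):
            """Additive log-ratios of (mud, sand, gravel), with sand as denominator."""
            c = np.clip(comp, eps, None)
            return np.log(c[:, [0, 2]] / c[:, [1]])  # log(mud/sand), log(gravel/sand)

        def alr_inv(z):
            """Back-transform two log-ratios to (mud, sand, gravel) fractions."""
            expz = np.exp(z)
            total = 1.0 + expz.sum(axis=1)
            return np.column_stack([expz[:, 0] / total, 1.0 / total, expz[:, 1] / total])

        rng = np.random.default_rng(2)
        X = rng.random((200, 5))                # e.g. depth, wave stress, reflectance...
        y = rng.dirichlet([2, 5, 1], size=200)  # observed (mud, sand, gravel) fractions

        rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, alr(y))
        pred = alr_inv(rf.predict(X[:3]))
        print(pred, pred.sum(axis=1))           # each predicted row sums to 1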

  13. Analysis of methods for quantitative renography

    International Nuclear Information System (INIS)

    Archambaud, F.; Maksud, P.; Prigent, A.; Perrin-Fayolle, O.

    1995-01-01

    This article reviews the main methods using renography to estimate renal perfusion indices and to quantify differential and global renal function. The review addresses the pathophysiological significance of estimated parameters according to the underlying models and the choice of the radiopharmaceutical. The dependence of these parameters on the region of interest characteristics and on the methods of background and attenuation corrections are surveyed. Some current recommendations are proposed. (authors). 66 refs., 8 figs

  14. Quantitative Analysis of Bone Microstructure Using Tomosynthesis

    Science.gov (United States)

    2013-10-01

    For a preliminary wedge loader model, a simplified construct was fabricated which allowed combined loading via a lever arm with an offset hinge. In this construct, a specimen-specific total displacement was applied at the top surface of a flat beam lever.

  15. Quantitative analysis of binding sites for 9-fluoropropyl-(+)-dihydrotetrabenazine ([¹⁸F]AV-133) in a MPTP-lesioned PD mouse model.

    Science.gov (United States)

    Chao, Ko-Ting; Tsao, Hsin-Hsin; Weng, Yi-Hsin; Hsiao, Ing-Tsung; Hsieh, Chia-Ju; Wey, Shiaw-Pyng; Yen, Tzu-Chen; Kung, Mei-Ping; Lin, Kun-Ju

    2012-09-01

    [¹⁸F]AV-133 is a novel PET tracer for targeting the vesicular monoamine transporter 2 (VMAT2). The aim of this study is to characterize and quantify the loss of monoamine neurons with [¹⁸F]AV-133 in the MPTP-lesioned PD mouse model using animal PET imaging and ex vivo quantitative autoradiography (QARG). The optimal imaging time window of [¹⁸F]AV-133 was first determined in normal C57BL/6 mice (n = 3) with a 90-min dynamic scan. The reproducibility of [¹⁸F]AV-133 PET imaging was evaluated by performing a test-retest study within 1 week for the normal group (n = 6). For the MPTP-lesioned studies, groups of four normal and MPTP-treated mice [25 mg/kg once (Group A) and twice (Group B) daily, respectively, for 5 days, i.p.] were used. PET imaging studies at baseline and at Day 4 post-MPTP injections were performed at the optimal time window after injection of 11.1 MBq [¹⁸F]AV-133. The specific uptake ratio (SUr) of [¹⁸F]AV-133 was calculated as [(target uptake - cerebellar uptake)/cerebellar uptake], with the cerebellum as the reference region. Ex vivo QARG and immunohistochemistry (IHC) studies with tyrosine hydroxylase antibody were carried out to confirm the abundance of dopaminergic neurons. The variability between [¹⁸F]AV-133 test-retest striatal SUr was 6.60 ± 3.61%, with less than 5% standard deviation between animals (intervariability). After MPTP lesioning, striatal SUr values were 0.94 ± 0.29 (-42.1%) for Group A and 0.65 ± 0.09 (-60.4%) for Group B. By QARG, specific binding of [¹⁸F]AV-133 was reduced relative to the control groups by 50.6% and 60.7% in striatum and by 30.6% and 46.4% in substantia nigra (Groups A and B, respectively). A relatively small [¹⁸F]AV-133 SUr decline was noted in the serotonin- and norepinephrine-enriched regions (7.9% and 9.4% in mid-brain). Results obtained from IHC consistently confirmed the sensitivity and selectivity of dopaminergic neuron loss after MPTP treatment. [¹⁸F]AV-133 PET SUr displayed a high…
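
    The specific uptake ratio defined above is simple to compute; a tiny sketch with hypothetical uptake values:

        def specific_uptake_ratio(target, cerebellum):
            """SUr = (target - cerebellum) / cerebellum, cerebellum as reference region."""
            return (target - cerebellum) / cerebellum

        striatum, cerebellum = 3.2, 1.4  # hypothetical [18F]AV-133 uptake values (a.u.)
        print(f"SUr = {specific_uptake_ratio(striatum, cerebellum):.2f}")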

  16. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  17. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying … of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface…

  18. Quantitative studies of rhubarb using quantitative analysis of multicomponents by single marker and response surface methodology.

    Science.gov (United States)

    Sun, Jiachen; Wu, Yueting; Dong, Shengjie; Li, Xia; Gao, Wenyuan

    2017-10-01

    In this work, we developed a novel approach to evaluate the contents of bioactive components in rhubarb. The present method was based on the quantitative analysis of multicomponents by a single marker combined with a response surface methodology approach. The quantitative analysis of multicomponents by a single marker, based on high-performance liquid chromatography coupled with photodiode array detection, was developed and applied to determine the contents of 12 bioactive components in rhubarb. No significant differences were found between the results from the quantitative analysis of multicomponents by a single marker and those from the external standard method. In order to maximize the extraction of the 12 bioactive compounds in rhubarb, the ultrasonic-assisted extraction conditions were optimized by response surface methodology coupled with a Box-Behnken design. According to the obtained results, the optimal conditions would be as follows: proportion of ethanol/water 74.39%, solvent-to-solid ratio 24.07:1 v/w, extraction time 51.13 min, and extraction temperature 63.61°C. The analytical scheme established in this research should be a reliable, convenient, and appropriate method for quantitative determination of bioactive compounds in rhubarb. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Quantitation and gompertzian analysis of tumor growth

    DEFF Research Database (Denmark)

    Rygaard, K; Spang-Thomsen, M

    1998-01-01

    Human tumor xenografts in immune-deficient animals are used to establish tumor growth curves and for studying the effect of experimental therapy on tumor growth. In this review we describe a method for making serial measurements of tumor size in the nude mouse model, as well as methods used to transform the experimental data into useful growth curves. A transformed Gompertz function is used as the basis for calculating relevant parameters pertaining to tumor growth and response to therapy. The calculations are facilitated by use of a computer program which performs the necessary calculations.
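
    For reference, one common parameterization of the Gompertz growth function that such programs fit (the review's exact transformed form may differ) is

        V(t) = V_0 \exp\left[ \frac{A}{B} \left( 1 - e^{-Bt} \right) \right]

    where V_0 is the initial tumor volume, A the initial specific growth rate, and B the rate at which that growth rate decays; the plateau volume is V_0 e^{A/B}.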

  20. Quantitation and gompertzian analysis of tumor growth

    DEFF Research Database (Denmark)

    Rygaard, K; Spang-Thomsen, M

    1998-01-01

    Human tumor xenografts in immune-deficient animals are used to establish tumor growth curves and for studying the effect of experimental therapy on tumor growth. In this review we describe a method for making serial measurements of tumor size in the nude mouse model, as well as methods used to transform the experimental data into useful growth curves. A transformed Gompertz function is used as the basis for calculating relevant parameters pertaining to tumor growth and response to therapy. The calculations are facilitated by use of a computer program which performs the necessary calculations.

  1. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  2. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  3. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  4. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  5. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    Full Text Available In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a footwear industry case study, to ascertain the usefulness of this approach. The value networks were used to identify the participants, both tangible and intangible deliverables/endogenous and exogenous assets, and the analysis of their interactions as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables the discussion of how the CMDVC can be applied and used in the enterprise environment and provided new relevant relations between perceived benefits (PBs).

  6. [TXRF technique and quantitative analysis of mollusc teeth].

    Science.gov (United States)

    Tian, Y; Liu, K; Wu, X; Zheng, S

    1999-06-01

    The total reflection X-ray fluorescence (TXRF) analysis technique and an instrument with a short path, high efficiency, low power and small volume are briefly presented. The detection limits of the system are at the pg level for Cu and Mo target excitation. Teeth of a marine mollusc were measured quantitatively, and the spectrum and analysis results are given.

  7. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  8. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon in plutonium standard is described, then the content of this substance is determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. Owing to the presence of this standard we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and finally the method of cleaning plutonium samples. [fr]

  9. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in Pir-Panjal and Great Himalayan mountain ranges of Indian Himalaya. The model predicts snowfall for two days in advance using daily recorded nine meteorological variables of past 20 winters from 1992–2012. There are six ...
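
    A minimal sketch of fitting an HMM to daily meteorological observations (hmmlearn is an assumed third-party dependency, and the paper's state definitions, variables, and two-day forecasting scheme are not reproduced here):

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(3)
        obs = rng.normal(size=(600, 9))  # 600 winter days x 9 meteorological variables

        # Hidden states could represent snowfall regimes (e.g. dry, light, heavy...).
        model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=200,
                            random_state=0).fit(obs)
        states = model.predict(obs)      # decoded regime for each day
        print(model.transmat_.round(2))  # regime-to-regime transition probabilities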

  10. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  11. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in Pir-Panjal and Great Himalayan mountain ranges of Indian Himalaya. The model predicts snowfall for two days in advance using daily recorded nine meteorological variables of past 20 winters from 1992–2012. There are six ...

  12. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of input data is performed. Once the network has been trained, analysis is carried out in a few seconds. On submission to a test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of relevant input parameters, input data scaling and network training. Main results are presented together with a statistical model allowing prediction of network errors.

  13. Quantitative Chemical Analysis of Enceladus' Plume Composition.

    Science.gov (United States)

    Peter, J.; Nordheim, T.; Hofmann, A.; Hand, K. P.

    2017-12-01

    Analyses of data from Cassini's Ion and Neutral Mass Spectrometer (INMS) taken during several close flybys of Enceladus suggest the presence of a potentially habitable ocean underneath the ice shell [1,2]. Proper identification of the molecular species sampled from Enceladus' plumes by INMS is of utmost importance for characterizing the ocean's chemical composition. Data from Cassini's Cosmic Dust Analyzer (CDA) and Visible and Infrared Mapping Spectrometer (VIMS) have provided clues for possible plume chemistry, but further analysis of the INMS data is necessary [3,4]. Here we present a novel automated algorithm for comparing INMS spectra and analogue laboratory spectra to a vast library of sample spectra provided by the National Institute of Standards and Technology (NIST). The algorithm implements a Monte Carlo simulation that computes the angular similarity between the spectrum of interest and a random sample of synthetic spectra generated at arbitrary mixing ratios of molecular species. The synthetic spectra with the highest similarity scores are then averaged to produce a convergent estimate of the mixing ratio of the spectrum of interest. Here we will discuss the application of this technique to INMS and laboratory data and the implication of our preliminary results for the ocean chemistry and habitability of Enceladus. 1. Waite, J., et al., 2009. Liquid Water on Enceladus From Observations of Ammonia and 40Ar in the Plume. Nature 460, 487-498. 2. Waite, J., et al. 2017. Cassini Finds Molecular Hydrogen in the Enceladus Plume: Evidence for Hydrothermal Processes. Science 356, 155-159. 3. Postberg, F., et al., 2008. The E Ring in the Vicinity of Enceladus II: Signatures of Enceladus in the Elemental Composition of E-Ring Particles. Icarus 193(2), 438-454. 4. Brown, R., et al., 2006. Composition and Physical Properties of Enceladus' Surface. Science 311, 1425-1428.
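
    The core of the described algorithm can be sketched in a few lines (everything below is a synthetic placeholder, not NIST or INMS data): draw random mixing ratios, synthesize mixture spectra, score them by angular (cosine) similarity, and average the top scorers:

        import numpy as np

        rng = np.random.default_rng(2)
        library = rng.random((5, 100))     # placeholder spectra for 5 species
        observed = 0.7 * library[0] + 0.3 * library[3]   # stand-in "INMS" spectrum

        def angular_similarity(a, b):
            # cosine of the angle between spectra; 1.0 means identical direction
            return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

        ratios = rng.dirichlet(np.ones(5), size=20000)   # random mixing ratios
        synthetic = ratios @ library                     # synthetic mixture spectra
        scores = np.array([angular_similarity(observed, s) for s in synthetic])

        best = ratios[np.argsort(scores)[-100:]]         # highest-similarity draws
        print(best.mean(axis=0))    # convergent estimate of the mixing ratios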

  14. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated to leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and providing individual consideration to staff as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  15. The correlation of contrast-enhanced ultrasound and MRI perfusion quantitative analysis in rabbit VX2 liver cancer.

    Science.gov (United States)

    Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian

    2014-12-01

    Our objective is to explore the value of contrast-enhanced ultrasound (CEUS) and MRI perfusion quantitative analysis in liver cancer and the correlation between these two analysis methods. A rabbit VX2 liver cancer model was established in this study. CEUS was applied: SonoVue was administered to rabbits via the ear vein to dynamically observe and record the blood perfusion and its changes in VX2 liver cancer and the surrounding tissue. MRI perfusion quantitative analysis was used to analyze the mean enhancement time (MTE) and the variation of the maximal slope increasing (MSI), which were further compared with the pathological examination results. Quantitative indicators of liver cancer from CEUS and MRI perfusion quantitative analysis were compared, and the correlation between them was assessed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while MRI perfusion quantitative analysis showed that the quantitative parameter MTE of tumor tissue increased and MSI decreased; the difference was statistically significant (P < 0.05). The quantitative parameters obtained by the two methods were not significantly different (P > 0.05), but they were significantly positively correlated (P < 0.05). CEUS and MRI perfusion quantitative analysis can both dynamically monitor the liver cancer lesion and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of importance in the early diagnosis of liver cancer.

  16. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modeled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF ...
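
    A toy numeric illustration of the contrast recovery coefficient (CRC) central to such evaluations (no reconstruction or PSF matching is modelled here; the Gaussian blur, phantom and 4:1 contrast are assumptions):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        img = np.ones((128, 128))                  # uniform background
        yy, xx = np.mgrid[:128, :128]
        tumour = (xx - 64) ** 2 + (yy - 64) ** 2 < 6 ** 2
        img[tumour] = 4.0                          # 4:1 tumour-to-background ratio

        for fwhm_px in (2.0, 4.0, 6.0):            # narrow to wide blur kernels
            blurred = gaussian_filter(img, sigma=fwhm_px / 2.355)
            crc = (blurred[tumour].mean() - 1.0) / (4.0 - 1.0)
            print(f"FWHM {fwhm_px} px -> CRC {crc:.2f}")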

  17. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD) and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a ''bullseye'' functional map with the apex at the center and base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum: lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4 hour normalized washout ranged 49-54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing pt data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets
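
    A maximal-count circumferential profile of the kind described can be sketched as follows (a simplified illustration: the slice, ventricular centre and spoke geometry are all assumed):

        import numpy as np

        rng = np.random.default_rng(3)
        slice_img = rng.random((64, 64))    # placeholder short-axis slice counts
        cy = cx = 32                        # assumed left-ventricular centre
        angles = np.deg2rad(np.arange(0, 360, 6))   # 60 radial spokes
        radii = np.arange(1, 30)

        profile = []
        for a in angles:
            ys = (cy + radii * np.sin(a)).round().astype(int)
            xs = (cx + radii * np.cos(a)).round().astype(int)
            profile.append(slice_img[ys, xs].max())   # max count along the spoke
        profile = 100 * np.array(profile) / max(profile)   # normalize, peak = 100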

  18. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis

    DEFF Research Database (Denmark)

    Queiroz, Rayner M L; Charneau, Sebastien; Mandacaru, Samuel C

    2014-01-01

    ... this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled and multiplexed. Subsequently, phosphopeptides were enriched using a TiO2 matrix. Non-phosphorylated peptides were HILIC-fractionated prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification and phosphorylation site assignment. We could ... induced by incubation in acidic medium were also evinced. To our knowledge, this work is the most comprehensive quantitative proteomics study of T. cruzi amastigogenesis, and these data will provide a trustworthy basis for future studies and possibly for new potential drug targets.

  19. Alternative Equation on Magnetic Pair Distribution Function for Quantitative Analysis

    Science.gov (United States)

    Kodama, Katsuaki; Ikeda, Kazutaka; Shamoto, Shin-ichi; Otomo, Toshiya

    2017-12-01

    We derive an alternative equation of the magnetic pair distribution function (mPDF) related to the mPDF equation given in a preceding study [B. A. Frandsen, X. Yang, and S. J. L. Billinge, Acta Crystallogr., Sect. A 70, 3 (2014), https://doi.org/10.1107/S2053273313033081] for quantitative analysis of realistic experimental data. The additional term related to spontaneous magnetization included in the equation is particularly important for the mPDF analysis of ferromagnetic materials. Quantitative estimation of the mPDF from neutron diffraction data is also shown. The experimental mPDFs estimated from the neutron diffraction data of the ferromagnet MnSb and the antiferromagnet MnF2 are quantitatively consistent with the mPDFs calculated using the presented equation.

  20. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR)/ERBB family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  1. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    Science.gov (United States)

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and with mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported combination therapy development of solifenacin and mirabegron for phase III clinical development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
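
    The arithmetic behind an MCDA clinical-utility score is a weighted sum over attribute values; a hedged illustration (the weights, attributes and scores below are invented, not the paper's model):

        import numpy as np

        weights = np.array([0.4, 0.3, 0.2, 0.1])  # efficacy, safety, tolerability, convenience
        # rows: solifenacin 5 mg, mirabegron 50 mg, combination 5+50 (0-1 scale)
        scores = np.array([
            [0.55, 0.70, 0.65, 0.80],
            [0.50, 0.75, 0.70, 0.80],
            [0.75, 0.65, 0.60, 0.70],
        ])
        utility = scores @ weights               # weighted-sum clinical utility
        print(dict(zip(["soli 5", "mira 50", "combo 5+50"], utility.round(3))))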

  2. Correlation and path coefficient analysis of some quantitative traits ...

    African Journals Online (AJOL)

    Thirty-seven wheat genotypes and three check varieties were studied for correlation and path coefficient analysis of some quantitative traits in wheat at Kisan (P.G), College, Simbhaoli in India. Generally, the estimates of genotypic correlation coefficients were higher than the corresponding phenotypic correlation coefficients ...

  3. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    ... Disciplinary Journal, Ethiopia. Vol. 4 (3a), July 2010. ISSN 1994-9057 (Print); ISSN 2070-0083 (Online). Quantitative Analysis of Complex Tropical Forest Stands: A Review (pp. 367-377). Oyebade, B. A. - Forest Biometrics & Measurement Unit, ...

  4. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  5. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma*, Ytao; Shakiryanova*, Dinara; Vardya, Irina

    2004-01-01

    ... the translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required ...

  6. Insights Into Quantitative Biology: analysis of cellular adaptation

    OpenAIRE

    Agoni, Valentina

    2013-01-01

    In recent years many powerful techniques have emerged to measure protein interactions as well as gene expression. Much progress has been made since the introduction of these techniques, but not toward the quantitative analysis of the data. In this paper we show how to study cellular adaptation and how to detect cellular subpopulations. Moreover, we go deeper into the analysis of signal transduction pathway dynamics.

  7. Qualitative and quantitative analysis of catechin and quercetin in ...

    African Journals Online (AJOL)

    Purpose: To perform a qualitative and quantitative analysis of catechin and quercetin in flavonoids extracted from Rosa roxburghii Tratt. Methods: Total flavonoids were determined using ultraviolet spectrophotometry (UV) at 500 nm. The optimal gradient program started with 15 % methanol and was kept within a period of 0 ...

  8. Quantitative analysis of intermolecular interactions in 2, 2'-((4 ...

    Indian Academy of Sciences (India)

    The relative contributions of various intermolecular contacts in the title compound and its closely related analogs are evaluated using Hirshfeld surface analysis and decomposed fingerprint plots. The common packing features that exist between the title compound and its related analogs are identified. The quantitative ...

  9. Quantitative modelling in design and operation of food supply systems

    NARCIS (Netherlands)

    Beek, van P.

    2004-01-01

    During the last two decades food supply systems not only got interest of food technologists but also from the field of Operations Research and Management Science. Operations Research (OR) is concerned with quantitative modelling and can be used to get insight into the optimal configuration and

  10. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  11. Quantitative Analysis of Complex Drug-Drug Interactions Between Repaglinide and Cyclosporin A/Gemfibrozil Using Physiologically Based Pharmacokinetic Models With In Vitro Transporter/Enzyme Inhibition Data.

    Science.gov (United States)

    Kim, Soo-Jin; Toshimoto, Kota; Yao, Yoshiaki; Yoshikado, Takashi; Sugiyama, Yuichi

    2017-09-01

    Quantitative analysis of transporter- and enzyme-mediated complex drug-drug interactions (DDIs) is challenging. Repaglinide (RPG) is transported into the liver by OATP1B1 and then is metabolized by CYP2C8 and CYP3A4. The purpose of this study was to describe the complex DDIs of RPG quantitatively based on unified physiologically based pharmacokinetic (PBPK) models using in vitro Ki values for OATP1B1, CYP3A4, and CYP2C8. Cyclosporin A (CsA) or gemfibrozil (GEM) increased the blood concentrations of RPG. The time profiles of RPG and the inhibitors were analyzed by PBPK models, considering the inhibition of OATP1B1 and CYP3A4 by CsA, or OATP1B1 inhibition by GEM and its glucuronide and the mechanism-based inhibition of CYP2C8 by GEM glucuronide. The RPG-CsA interaction was closely predicted using a reported in vitro Ki,OATP1B1 value in the presence of CsA preincubation. The RPG-GEM interaction was underestimated compared with observed data, but the simulation was improved with an increase of fm,CYP2C8. These results based on in vitro Ki values for transport and metabolism suggest the possibility of a bottom-up approach with in vitro inhibition data for the prediction of complex DDIs using unified PBPK models, and the in vitro fm value of a substrate for multiple enzymes should be considered carefully for the prediction. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
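
    One elementary building block of such PBPK DDI models is the competitive-inhibition scaling of a clearance term; a minimal sketch with toy numbers (not the paper's parameters):

        def inhibited_cl(cl_baseline, i_u, ki):
            # cl_baseline: clearance without inhibitor (e.g., L/h)
            # i_u: unbound inhibitor concentration at the interaction site
            # ki: in vitro inhibition constant for that pathway
            return cl_baseline / (1.0 + i_u / ki)

        # toy values for hepatic uptake (OATP1B1) and CYP2C8 metabolism
        print(inhibited_cl(100.0, i_u=0.5, ki=0.05))   # strong uptake inhibition
        print(inhibited_cl(50.0, i_u=0.2, ki=2.0))     # weak metabolic inhibition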

  12. Quantitating the subtleties of microglial morphology with fractal analysis

    Science.gov (United States)

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
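
    A minimal box-counting sketch (the binary mask and box sizes are placeholders); the fractal dimension is the negative slope of log(count) against log(box size):

        import numpy as np

        def box_count(img, size):
            # count boxes of side `size` containing at least one foreground pixel
            h, w = img.shape
            img = img[: h - h % size, : w - w % size]
            nh, nw = img.shape
            boxes = img.reshape(nh // size, size, nw // size, size)
            return np.count_nonzero(boxes.any(axis=(1, 3)))

        rng = np.random.default_rng(4)
        mask = rng.random((256, 256)) > 0.7     # placeholder binary cell mask

        sizes = np.array([2, 4, 8, 16, 32])
        counts = np.array([box_count(mask, s) for s in sizes])
        dimension = -np.polyfit(np.log(sizes), np.log(counts), 1)[0]
        print(f"box-counting dimension ~ {dimension:.2f}")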

  13. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  14. Improved method and apparatus for chromatographic quantitative analysis

    Science.gov (United States)

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  15. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  16. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  17. Quantitative Ultrasound and bone's response to exercise: a meta analysis.

    Science.gov (United States)

    Babatunde, O O; Forsyth, J J

    2013-03-01

    The utility of Quantitative Ultrasound (QUS) for assessing and monitoring changes in bone health due to exercise is limited for lack of adequate research evidence. Restrictions to bone density testing and the enduring debate over repeat dual energy absorptiometry testing spell uncertainty over clinical and non-clinical evaluation of exercise for prevention of osteoporosis. This study, via systematic review and meta-analysis, aimed to paint a portrait of current evidence regarding QUS' application to monitoring bone's adaptive response to exercise interventions. A structured and comprehensive search of databases was undertaken along with hand-searching of key journals and reference lists to locate relevant studies published up to December 2011. Twelve articles met predetermined inclusion criteria. The effect of exercise interventions for improving bone health, as measured by QUS of the calcaneum, was examined across the age spectrum. Study outcomes for analysis, absolute (dB/MHz) or relative (%) change in broadband ultrasound attenuation (BUA) and/or os calcis stiffness index, were compared by calculating standardised mean difference (SMD) using fixed- and random-effects models. Quality of included trials varied from low to high on a scale of one to three. Four to 36 months of exercise led to a significant improvement in calcaneum BUA (0.98 SMD, 95% CI 0.80, 1.16, overall effect Z-value=10.72, p=0.001) across the age spectrum. The meta-analysis attests to the sensitivity of QUS to exercise-induced changes in bone health across the age groups. QUS may be considered for use in exercise-based bone health interventions for preventing osteoporosis. Copyright © 2012 Elsevier Inc. All rights reserved.
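
    For reference, the fixed-effect inverse-variance pooling behind such summary SMD estimates looks like this (illustrative numbers, not the review's data):

        import numpy as np

        smd = np.array([0.8, 1.1, 0.9, 1.2])     # per-study standardized mean differences
        se = np.array([0.20, 0.25, 0.15, 0.30])  # their standard errors

        w = 1.0 / se**2                          # inverse-variance weights
        pooled = np.sum(w * smd) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        z = pooled / pooled_se
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"SMD {pooled:.2f}, 95% CI {lo:.2f} to {hi:.2f}, Z = {z:.2f}")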

  18. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
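
    A plug-in (histogram) estimator conveys the idea of transfer entropy from a driver series y to a target series x at unit lag; this is a rough sketch with naive binning, not the authors' estimator:

        import numpy as np

        def transfer_entropy(x, y, bins=4):
            # discretize each series into `bins` amplitude levels
            x = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
            y = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
            x1, x0, y0 = x[1:], x[:-1], y[:-1]

            def prob(*cols):
                joint, _ = np.histogramdd(np.column_stack(cols), bins=bins)
                return joint / joint.sum()

            p_xxy = prob(x1, x0, y0)          # p(x_{t+1}, x_t, y_t)
            p_xy = p_xxy.sum(axis=0)          # p(x_t, y_t)
            p_xx = p_xxy.sum(axis=2)          # p(x_{t+1}, x_t)
            p_x = p_xxy.sum(axis=(0, 2))      # p(x_t)

            te = 0.0
            for i, j, k in np.ndindex(p_xxy.shape):
                if p_xxy[i, j, k] > 0:
                    te += p_xxy[i, j, k] * np.log2(
                        p_xxy[i, j, k] * p_x[j] / (p_xy[j, k] * p_xx[i, j])
                    )
            return te                          # bits per time step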

  19. Quantitative analysis of the neurotransmitters serotonin, dopamine, glutamate and γ-aminobutyric acid in the mouse model of obsessive compulsive disorder

    OpenAIRE

    Schaper, Helge Ascan

    2010-01-01

    Within this study a new genetic mouse model of obsessive compulsive disorder was further established. For this purpose, three groups of mice were examined, which had been bidirectionally selected for thermoregulatory nest-building behavior beforehand. One group emerged with increased nest-building behavior (HA), another group with reduced nest-building behavior (LA), as well as a control group (CA), which shows average values concerning nest-building behavior. Particularly the increased nest-building b...

  20. Quantitative analysis of chromatin accessibility in mouse embryonic fibroblasts.

    Science.gov (United States)

    Zhuo, Baowen; Yu, Juan; Chang, Luyuan; Lei, Jiafan; Wen, Zengqi; Liu, Cuifang; Mao, Guankun; Wang, Kehui; Shen, Jie; Xu, Xueqing

    2017-11-04

    Genomic DNA of eukaryotic cells is hierarchically packaged into chromatin by histones. The dynamic organization of chromatin fibers plays a critical role in the regulation of gene transcription and other DNA-associated biological processes. Recently, numerous approaches have been developed to map chromatin organization by characterizing chromatin accessibility genome-wide. However, reliable methods to quantitatively map chromatin accessibility are not well established, especially on a genome-wide scale. Here, we developed a modified MNase-seq for mouse embryonic fibroblasts, wherein chromatin was partially digested at multiple digestion times using micrococcal nuclease (MNase), allowing quantitative analysis of local yet genome-wide chromatin compaction. Our results provide strong evidence that chromatin accessibility at promoter regions is positively correlated with gene activity. In conclusion, our assay is an ideal tool for the quantitative study of gene regulation from the perspective of chromatin accessibility. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Quantitative and logic modelling of gene and molecular networks

    Science.gov (United States)

    Le Novère, Nicolas

    2015-01-01

    Behaviours of complex biomolecular systems are often irreducible to the elementary properties of their individual components. Explanatory and predictive mathematical models are therefore useful for fully understanding and precisely engineering cellular functions. The development and analyses of these models require their adaptation to the problems that need to be solved and the type and amount of available genetic or molecular data. Quantitative and logic modelling are among the main methods currently used to model molecular and gene networks. Each approach comes with inherent advantages and weaknesses. Recent developments show that hybrid approaches will become essential for further progress in synthetic biology and in the development of virtual organisms. PMID:25645874

  2. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
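
    The arithmetic of a multiplicative epistasis score is simple; a sketch with invented numbers (not data from the paper):

        w_a = 0.80       # phenotype of perturbation a, normalized to wild type
        w_b = 0.90       # phenotype of perturbation b, normalized to wild type
        w_ab = 0.55      # phenotype of the double perturbation

        epsilon = w_ab - w_a * w_b    # deviation from multiplicative expectation
        print(f"epistasis score = {epsilon:+.2f}")  # negative: aggravating interaction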

  3. Quantitative comparisons of analogue models of brittle wedge dynamics

    Science.gov (United States)

    Schreurs, Guido

    2010-05-01

    Analogue model experiments are widely used to gain insights into the evolution of geological structures. In this study, we present a direct comparison of experimental results of 14 analogue modelling laboratories using prescribed set-ups. A quantitative analysis of the results will document the variability among models and will allow an appraisal of reproducibility and limits of interpretation. This has direct implications for comparisons between structures in analogue models and natural field examples. All laboratories used the same frictional analogue materials (quartz and corundum sand) and prescribed model-building techniques (sieving and levelling). Although each laboratory used its own experimental apparatus, the same type of self-adhesive foil was used to cover the base and all the walls of the experimental apparatus in order to guarantee identical boundary conditions (i.e. identical shear stresses at the base and walls). Three experimental set-ups using only brittle frictional materials were examined. In each of the three set-ups the model was shortened by a vertical wall, which moved with respect to the fixed base and the three remaining sidewalls. The minimum width of the model (dimension parallel to mobile wall) was also prescribed. In the first experimental set-up, a quartz sand wedge with a surface slope of ˜20° was pushed by a mobile wall. All models conformed to the critical taper theory, maintained a stable surface slope and did not show internal deformation. In the next two experimental set-ups, a horizontal sand pack consisting of alternating quartz sand and corundum sand layers was shortened from one side by the mobile wall. In one of the set-ups a thin rigid sheet covered part of the model base and was attached to the mobile wall (i.e. a basal velocity discontinuity distant from the mobile wall). In the other set-up a basal rigid sheet was absent and the basal velocity discontinuity was located at the mobile wall. In both types of experiments

  4. Qualitative and quantitative analysis of oropharyngeal swallowing in Down syndrome.

    Science.gov (United States)

    Sales, André Vinicius Marcondes Natel; Giacheti, Célia Maria; Cola, Paula Cristina; Silva, Roberta Gonçalves da

    2017-10-23

    To describe the qualitative and quantitative temporal analysis of oropharyngeal swallowing in children diagnosed with Down syndrome (DS) through a case series study of six individuals aged 4 to 17 months (mean age = 11.16 months; median = 12 months). Qualitative and quantitative temporal analysis of swallowing using videofluoroscopy and specific software. The following parameters were assessed: presence or absence of oral incoordination, labial sphincter sealing incompetence, oral residue, posterior oral spillage, laryngotracheal penetration and aspiration, pharyngeal and total oral transit time (TOTT). Qualitative analysis identified individuals with disorders in at least four of the swallowing parameters investigated. Only one individual presented total oral transit time (TOTT) different from the others. No difference was observed between the cases regarding pharyngeal transit time. Qualitative swallowing disorders are observed in children with DS, with difference in TOTT only in the case report of the youngest infant.

  5. A simple LC-MS/MS method for quantitative analysis of underivatized neurotransmitters in rats urine: assay development, validation and application in the CUMS rat model.

    Science.gov (United States)

    Zhai, Xue-jia; Chen, Fen; Zhu, Chao-ran; Lu, Yong-ning

    2015-11-01

    Many amino acid neurotransmitters in urine are associated with chronic stress as well as major depressive disorders. To better understand depression, an analytical LC-MS/MS method for the simultaneous determination of 11 underivatized neurotransmitters (4-aminohippurate, 5-HIAA, glutamate, glutamine, hippurate, pimelate, proline, tryptophan, tyramine, tyrosine and valine) in a single analytical run was developed. The advantage of this method is the simple preparation in that there is no need to deconjugate the urine samples. The quantification range was 25-12,800 ng mL⁻¹ with >85.8% recovery for all analytes. The nocturnal urine concentrations of the 11 neurotransmitters in chronic unpredictable mild stress (CUMS) model rats and control group (n = 12) were analyzed. A series of significant changes in urinary excretion of neurotransmitters could be detected: the urinary glutamate, glutamine, hippurate and tyramine concentrations were significantly lower in the CUMS group. In addition, the urinary concentrations of tryptophan as well as tyrosine were significantly higher in chronically stressed rats. This method allows the assessment of the neurotransmitters associated with CUMS in rat urine in a single analytical run, making it suitable for implementation as a routine technique in depression research. Copyright © 2015 John Wiley & Sons, Ltd.
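
    The quantitation step in such assays reduces to a calibration-curve fit and back-calculation; a minimal sketch (response values are placeholders, and real assays typically use weighted regression against an internal standard):

        import numpy as np

        conc = np.array([25, 100, 400, 1600, 6400, 12800])   # ng/mL standards
        response = 0.002 * conc + 0.01                       # placeholder peak-area ratios

        slope, intercept = np.polyfit(conc, response, 1)
        unknown_response = 1.25
        print((unknown_response - intercept) / slope, "ng/mL")  # back-calculated conc.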

  6. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
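
    Numerically, crossing a PSHA hazard curve with a component fragility can be sketched as below (the hazard curve, fragility median and dispersion are all invented):

        import numpy as np
        from scipy.stats import lognorm

        pga = np.linspace(0.01, 2.0, 400)              # peak ground acceleration (g)
        annual_exceed = 1e-2 * (pga / 0.1) ** -2.0     # toy hazard curve lambda(PGA)

        median, beta = 0.6, 0.5                        # assumed tank fragility
        p_fail = lognorm(s=beta, scale=median).cdf(pga)   # P(failure | PGA)

        occurrence = -np.gradient(annual_exceed, pga)  # occurrence density of PGA
        annual_p_fail = np.trapz(p_fail * occurrence, pga)
        print(f"annual failure probability ~ {annual_p_fail:.2e}")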

  7. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM clearly reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
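
    The GLCM step can be sketched with scikit-image, assumed here as the toolkit (recent releases spell the functions graycomatrix/graycoprops; older ones use greycomatrix/greycoprops), applied to a placeholder IMF1 image:

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(5)
        imf1 = rng.integers(0, 256, (128, 128), dtype=np.uint8)  # placeholder IMF1

        glcm = graycomatrix(imf1, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        for feature in ("contrast", "correlation", "energy", "homogeneity"):
            print(feature, graycoprops(glcm, feature).mean())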

  8. Quantitative analysis and comparison study of [18F]AlF-NOTA-PRGD2, [18F]FPPRGD2 and [68Ga]Ga-NOTA-PRGD2 using a reference tissue model.

    Directory of Open Access Journals (Sweden)

    Ning Guo

    Full Text Available With favorable pharmacokinetics and binding affinity for αvβ3 integrin, the 18F-labeled dimeric cyclic RGD peptide ([18F]FPPRGD2) has been intensively used as a PET imaging probe for lesion detection and therapy response monitoring. A recently introduced kit formulation method, which uses an 18F-fluoride-aluminum complex labeled RGD tracer ([18F]AlF-NOTA-PRGD2), provides a strategy for simplifying the labeling procedure to facilitate clinical translation. Meanwhile, an easy-to-prepare 68Ga-labeled NOTA-PRGD2 has also been reported to have promising properties for imaging integrin αvβ3. The purpose of this study is to quantitatively compare the pharmacokinetic parameters of [18F]FPPRGD2, [18F]AlF-NOTA-PRGD2, and [68Ga]Ga-NOTA-PRGD2. U87MG tumor-bearing mice underwent 60-min dynamic PET scans following the injection of the three tracers. Kinetic parameters were calculated using Logan graphical analysis with reference tissue. Parametric maps were generated using voxel-level modeling. All three compounds showed high binding potential (BPND = k3/k4) in tumor voxels. [18F]AlF-NOTA-PRGD2 showed a BPND value (3.75±0.65) comparable with those of [18F]FPPRGD2 (3.39±0.84) and [68Ga]Ga-NOTA-PRGD2 (3.09±0.21) (p>0.05). Little difference was found in the volume of distribution (VT) among these three RGD tracers in tumor, liver and muscle. Parametric maps showed similar kinetic parameters for all three tracers. We also demonstrated that the impact of non-specific binding could be eliminated in the kinetic analysis. Consequently, kinetic parameter estimation showed more comparable results among groups than static image analysis. In conclusion, [18F]AlF-NOTA-PRGD2 and [68Ga]Ga-NOTA-PRGD2 have pharmacokinetics and quantitative parameters comparable to those of [18F]FPPRGD2. Despite the apparent difference in tumor uptake (%ID/g) determined from static images and clearance pattern, the actual specific binding component extrapolated from kinetic ...
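
    In its simplified form (neglecting the reference-region efflux term k2'), Logan graphical analysis with a reference tissue reduces to a late-time linear fit whose slope is the distribution volume ratio (DVR), with BPND = DVR - 1; a sketch with synthetic time-activity curves:

        import numpy as np
        from scipy.integrate import cumulative_trapezoid  # cumtrapz in old SciPy

        t = np.linspace(0.0, 60.0, 61)            # minutes
        c_ref = t * np.exp(-t / 30.0)             # placeholder reference-region TAC
        c_tum = 2.5 * t * np.exp(-t / 40.0)       # placeholder tumour TAC

        int_ref = cumulative_trapezoid(c_ref, t, initial=0)
        int_tum = cumulative_trapezoid(c_tum, t, initial=0)

        late = t > 20                             # quasi-linear part of the plot
        x = int_ref[late] / c_tum[late]
        y = int_tum[late] / c_tum[late]
        dvr = np.polyfit(x, y, 1)[0]              # slope of the Logan plot
        print(f"DVR = {dvr:.2f}, BPND = {dvr - 1:.2f}")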

  9. Learning from Past Classification Errors: Exploring Methods for Improving the Performance of a Deep Learning-based Building Extraction Model through Quantitative Analysis of Commission Errors for Optimal Sample Selection

    Science.gov (United States)

    Swan, B.; Laverdiere, M.; Yang, L.

    2017-12-01

    In the past five years, deep Convolutional Neural Networks (CNN) have been increasingly favored for computer vision applications due to their high accuracy and ability to generalize well in very complex problems; however, details of how they function and in turn how they may be optimized are still imperfectly understood. In particular, their complex and highly nonlinear network architecture, including many hidden layers and self-learned parameters, as well as their mathematical implications, presents open questions about how to effectively select training data. Without knowledge of the exact ways the model processes and transforms its inputs, intuition alone may fail as a guide to selecting highly relevant training samples. Working in the context of improving a CNN-based building extraction model used for the LandScan USA gridded population dataset, we have approached this problem by developing a semi-supervised, highly scalable approach to select training samples from a dataset of identified commission errors. Due to the large scope of this project, tens of thousands of potential samples could be derived from identified commission errors. To efficiently trim those samples down to a manageable and effective set for creating additional training samples, we statistically summarized the spectral characteristics of areas with high rates of commission errors at the image tile level and grouped these tiles using affinity propagation. Highly representative members of each commission error cluster were then used to select sites for training sample creation. The model will be incrementally re-trained with the new training data to allow for an assessment of how the addition of different types of samples affects the model performance, such as precision and recall rates. By using quantitative analysis and data clustering techniques to select highly relevant training samples, we hope to improve model performance in a manner that is resource efficient, both in terms of training process
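
    The clustering step described can be sketched with scikit-learn's AffinityPropagation, whose exemplars directly supply "highly representative members" of each cluster (the tile statistics below are random placeholders):

        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(6)
        tile_stats = rng.random((300, 6))   # e.g., per-tile band means/variances

        ap = AffinityPropagation(random_state=0).fit(tile_stats)
        exemplars = ap.cluster_centers_indices_   # one representative tile per cluster
        print(len(exemplars), "clusters; exemplar tiles:", exemplars[:10])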

  10. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  11. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  12. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  13. Fumonisin B1 toxicity in grower-finisher pigs: a comparative analysis of genetically engineered Bt corn and non-Bt corn by using quantitative dietary exposure assessment modeling.

    Science.gov (United States)

    Delgado, James E; Wolt, Jeffrey D

    2011-08-01

    In this study, we investigate the long-term exposure (20 weeks) to fumonisin B1 (FB1) in grower-finisher pigs by conducting a quantitative exposure assessment (QEA). Our analytical approach involved both deterministic and semi-stochastic modeling for dietary comparative analyses of FB1 exposures originating from genetically engineered Bacillus thuringiensis (Bt)-corn, conventional non-Bt corn and distiller's dried grains with solubles (DDGS) derived from Bt and/or non-Bt corn. Results from both deterministic and semi-stochastic models demonstrated a distinct difference in FB1 toxicity in feed between Bt corn and non-Bt corn. Semi-stochastic results predicted the lowest FB1 exposure for a diet of Bt grain, with a mean of 1.5 mg FB1/kg diet, and the highest FB1 exposure for a diet consisting of non-Bt grain and non-Bt DDGS, with a mean of 7.87 mg FB1/kg diet; the chronic toxicological incipient level of concern is 1.0 mg of FB1/kg of diet. Deterministic results closely mirrored, but tended to slightly under-predict, the mean result of the semi-stochastic analysis. This novel comparative QEA model reveals that diet scenarios in which the source of grain is derived from Bt corn present less potential to induce FB1 toxicity than diets containing non-Bt corn.
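
    The deterministic-versus-semi-stochastic contrast can be illustrated by propagating an assumed FB1 concentration distribution through a one-line diet model (all numbers invented, not the study's inputs):

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000
        conc_corn = rng.lognormal(mean=np.log(1.2), sigma=0.6, size=n)  # mg FB1/kg corn
        frac_corn = 0.65                     # assumed corn share of the diet

        exposure = conc_corn * frac_corn     # mg FB1 per kg complete diet
        print(f"deterministic point estimate: {1.2 * frac_corn:.2f} mg/kg")
        print(f"semi-stochastic mean: {exposure.mean():.2f} mg/kg, "
              f"95th percentile: {np.percentile(exposure, 95):.2f} mg/kg")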

  14. Fumonisin B1 Toxicity in Grower-Finisher Pigs: A Comparative Analysis of Genetically Engineered Bt Corn and non-Bt Corn by Using Quantitative Dietary Exposure Assessment Modeling

    Directory of Open Access Journals (Sweden)

    Jeffrey D. Wolt

    2011-07-01

    Full Text Available In this study, we investigate the long-term exposure (20 weeks) to fumonisin B1 (FB1) in grower-finisher pigs by conducting a quantitative exposure assessment (QEA). Our analytical approach involved both deterministic and semi-stochastic modeling for dietary comparative analyses of FB1 exposures originating from genetically engineered Bacillus thuringiensis (Bt)-corn, conventional non-Bt corn and distiller's dried grains with solubles (DDGS) derived from Bt and/or non-Bt corn. Results from both deterministic and semi-stochastic models demonstrated a distinct difference in FB1 toxicity in feed between Bt corn and non-Bt corn. Semi-stochastic results predicted the lowest FB1 exposure for a diet of Bt grain, with a mean of 1.5 mg FB1/kg diet, and the highest FB1 exposure for a diet consisting of non-Bt grain and non-Bt DDGS, with a mean of 7.87 mg FB1/kg diet; the chronic toxicological incipient level of concern is 1.0 mg of FB1/kg of diet. Deterministic results closely mirrored, but tended to slightly under-predict, the mean result of the semi-stochastic analysis. This novel comparative QEA model reveals that diet scenarios in which the source of grain is derived from Bt corn present less potential to induce FB1 toxicity than diets containing non-Bt corn.

  15. Quantitative comparison of canopy conductance models using a Bayesian approach

    Science.gov (United States)

    Samanta, S.; Clayton, M. K.; Mackay, D. S.; Kruger, E. L.; Ewers, B. E.

    2008-09-01

    A quantitative model comparison methodology based on deviance information criterion, a Bayesian measure of the trade-off between model complexity and goodness of fit, is developed and demonstrated by comparing semiempirical transpiration models. This methodology accounts for parameter and prediction uncertainties associated with such models and facilitates objective selection of the simplest model, out of available alternatives, which does not significantly compromise the ability to accurately model observations. We use this methodology to compare various Jarvis canopy conductance model configurations, embedded within a larger transpiration model, against canopy transpiration measured by sap flux. The results indicate that descriptions of the dependence of stomatal conductance on vapor pressure deficit, photosynthetic radiation, and temperature, as well as the gradual variation in canopy conductance through the season are essential in the transpiration model. Use of soil moisture was moderately significant, but only when used with a hyperbolic vapor pressure deficit relationship. Subtle differences in model quality could be clearly associated with small structural changes through the use of this methodology. The results also indicate that increments in model complexity are not always accompanied by improvements in model quality and that such improvements are conditional on model structure. Possible application of this methodology to compare complex semiempirical models of natural systems in general is also discussed.
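
    For reference, the deviance information criterion compared here is computable directly from posterior samples; a minimal sketch (toy log-likelihood values):

        import numpy as np

        def dic(loglik_draws, loglik_at_mean):
            # loglik_draws: log-likelihood at each posterior draw of the parameters
            # loglik_at_mean: log-likelihood at the posterior-mean parameters
            d_bar = -2.0 * np.mean(loglik_draws)   # posterior mean deviance
            p_d = d_bar - (-2.0 * loglik_at_mean)  # effective number of parameters
            return d_bar + p_d                     # lower DIC = better trade-off

        rng = np.random.default_rng(8)
        print(dic(rng.normal(-520.0, 3.0, 4000), -518.0))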

  16. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases

    NARCIS (Netherlands)

    T.D. Hollingsworth (T. Déirdre); E.R. Adams (Emily R.); R.M. Anderson (Roy); K. Atkins (Katherine); S. Bartsch (Sarah); M-G. Basáñez (María-Gloria); M. Behrend (Matthew); D.J. Blok (David); L.A.C. Chapman (Lloyd A. C.); L.E. Coffeng (Luc); O. Courtenay (Orin); R.E. Crump (Ron E.); S.J. de Vlas (Sake); A.P. Dobson (Andrew); L. Dyson (Louise); H. Farkas (Hajnal); A.P. Galvani (Alison P.); M. Gambhir (Manoj); D. Gurarie (David); M.A. Irvine (Michael A.); S. Jervis (Sarah); M.J. Keeling (Matt J.); L. Kelly-Hope (Louise); C. King (Charles); B.Y. Lee (Bruce Y.); E.A. le Rutte (Epke); T.M. Lietman (Thomas M.); M. Ndeffo-Mbah (Martial); G.F. Medley (Graham F.); E. Michael (Edwin); A. Pandey (Abhishek); J.K. Peterson (Jennifer K.); A. Pinsent (Amy); T.C. Porco (Travis C.); J.H. Richardus (Jan Hendrik); L. Reimer (Lisa); K.S. Rock (Kat S.); B.K. Singh (Brajendra K.); W.A. Stolk (Wilma); S. Swaminathan (Subramanian); S.J. Torr (Steve J.); J. Townsend (Jeffrey); J. Truscott (James); M. Walker (Martin); A. Zoueva (Alexandra)

    2015-01-01

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an

  17. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Full Text Available Over the last two decades, food logistics systems have seen the transition from a focus on traditional supply chain management to food supply chain management, and successively, to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and to analyse the currently available quantitative models in order to point out modelling challenges in sustainable food logistics management (SFLM). A literature review on quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system scope issues. Results show that research on SFLM has been progressively developing according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies, and the majority of the works reviewed have not addressed sustainability problems, apart from a few recent studies. The study therefore concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration to support business decisions and capture food supply chain dynamics.

  18. A Transformative Model for Undergraduate Quantitative Biology Education

    Science.gov (United States)

    Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions. PMID:20810949

  19. A transformative model for undergraduate quantitative biology education.

    Science.gov (United States)

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  20. Towards Quantitative Systems Pharmacology Models of Chemotherapy-Induced Neutropenia.

    Science.gov (United States)

    Craig, M

    2017-05-01

    Neutropenia is a serious toxic complication of chemotherapeutic treatment. For years, mathematical models have been developed to better predict hematological outcomes during chemotherapy in both the traditional pharmaceutical sciences and mathematical biology disciplines. An increasing number of quantitative systems pharmacology (QSP) models that combine systems approaches, physiology, and pharmacokinetics/pharmacodynamics have been successfully developed. Here, I detail the shift towards QSP efforts, emphasizing the importance of incorporating systems-level physiological considerations in pharmacometrics. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  1. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since the method provides non-contact, full-field inspection of the test object. However, its application has so far been largely limited to qualitative analysis; the current trend is toward quantitative analysis, which attempts to characterize the examined defect in detail and which must be designed to cover the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. Differences between the measurement systems can be associated with these error factors. (Author)

  2. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of a subsurface anomaly using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and therefore yield only finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution and allow the finest subsurface features to be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest possible estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
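
    To make the spectral-zooming idea concrete, here is a minimal sketch (not the authors' implementation) using SciPy's chirp z-transform based ZoomFFT (SciPy >= 1.8); the sampling rate, tone frequencies, and zoom band are invented for illustration:

```python
import numpy as np
from scipy.signal import ZoomFFT  # chirp z-transform helper, SciPy >= 1.8

fs = 100.0                        # sampling rate (Hz), invented for the demo
t = np.arange(0, 20, 1 / fs)      # 2000 samples
# Two closely spaced tones standing in for thermal-response components.
x = np.sin(2 * np.pi * 1.00 * t) + np.sin(2 * np.pi * 1.07 * t)

# Plain FFT: bin spacing is fs / N over the whole 0..fs/2 band.
print(f"FFT bin spacing:  {fs / len(x):.4f} Hz")

# Chirp z-transform zoom: spend m bins on the 0.8-1.3 Hz band only.
m = 2048
f1, f2 = 0.8, 1.3
zoom = ZoomFFT(len(x), [f1, f2], m, fs=fs)
X = zoom(x)
f_zoom = f1 + np.arange(m) * (f2 - f1) / m
print(f"Zoom bin spacing: {(f2 - f1) / m:.6f} Hz")
print(f"Strongest zoom bin at {f_zoom[np.argmax(np.abs(X))]:.3f} Hz")
```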

  3. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    Science.gov (United States)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid, and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis, respectively. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates of the three classes of 30% wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
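
    For readers unfamiliar with the modeling step, the following is a minimal, hypothetical sketch of fitting a PLS regression to spectra and evaluating it on a validation split; the data are synthetic and the component count is arbitrary, not taken from the study:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(54, 200))                       # 54 samples x 200 spectral variables
y = 2.0 * X[:, 50] + rng.normal(scale=0.5, size=54)  # toy "wet gluten" response

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)  # component count would be tuned in practice
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

print(f"R2V   = {r2_score(y_val, y_pred):.2f}")
print(f"RMSEV = {mean_squared_error(y_val, y_pred) ** 0.5:.2f}")
```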

  4. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  5. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of radioisotopes or labeled compounds in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure, the IP is scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd), and a digital autoradiographic image is produced. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)

  6. Quantitative 3D analysis of huge nanoparticle assemblies.

    Science.gov (United States)

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M; van Blaaderen, Alfons; Batenburg, K Joost; Bals, Sara; Van Tendeloo, Gustaaf

    2016-01-07

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10,000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed.

  7. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  8. Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.

    Science.gov (United States)

    Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse

    2017-01-01

    Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the estimated gestational age by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
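
    As a rough, hypothetical illustration of the feature-extraction-plus-regression pipeline described above (the study's actual features and model are not specified in this abstract), one might compute gray-level co-occurrence texture features from a region of interest with scikit-image and regress gestational age on them:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19 naming
from sklearn.linear_model import LinearRegression

def texture_features(roi_u8):
    """Contrast/homogeneity/energy/correlation from a GLCM of an 8-bit ROI."""
    glcm = graycomatrix(roi_u8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy", "correlation")]

rng = np.random.default_rng(1)
# Synthetic stand-ins: 30 cervical ROIs and their gestational ages (weeks).
rois = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(30)]
ages = rng.uniform(20, 42, size=30)

X = np.array([texture_features(r) for r in rois])
model = LinearRegression().fit(X, ages)
print("Predicted GA for first ROI:", model.predict(X[:1])[0])
```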

  9. Quantitative insight into models of Hedgehog signal transduction.

    Science.gov (United States)

    Farzan, Shohreh F; Ogden, Stacey K; Robbins, David J

    2010-01-01

    The Hedgehog (Hh) signaling pathway is an essential regulator of embryonic development and a key factor in carcinogenesis.(1,2) Hh, a secreted morphogen, activates intracellular signaling events via downstream effector proteins, which translate the signal to regulate target gene transcription.(3,4) In a recent publication, we quantitatively compared two commonly accepted models of Hh signal transduction.(5) Each model requires a different ratio of signaling components to be feasible. Thus, we hypothesized that knowing the steady-state ratio of core signaling components might allow us to distinguish between models. We reported vast differences in the molar concentrations of endogenous effectors of Hh signaling, with Smo present in limiting concentrations.(5) This extra view summarizes the implications of this endogenous ratio in relation to current models of Hh signaling and places our results in the context of recent work describing the involvement of the guanine nucleotide binding protein Gαi and Cos2 motility.

  10. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s, which strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims to support and orient the development of emergency planning using the data obtained from Quantitative Risk Analysis elaborated according to Technical Standard P4.261/03 from CETESB (the Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis so that they can be used in emergency plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for emergency plans, consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from Quantitative Risk Analysis to develop emergency plans. (author)

  11. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm … quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. CONCLUSION: The presented method is amply described as a prestain multicomponent quantitation…

  12. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
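
    The sub-model idea lends itself to a compact sketch. Below is a hedged, illustrative implementation (not the ChemCam calibration itself): separate PLS regressions are trained on low- and high-concentration subsets plus the full range, and predictions are blended based on the full-range model's first guess; the composition ranges and data are invented:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 120))                    # synthetic "LIBS spectra"
y = np.abs(5.0 * X[:, 10] + rng.normal(size=300))  # synthetic composition (wt%)

full = PLSRegression(n_components=6).fit(X, y)     # full-range model
lo_m, hi_m = y < 3.0, y >= 3.0                     # arbitrary training ranges
lo = PLSRegression(n_components=6).fit(X[lo_m], y[lo_m])
hi = PLSRegression(n_components=6).fit(X[hi_m], y[hi_m])

def blended_predict(x_new, lo_edge=2.0, hi_edge=4.0):
    """Route/blend sub-model predictions using the full model's first guess."""
    x_new = x_new.reshape(1, -1)
    first = full.predict(x_new).item()
    if first <= lo_edge:
        return lo.predict(x_new).item()
    if first >= hi_edge:
        return hi.predict(x_new).item()
    w = (first - lo_edge) / (hi_edge - lo_edge)    # linear blend in the overlap
    return (1 - w) * lo.predict(x_new).item() + w * hi.predict(x_new).item()

print(f"Blended estimate for sample 0: {blended_predict(X[0]):.2f} wt%")
```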

  13. A quantitative analysis of IRAS maps of molecular clouds

    Science.gov (United States)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
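
    To make the pseudometric idea concrete, here is a small, hypothetical example of a distance function on maps: the L1 distance between the normalized density histograms of two column-density maps. The binning and maps are invented; the paper's actual output functions are richer:

```python
import numpy as np

def density_histogram_distance(map_a, map_b, bins=32):
    """Pseudometric on maps: L1 distance between normalized density histograms.

    Maps with identical density distributions get distance 0 even if they
    differ pixel-by-pixel, which is what makes this a pseudometric rather
    than a true metric.
    """
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    return np.abs(ha - hb).sum() / bins

rng = np.random.default_rng(3)
cloud1 = rng.lognormal(mean=0.0, sigma=0.5, size=(128, 128))  # toy column density
cloud2 = rng.lognormal(mean=0.2, sigma=0.8, size=(128, 128))
print(f"d(cloud1, cloud2) = {density_histogram_distance(cloud1, cloud2):.3f}")
print(f"d(cloud1, cloud1) = {density_histogram_distance(cloud1, cloud1):.3f}")
```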

  14. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  15. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
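
    A hedged sketch of the bootstrap construction referred to above (the paper's algorithm for objectively choosing the bootstrap sample size is not reproduced here; the sample size below is a free parameter): prey signatures are resampled in proportion to a known diet and averaged into a pseudo-predator signature:

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Build one pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict mapping prey type -> (n_samples, n_fatty_acids) array,
               each row summing to 1.
    diet:      dict mapping prey type -> diet proportion (sums to 1).
    n_boot:    bootstrap sample size per prey type (arbitrary here; the
               cited algorithm selects it to yield realistic variance).
    """
    parts = []
    for prey, prop in diet.items():
        sigs = prey_sigs[prey]
        idx = rng.integers(0, len(sigs), size=n_boot)   # bootstrap resample
        parts.append(prop * sigs[idx].mean(axis=0))
    sig = np.sum(parts, axis=0)
    return sig / sig.sum()                              # renormalize

rng = np.random.default_rng(4)
prey_sigs = {p: rng.dirichlet(np.ones(10), size=40) for p in ("seal", "fish")}
diet = {"seal": 0.7, "fish": 0.3}
print(pseudo_predator(prey_sigs, diet, n_boot=25, rng=rng).round(3))
```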

  16. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; in particular, emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  17. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established before in an extensive work on thick and intermediate samples were employed. For these samples, the excitation functions of nuclear reactions, induced by the incident protons on target's light elements, were used as input for a code that evaluates the gamma-ray yield integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with sodium values, PIXE results related to chlorine are also presented, giving support to the reliability of this PIGE method for thin film analysis
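
    The gamma-ray yield integration described here can be sketched numerically. Below is a hedged toy version of the thick-target yield relation, with Y(E0) proportional to the integral of sigma(E)/S(E) from 0 to E0, using an invented cross section and stopping power; the real code uses measured excitation functions:

```python
import numpy as np
from scipy.integrate import trapezoid

def thick_target_yield(e0, sigma, stopping, n=2000):
    """Relative gamma yield: integrate sigma(E)/S(E) from ~0 up to E0 (keV)."""
    E = np.linspace(1e-3, e0, n)          # avoid E = 0 where S(E) diverges
    return trapezoid(sigma(E) / stopping(E), E)

# Invented smooth stand-ins for a measured excitation function and the
# stopping power of the matrix (arbitrary units).
sigma = lambda E: np.exp(-((E - 1200.0) / 300.0) ** 2)   # resonance-like bump
stopping = lambda E: 50.0 / np.sqrt(E)                   # decreases with energy

for e0 in (1000.0, 1500.0, 2000.0):
    y = thick_target_yield(e0, sigma, stopping)
    print(f"E0 = {e0:4.0f} keV -> relative yield {y:8.1f}")
```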

  18. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample

  19. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the work to a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs; the role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint [it
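
    As a hedged illustration of this kind of cross-section cost estimation (the variables and data below are invented, not ENEL's), a log-log cost regression makes the scale-economy test concrete: an output elasticity of cost below one indicates economies of scale:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 147                               # one observation per distribution zone
output = rng.lognormal(3.0, 1.0, n)   # energy delivered, invented
area = rng.lognormal(2.0, 0.5, n)     # zone area, invented cost driver
cost = output ** 0.8 * area ** 0.1 * rng.lognormal(0.0, 0.1, n)

# Cobb-Douglas cost function: log(cost) = a + b*log(output) + c*log(area).
X = sm.add_constant(np.column_stack([np.log(output), np.log(area)]))
fit = sm.OLS(np.log(cost), X).fit()
b_output = fit.params[1]
print(f"Output elasticity of cost: {b_output:.2f}")
print("Scale economies" if b_output < 1 else "No scale economies")
```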

  20. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly those employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc. Rapid Commun. Mass Spec Rev 34:148–165, 2015. PMID:24889823

  2. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures carrying adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres, selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy; in this way the experiments are compared with the enhancement factor calculated from near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, enabling a new analytical method in the field of SERS and plasmonics.

  3. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed together with tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044) and with the "central" vs. "lateral" radiological designation (p = .035), and were marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.
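
    A hedged sketch of the clustering step (the coordinates below are invented; the study clustered normalized tumor centroid locations and morphology variables): SciPy's hierarchical clustering can split centroid locations into two groups:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)
# Invented tumor centroids in a common, landmark-aligned coordinate space:
# one group near the midline and one lateral group.
medial = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.3, size=(20, 3))
lateral = rng.normal(loc=[1.5, 0.2, 0.1], scale=0.3, size=(18, 3))
centroids = np.vstack([medial, lateral])

Z = linkage(centroids, method="ward")   # agglomerative, Ward's criterion
labels = fcluster(Z, t=2, criterion="maxclust")
print("Cluster sizes:", np.bincount(labels)[1:])
```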

  4. Quantitative analysis of peel-off degree for printed electronics

    Science.gov (United States)

    Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo

    2018-02-01

    We suggest a facile methodology for evaluating the peel-off degree of printed electronics by image processing. The quantification of peeled and printed areas was performed using open source programs. To verify the accuracy of the method, we manually removed areas from a printed circuit that was then measured, resulting in 96.3% accuracy. The sintered patterns showed a decreasing tendency with increasing energy density of the infrared lamp, and the peel-off degree increased accordingly; a comparison between the two results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
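
    A minimal, hypothetical version of such a peel-off quantification (the study used open source programs; the threshold and images here are invented): binarize before/after images of the printed pattern and report the peeled fraction as a pixel-area ratio:

```python
import numpy as np

def peel_off_degree(printed, after_peel, threshold=0.5):
    """Fraction of the printed area lost after peeling.

    printed, after_peel: grayscale images in [0, 1] of the pattern before
    and after the peel test; pixels above `threshold` count as ink.
    """
    ink_before = printed > threshold
    ink_after = after_peel > threshold
    peeled = ink_before & ~ink_after
    return peeled.sum() / ink_before.sum()

rng = np.random.default_rng(7)
before = (rng.random((200, 200)) > 0.4).astype(float)   # toy printed pattern
damage = rng.random((200, 200)) < 0.1                   # ~10% of pixels peeled
after = np.where(damage, 0.0, before)
print(f"Peel-off degree: {peel_off_degree(before, after):.1%}")
```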

  5. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    in a number of work cases. These mainly fall into three categories: (i) Description of coarse-scale measures to quantify surface structure or texture (topography); (ii) Characterization of fracture surfaces in steels (fractography); (iii) Grain boundary segmentation in sintered ceramics. The theoretical … foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions…

  6. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Full Text Available Quantitative modeling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data describing the system's dynamics need to be known in order to obtain relevant results with conventional modeling techniques, and such data are often hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modeling approach that can cope with unknown kinetic data and thus produce relevant results even though the dynamic data are incomplete or only vaguely defined. Moreover, the methodology can be combined with existing state-of-the-art quantitative modeling techniques in just those parts of the system where the data are absent. The case study of the methodology proposed in this paper is performed on a nine-gene network model. We propose a kind of fuzzy Petri net (FPN) model based on fuzzy sets to handle the quantitative modeling of biological systems. Tests of our model show that it is practical and quite powerful for data imitation and for reasoning in fuzzy expert systems.
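
    To give a flavor of the fuzzy machinery involved (a generic Mamdani-style sketch, not the paper's fuzzy Petri net formalism; the membership functions and rule are invented), consider one rule mapping a regulator's expression level to a target's activation:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Rule: IF regulator expression is HIGH THEN target activation is HIGH.
expr = 0.72                              # normalized expression level
mu_high_expr = tri(expr, 0.5, 1.0, 1.5)  # degree to which the input is "high"

# Mamdani inference: clip the output fuzzy set at the rule's firing
# strength, then defuzzify by centroid over a discretized universe.
u = np.linspace(0.0, 1.0, 201)
out_high = np.minimum(tri(u, 0.5, 1.0, 1.5), mu_high_expr)
activation = (u * out_high).sum() / out_high.sum()
print(f"Rule fires at {mu_high_expr:.2f}; inferred activation {activation:.2f}")
```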

  7. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment is an advance of modern biotechnology that has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment and must be administered continuously and without interruption. Enzyme replacement therapy with Cerezyme (Genzyme) was formally introduced in Bulgaria in 2001, but it was later interrupted for 1-2 months at a time, and patients' doses were not optimal. The aim of our work is to find a mathematical model for the quantitative evaluation of ERT in Gaucher disease. The model is implemented in the software "Statistika 6" using the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model makes possible a quantitative evaluation of the individual trends in the development of each child's disease and their correlations. On the basis of these results, we can recommend suitable changes in ERT.

  8. Quantitative aspects and dynamic modelling of glucosinolate metabolism

    DEFF Research Database (Denmark)

    Vik, Daniel

    and ecologically important glucosinolate (GLS) compounds of cruciferous plants – including the model plant Arabidopsis thaliana – have been studied extensively with regard to their biosynthesis and degradation. However, efforts to construct a dynamic model unifying the regulatory aspects have not been made… Advancements in ‘omics technologies now allow acquisition of enormous amounts of quantitative information about biomolecules. This has led to the emergence of new scientific sub-disciplines, e.g. computational, systems and ‘quantitative’ biology. These disciplines examine complex biological … behaviour through computational and mathematical approaches and have resulted in substantial insights and advances in molecular biology and physiology. Capitalizing on the accumulated knowledge and data, it is possible to construct dynamic models of complex biological systems, thereby initiating the so…

  9. Quantifying Zika: Advancing the Epidemiology of Zika With Quantitative Models.

    Science.gov (United States)

    Keegan, Lindsay T; Lessler, Justin; Johansson, Michael A

    2017-12-16

    When Zika virus (ZIKV) emerged in the Americas, little was known about its biology, pathogenesis, and transmission potential, and the scope of the epidemic was largely hidden, owing to generally mild infections and no established surveillance systems. Surges in congenital defects and Guillain-Barré syndrome alerted the world to the danger of ZIKV. In the context of limited data, quantitative models were critical in reducing uncertainties and guiding the global ZIKV response. Here, we review some of the models used to assess the risk of ZIKV-associated severe outcomes, the potential speed and size of ZIKV epidemics, and the geographic distribution of ZIKV risk. These models provide important insights and highlight significant unresolved questions related to ZIKV and other emerging pathogens. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  10. Detection of Prostate Cancer: Quantitative Multiparametric MR Imaging Models Developed Using Registered Correlative Histopathology.

    Science.gov (United States)

    Metzger, Gregory J; Kalavagunta, Chaitanya; Spilseth, Benjamin; Bolan, Patrick J; Li, Xiufeng; Hutter, Diane; Nam, Jung W; Johnson, Andrew D; Henriksen, Jonathan C; Moench, Laura; Konety, Badrinath; Warlick, Christopher A; Schmechel, Stephen C; Koopmeiners, Joseph S

    2016-06-01

    Purpose To develop multiparametric magnetic resonance (MR) imaging models to generate a quantitative, user-independent, voxel-wise composite biomarker score (CBS) for detection of prostate cancer by using coregistered correlative histopathologic results, and to compare performance of CBS-based detection with that of single quantitative MR imaging parameters. Materials and Methods Institutional review board approval and informed consent were obtained. Patients with a diagnosis of prostate cancer underwent multiparametric MR imaging before surgery for treatment. All MR imaging voxels in the prostate were classified as cancer or noncancer on the basis of coregistered histopathologic data. Predictive models were developed by using more than one quantitative MR imaging parameter to generate CBS maps. Model development and evaluation of quantitative MR imaging parameters and CBS were performed separately for the peripheral zone and the whole gland. Model accuracy was evaluated by using the area under the receiver operating characteristic curve (AUC), and confidence intervals were calculated with the bootstrap procedure. The improvement in classification accuracy was evaluated by comparing the AUC for the multiparametric model and the single best-performing quantitative MR imaging parameter at the individual level and in aggregate. Results Quantitative T2, apparent diffusion coefficient (ADC), volume transfer constant (K(trans)), reflux rate constant (kep), and area under the gadolinium concentration curve at 90 seconds (AUGC90) were significantly different between cancer and noncancer voxels. Multiparametric models demonstrated the best performance in both the peripheral zone (AUC, 0.85; P = .010 vs ADC alone) and whole gland (AUC, 0.77; P = .043 vs ADC alone). Individual-level analysis showed statistically significant improvement in AUC in 82% (23 of 28) and 71% (24 of 34) of patients with peripheral-zone and whole-gland models, respectively, compared with ADC alone. Model-based CBS
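
    A hedged, schematic version of producing a voxel-wise composite biomarker score (the published models' parameters are not reproduced here; the features and labels below are synthetic): fit a logistic model on voxels labeled by registered histopathology and read its probability output as the score:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n_vox = 5000
# Synthetic per-voxel parameters: T2, ADC, Ktrans (invented distributions).
X = rng.normal(size=(n_vox, 3))
# Synthetic histopathology labels: cancer more likely at low ADC, high Ktrans.
logit = -1.0 - 1.5 * X[:, 1] + 1.2 * X[:, 2]
y = rng.random(n_vox) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
cbs = model.predict_proba(X)[:, 1]       # composite biomarker score per voxel
print(f"Mean CBS in 'cancer' voxels:    {cbs[y].mean():.2f}")
print(f"Mean CBS in 'noncancer' voxels: {cbs[~y].mean():.2f}")
```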

  11. Development of quantitative atomic modeling for tungsten transport study Using LHD plasma with tungsten pellet injection

    International Nuclear Information System (INIS)

    Murakami, I.; Sakaue, H.A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2014-10-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from currentless plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) lines of W 24+ to W 33+ ions are very sensitive to electron temperature (Te) and useful for examining tungsten behavior in edge plasmas. Based on the first quantitative analysis of the measured spatial profile of the W 44+ ion, the tungsten concentration is determined to be n(W 44+ )/n e = 1.4x10 -4 and the total radiation loss is estimated as ∼4 MW, roughly half the total NBI power. (author)

  12. Quantitative T2 Combined with Texture Analysis of Nuclear Magnetic Resonance Images Identify Different Degrees of Muscle Involvement in Three Mouse Models of Muscle Dystrophy: mdx, Largemyd and mdx/Largemyd

    Science.gov (United States)

    Martins-Bach, Aurea B.; Malheiros, Jackeline; Matot, Béatrice; Martins, Poliana C. M.; Almeida, Camila F.; Caldeira, Waldir; Ribeiro, Alberto F.; Loureiro de Sousa, Paulo; Azzabou, Noura; Tannús, Alberto; Carlier, Pierre G.; Vainzof, Mariz

    2015-01-01

    Quantitative nuclear magnetic resonance imaging (MRI) has been considered a promising non-invasive tool for monitoring therapeutic essays in small size mouse models of muscular dystrophies. Here, we combined MRI (anatomical images and transverse relaxation time constant—T2—measurements) to texture analyses in the study of four mouse strains covering a wide range of dystrophic phenotypes. Two still unexplored mouse models of muscular dystrophies were analyzed: The severely affected Largemyd mouse and the recently generated and worst double mutant mdx/Largemyd mouse, as compared to the mildly affected mdx and normal mice. The results were compared to histopathological findings. MRI showed increased intermuscular fat and higher muscle T2 in the three dystrophic mouse models when compared to the wild-type mice (T2: mdx/Largemyd: 37.6±2.8 ms; mdx: 35.2±4.5 ms; Largemyd: 36.6±4.0 ms; wild-type: 29.1±1.8 ms, p<0.05), in addition to higher muscle T2 in the mdx/Largemyd mice when compared to mdx (p<0.05). The areas with increased muscle T2 in the MRI correlated spatially with the identified histopathological alterations such as necrosis, inflammation, degeneration and regeneration foci. Nevertheless, muscle T2 values were not correlated with the severity of the phenotype in the 3 dystrophic mouse strains, since the severely affected Largemyd showed similar values than both the mild mdx and worst mdx/Largemyd lineages. On the other hand, all studied mouse strains could be unambiguously identified with texture analysis, which reflected the observed differences in the distribution of signals in muscle MRI. Thus, combined T2 intensity maps and texture analysis is a powerful approach for the characterization and differentiation of dystrophic muscles with diverse genotypes and phenotypes. These new findings provide important noninvasive tools in the evaluation of the efficacy of new therapies, and most importantly, can be directly applied in human translational research

  13. Quantitative spectroscopic analysis of and distance to SN1999em

    Science.gov (United States)

    Dessart, L.; Hillier, D. J.

    2006-02-01

    Multi-epoch multi-wavelength spectroscopic observations of photospheric-phase type II supernovae (SN) provide information on massive-star progenitor properties, the core-collapse mechanism, and distances in the Universe. Following successes of recent endeavors (Dessart & Hillier 2005a, A&A, 437, 667; 2005b, A&A, 439, 671) with the non-LTE model atmosphere code CMFGEN (Hillier & Miller 1998, ApJ, 496, 407), we present a detailed quantitative spectroscopic analysis of the type II SN1999em and, using the Expanding Photosphere Method (EPM) or synthetic fits to observed spectra, à la Baron et al. (2004, ApJ, 616, 91), we estimate its distance. Selecting eight epochs, which cover the first 38 days after discovery, we obtain satisfactory fits to optical spectroscopic observations of SN1999em (including the UV and near-IR ranges when available). We use the same iron-group metal content for the ejecta, the same power-law density distribution (with exponent n = 10–12), and a Hubble-velocity law at all times. We adopt a H/He/C/N/O abundance pattern compatible with CNO-cycle equilibrium values for a RSG/BSG progenitor, with C/O enhanced and N depleted at later times. The overall evolution of the spectral energy distribution, whose peak shifts to longer wavelengths as time progresses, reflects the steady temperature/ionization-level decrease of the ejecta, associated non-linearly with a dramatic shift to ions with stronger line-blocking powers in the UV and optical (Fe ii, Ti ii). In the parameter space investigated, CMFGEN is very sensitive and provides photospheric temperatures and velocities, reddenings, and the H/He abundance ratio with an accuracy of ±500 K, ±10%, 0.05 and 50%, respectively. Following Leonard et al. (2002, PASP, 114, 35), and their use of correction factors from Hamuy et al. (2001, ApJ, 558, 615), we estimate an EPM distance to SN1999em that also falls 30% short of the Cepheid distance of 11.7 Mpc to its host galaxy NGC 1637 (Leonard et al. 2003, Ap
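
    For context, the Expanding Photosphere Method reduces to a linear fit: with angular size theta = R/D and R ≈ v(t − t0), one has t = t0 + D(theta/v), so the slope of t against theta/v gives the distance D. A toy sketch with invented epochs and velocities:

```python
import numpy as np

MPC_KM = 3.0857e19                      # kilometres per megaparsec
D_TRUE = 11.7                           # Mpc, used only to fake the data
T0 = 2.0                                # explosion epoch (days), invented

t = np.array([10.0, 17.0, 24.0, 31.0, 38.0])             # epochs (days)
v = np.array([11000.0, 9500.0, 8600.0, 8000.0, 7500.0])  # photospheric v (km/s)
v_day = v * 86400.0                                      # km/day
theta = v_day * (t - T0) / (D_TRUE * MPC_KM)             # synthetic angular sizes

# EPM: theta = v (t - t0) / D  =>  t = t0 + D * (theta / v).
# Fit a straight line of t against theta/v, with theta/v in days per Mpc.
x = theta / v_day * MPC_KM
D_fit, t0_fit = np.polyfit(x, t, 1)
print(f"Recovered distance: {D_fit:.1f} Mpc, explosion epoch t0 = {t0_fit:.1f} d")
```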

  14. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied, and the method and its limitations were compared with the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the 6+ to the 4+ valence state was examined. The experiments showed that decomposition is fast in alkaline solution: 30 minutes of boiling in alkaline solution, versus 40 minutes in acid solution, gave constant absorption. A method for analyzing samples containing less than 5 ppm tellurium was also studied. The experiments revealed that samples containing very small amounts of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into one-gram portions, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed a good addition effect and reproducibility within a relative error of 5%. Comparison of the calibration curve of a standard tellurium(IV) solution reacted with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. Results by the bismuthiol-2 method and by the atomic absorption method coincided quite well on the same sample. (Iwakiri, K.)

  15. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  16. Preoperative Prediction of Microvascular Invasion in Hepatocellular Carcinoma Using Quantitative Image Analysis.

    Science.gov (United States)

    Zheng, Jian; Chakraborty, Jayasree; Chapman, William C; Gerst, Scott; Gonen, Mithat; Pak, Linda M; Jarnagin, William R; DeMatteo, Ronald P; Do, Richard K G; Simpson, Amber L

    2017-12-01

    Microvascular invasion (MVI) is a significant risk factor for early recurrence after resection or transplantation for hepatocellular carcinoma (HCC). Knowledge of MVI status would help guide treatment recommendations, but MVI is generally identified only after operation. This study aims to predict MVI preoperatively using quantitative image analysis. One hundred and twenty patients from 2 institutions who underwent resection of HCC from 2003 to 2015 were included. The largest tumor from preoperative CT was subjected to quantitative image analysis, which uses an automated computer algorithm to capture regional variation in CT enhancement patterns. Quantitative imaging features by automatic analysis, qualitative radiographic descriptors by 2 radiologists, and preoperative clinical variables were included in multivariate analysis to predict histologic MVI. Histologic MVI was identified in 19 (37%) patients with tumors ≤5 cm and 34 (49%) patients with tumors >5 cm. Among patients with tumors ≤5 cm, none of the clinical findings or radiographic descriptors were associated with MVI; however, quantitative features based on the angle co-occurrence matrix predicted MVI with an area under the curve of 0.80, a positive predictive value of 63%, and a negative predictive value of 85%. In patients with tumors >5 cm, higher α-fetoprotein level, larger tumor size, and viral hepatitis history were associated with MVI, and radiographic descriptors were not. However, a multivariate model combining α-fetoprotein, tumor size, hepatitis status, and a quantitative feature based on the local binary pattern predicted MVI with an area under the curve of 0.88, a positive predictive value of 72%, and a negative predictive value of 96%. This study reveals the potential importance of quantitative image analysis as a predictor of MVI. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
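
    A hedged sketch of the local binary pattern feature mentioned above (the parameters are invented and the study's pipeline is more involved): compute a rotation-invariant uniform LBP histogram over a tumor region of interest, suitable as input to a classifier:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(roi, P=8, R=1.0):
    """Normalized histogram of uniform rotation-invariant LBP codes."""
    codes = local_binary_pattern(roi, P, R, method="uniform")  # codes 0..P+1
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(9)
roi = rng.integers(0, 256, size=(96, 96), dtype=np.uint8)  # toy CT tumor ROI
print(lbp_histogram(roi).round(3))
```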

  17. Quantitative Raman spectroscopy for the analysis of carrot bioactives.

    Science.gov (United States)

    Killeen, Daniel P; Sansom, Catherine E; Lill, Ross E; Eason, Jocelyn R; Gordon, Keith C; Perry, Nigel B

    2013-03-20

    Rapid quantitative near-infrared Fourier transform Raman analyses of the key phytonutrients in carrots, polyacetylenes and carotenoids, are reported here for the first time. Solvent extracts of 31 carrot lines were analyzed for these phytonutrients by conventional methods, polyacetylenes by GC-FID and carotenoids by visible spectrophotometry. Carotenoid concentrations were 0-5586 μg g(-1) dry weight (DW). Polyacetylene concentrations were 74-4846 μg g(-1) DW, highest in wild carrots. The polyacetylenes were falcarinol, 6-1237 μg g(-1) DW; falcarindiol, 42-3475 μg g(-1) DW; and falcarindiol 3-acetate, 27-649 μg g(-1) DW. Strong Raman bands for carotenoids gave good correlation to results by visible spectrophotometry. A chemometric model capable of quantitating carotenoids from Raman data was developed. A classification model for rapidly distinguishing carrots with high and low polyacetylene (limit of detection = 1400 μg g(-1)) concentrations based on Raman spectral intensity in the region of 2250 cm(-1) was produced.

  18. [Biological analysis of proteinuria in the laboratory: quantitative features].

    Science.gov (United States)

    Le Bricon, T

    2001-01-01

    Total protein analysis is one of the most frequently performed laboratory analyses of urine. A proteinuria above 150 mg/L is often discovered incidentally in preventive or school medicine (dipsticks) or during laboratory analysis (quantitative determination). Complete (quantitative, then qualitative) and repeated evaluation of proteinuria is of major interest for the clinician in establishing a diagnosis of abnormality and in the therapeutic follow-up of a nephropathy, uropathy or a non-renal disease (diabetes, multiple myeloma). The most frequent (90% of cases) and severe forms of proteinuria are of glomerular type, associated with the nephrotic syndrome, hypertension, and progressive renal failure. The biologist should pay attention to the pre-analytical phase (specimen collection, treatment, and storage), to clinical data, and to prescribed drugs that could interfere with protein analysis. During the past 10 years, significant analytical advances have been made: dipstick analysis has been dropped (false positives and, most importantly, false negatives), as have manual precipitation techniques with turbidimetric detection (poor inter-laboratory coefficients of variation, CV), and Coomassie blue has been replaced by pyrogallol red (improved practicability). Urinary quality control data reflect these positive changes, as demonstrated by a dramatic reduction in reported CVs. There is, however, still no reference method for total urinary protein determination, and the limits of existing pyrogallol red methods should be emphasized: variable reagent composition between manufacturers (such as the presence of an SDS additive), limited sensitivity, difficulty in the choice of a calibration material, underestimation of free light chains, and interference with gelatin-based vascular replacement fluids.

  19. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    2000-07-01

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard
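    The skill measures named above are simple to compute once modeled and observed time series are aligned; the sketch below (function and variable names assumed, not from the paper) shows the three statistics used.

    ```python
    import numpy as np

    def skill_metrics(obs, model):
        """Standard deviations, RMSE and correlation between observed
        and modeled F-region densities (finite pairs only)."""
        obs, model = np.asarray(obs, float), np.asarray(model, float)
        m = np.isfinite(obs) & np.isfinite(model)
        obs, model = obs[m], model[m]
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        r = np.corrcoef(obs, model)[0, 1]
        return obs.std(), model.std(), rmse, r
    ```

    Scoring a model this way, rather than by visual comparison, is what makes the assessment usable for operational space weather applications.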

  20. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
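    A continuous survival model of this type can be reproduced in outline with the lifelines library. The data frame below uses simulated values and invented column names, not the study's variables; only the 113/48 split mirrors the reported training/test sizes.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from lifelines.utils import concordance_index

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        'os_months': rng.exponential(24, 161),   # overall survival time
        'death': rng.integers(0, 2, 161),        # event indicator
        'ca19_9': rng.lognormal(4, 1, 161),      # preoperative CA19-9 stand-in
        'texture_1': rng.normal(size=161),       # a CT texture feature stand-in
    })
    train, test = df.iloc[:113], df.iloc[113:]

    cph = CoxPHFitter().fit(train, duration_col='os_months', event_col='death')
    risk = cph.predict_partial_hazard(test)
    # higher predicted risk should mean shorter survival, hence the sign flip
    print('c-index:', concordance_index(test['os_months'], -risk, test['death']))
    ```

    The c-index on the held-out set is the same validation statistic reported for models A and B.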

  1. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    Science.gov (United States)

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempted to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay, with rhubarb dispensing granules as the model drug for demonstration. An ultra-high performance liquid chromatography (UPLC) method was adopted for the simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches was determined with a compound diphenoxylate tablet-induced mouse constipation model; the blood-activating biopotency of different batches was determined with an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood-activating biopotencies. The multi-component simultaneous quantitative analysis showed great differences in chemical characterization, and certain differences in purgative and blood-activating biopotency, among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P < 0.05). The combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and reflect more comprehensively the overall quality differences among batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  2. Quantitative Analysis of Human Red Blood Cell Proteome.

    Science.gov (United States)

    Bryk, Agata H; Wiśniewski, Jacek R

    2017-08-04

    Red blood cells (RBCs) are the most abundant cell type in the human body. RBCs and, in particular, their plasma membrane composition have been extensively studied for many years. During the past decade proteomics studies have extended our knowledge of RBC composition; however, these studies did not provide quantitative insights. Here we report a large-scale proteomics investigation of RBCs and their "white ghost" membrane fraction. Samples were processed using the multienzyme digestion filter-aided sample preparation (MED-FASP) and analyzed using a Q Exactive mass spectrometer. Protein abundances were computed using the total protein approach (TPA). Validation of the data with stable isotope-labeled peptide-based protein quantification followed. Our in-depth analysis resulted in the identification of 2650 proteins, of which 1890 occurred at more than 100 copies per cell. We quantified 41 membrane transporter proteins spanning an abundance range of five orders of magnitude. Some of these, including the drug transporter ABCA7 and choline transporters SLC44A1 and SLC44A2, have not previously been identified in RBC membranes. Comparison of protein copy numbers assessed by proteomics showed a good correlation with literature data; however, the abundances of several proteins were not consistent with the classical references. Because we validated our findings by a targeted analysis using labeled standards, our data suggest that some older reference data from a variety of biochemical approaches are inaccurate. Our study provides the first "in-depth" quantitative analysis of the RBC proteome and will promote future studies of erythrocyte structure, functions, and disease.

  3. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5-10 mm² and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than for COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  4. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD) is a neurological illness caused by methylmercury poisoning through ingestion of contaminated seafood. We evaluated the MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings such as sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, the inferior and middle parts of the cerebellar vermis and the cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri correspond to three characteristic manifestations of this disease: constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  5. Quantitative Analysis of Airway Walls Using CT Software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Jung; Goo, Jin Mo; Lee, Kyung Won; Lee, Hyun Ju; Kim, Kwang Gi; Im, Jung Gi [Seoul National University, Seoul (Korea, Republic of)

    2008-12-15

    The purpose of this study was to develop dedicated software for quantitative analysis of the airways and to validate it using airway phantoms and excised swine lung. Validation was performed by comparing measurements acquired with the software against the actual values in the phantoms and excised lung. The accuracy of measurements obtained with different reconstruction methods (standard, lung, sharp) and spatial resolutions was compared using the airway phantoms. Repeatability was assessed with follow-up CT scans of the phantoms three months later. Airway dimension measurements obtained in the airway phantoms and excised swine lung showed good agreement with the actual values. Measurements were more accurate when the sharp reconstruction algorithm was used and when spatial resolution was improved using pixels smaller than the conventional size. There was good agreement between the initial airway measurements and those obtained three months later. We developed and validated dedicated software for quantitative airway measurement. Reconstruction with sharp algorithms and high-spatial-resolution images is recommended for obtaining airway measurements.

  6. Qualitative and quantitative analysis of volatile constituents from latrines.

    Science.gov (United States)

    Lin, Jianming; Aoll, Jackline; Niclass, Yvan; Velazco, Maria Inés; Wünsche, Laurent; Pika, Jana; Starkenmann, Christian

    2013-07-16

    More than 2.5 billion people defecate in the open. The increased commitment of private and public organizations to improving this situation is driving the research and development of new technologies for toilets and latrines. Although key technical aspects are considered by researchers when designing new technologies for developing countries, the basic problem of offensive malodors from human waste is often neglected. With the objective of contributing to technical solutions that are acceptable to global consumers, we investigated the chemical composition of latrine malodors sampled in Africa and India. Field latrines in four countries were evaluated by olfaction, and the odors were qualitatively and quantitatively characterized with three analytical techniques. Sulfur compounds including H2S, methyl mercaptan, and dimethyl mono-, di- and trisulfide are important in the sewage-like odors of pit latrines under anaerobic conditions. Under aerobic conditions, in Nairobi for example, paracresol and indole reached concentrations of 89 and 65 μg/g, respectively, which, along with short-chain fatty acids such as butyric acid (13 mg/g), explained the strong rancid, manure and farmyard odor. This work represents the first qualitative and quantitative study of volatile compounds sampled from seven pit latrines in a variety of geographic, technical, and economic contexts, in addition to three single stools from India and a pit latrine model system.

  7. Quantitative analysis and classification of AFM images of human hair.

    Science.gov (United States)

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, consisting of hair samples from (a) untreated and bleached hair samples, and (b) the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.

  8. Quantitative analysis of the reconstruction performance of interpolants

    Science.gov (United States)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
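    In one common frequency-domain formulation of this criterion (a sketch consistent with the description above, with unit sample spacing assumed, not necessarily the authors' exact notation), the mean square reconstruction error factors into a data term and an interpolant term:

    \[
    \varepsilon^{2} \;=\; \int_{-\infty}^{\infty} \lvert F(\nu)\rvert^{2}\, E(\nu)\, d\nu,
    \qquad
    E(\nu) \;=\; \lvert 1 - H(\nu)\rvert^{2} \;+\; \sum_{k \neq 0} \lvert H(\nu - k)\rvert^{2},
    \]

    where F is the spectrum of the sampled function and H is the frequency response of the interpolant. Because E(ν) depends only on the interpolation method, the criterion cleanly separates the two contributions, which is what allows interpolants to be ranked independently of any particular image.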

  9. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    Science.gov (United States)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations are however very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  10. Quantitative analysis of impact measurements using dynamic load cells

    Directory of Open Access Journals (Sweden)

    Brent J. Maranzano

    2016-03-01

    Full Text Available A mathematical model is used to estimate material properties from a short-duration transient impact force measured by dropping spheres onto rectangular coupons fixed to a dynamic load cell. The contact stress between the dynamic load cell surface and the projectile is modeled using Hertzian contact mechanics. Because the impact time is short relative to the load cell dynamics, an additional Kelvin–Voigt element is included in the model to account for the finite response time of the piezoelectric crystal. Calculations with and without the Kelvin–Voigt element are compared to experimental data collected from combinations of polymeric spheres and polymeric and metallic surfaces. The results illustrate that inclusion of the Kelvin–Voigt element qualitatively captures the post-impact resonance and non-linear behavior of the load cell signal and quantitatively improves the estimation of the Young's elastic modulus and Poisson's ratio. Mathematically, the additional KV element couples one additional differential equation to the Hertzian spring-dashpot equation. The model can be numerically integrated in seconds using standard numerical techniques, allowing its use as a rapid technique for the estimation of material properties. Keywords: Young's modulus, Poisson's ratio, Dynamic load cell
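    The coupled system described, a Hertzian contact decelerating the sphere plus one extra first-order equation for the Kelvin–Voigt element, integrates in milliseconds with a standard solver. The sketch below uses illustrative parameter values, not those of the paper, and treats the measured signal as the KV spring force.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative parameters only (not the paper's values)
    m, k_h = 5e-3, 1e7        # sphere mass [kg], Hertz stiffness [N m^-1.5]
    k_kv, c_kv = 1e8, 2e2     # Kelvin-Voigt spring [N/m], damper [N s/m]
    v0 = 1.0                  # impact velocity [m/s]

    def rhs(t, s):
        d, v, u = s                        # indentation, velocity, cell deflection
        f = k_h * max(d, 0.0) ** 1.5       # Hertzian contact force
        du = (f - k_kv * u) / c_kv         # first-order KV response of the cell
        return [v, -f / m, du]

    sol = solve_ivp(rhs, (0.0, 1e-3), [0.0, v0, 0.0], max_step=1e-6)
    signal = k_kv * sol.y[2]               # force reported by the load cell
    print(f"peak load-cell force: {signal.max():.2f} N")
    ```

    Fitting the simulated signal to the measured one by varying the Hertz stiffness is one route to the elastic constants, which is consistent with the rapid-estimation use described above.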

  11. Landslide quantitative risk analysis of buildings at the municipal scale based on a rainfall triggering scenario

    OpenAIRE

    Pereira, Susana; Garcia, Ricardo A. C.; Zêzere, José Luís; Oliveira, Sérgio Cruz; Silva, Márcio

    2016-01-01

    A landslide quantitative risk analysis is applied to the municipality of Santa Marta de Penaguião (northern Portugal) to evaluate the risk to which buildings are exposed, using a vector data model in GIS. Two landslide subgroups were considered: landslide subgroup 1 (an event inventory of landslides that occurred in January 2001); and landslide subgroup 2 (landslides inventoried after the 2001 event until 2010). Seven landslide predisposing factors were weighted and integrate...

  12. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, and they have strong regional characteristics. The spatial variability of geotechnical parameters is now widely recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is quantitatively analyzed here, and the correlation coefficients between geotechnical parameters are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object; the area contains 68 boreholes and 9 mechanical strata. The parameters are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compression modulus, internal friction angle, cohesion and the SP index. The correlation coefficients of the geotechnical parameters were calculated according to the principles of statistical correlation, and from these coefficients the behaviour of the geotechnical parameters was characterized.

  13. Sexual Scripts in Contemporary Mexican Cinema: A Quantitative Content Analysis.

    Science.gov (United States)

    Kollath-Cattano, Christy L; Mann, Emily S; Zegbe, Estephania Moreno; Thrasher, James F

    2018-03-01

    While the literature on sexual scripts is substantive, with some scholarship examining the role of popular media in the production of dominant and divergent sexual practices and interactions, limited attention has been paid to the contemporary Mexican context. In this article, we share findings from a quantitative content analysis of popular Mexican films in order to explore how sexual behavior is portrayed and more specifically how relationship characteristics, condom use, and substance use interact with representations of sexual behavior. We find that more sexually explicit portrayals featured people engaged in heterosexual sexual interactions outside the context of marriage and also in age discordant relationships, where one partner was a minor. Few films featured safer sex practices or substance use in concert with sexual behavior. This research sheds light on how film as a powerful agent of socialization communicates sexual scripts in contemporary Mexican culture that may contribute to risky sexual behaviors among Mexican youth.

  14. Comprehensive wellbore stability analysis utilizing Quantitative Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Moos, Daniel; Peska, Pavel; Finkbeiner, Thomas [GeoMechanics International, Palo Alto, CA 94303 (United States); Zoback, Mark [Stanford University, Stanford, CA 94305 (United States)

    2003-06-01

    A comprehensive geomechanical approach to wellbore stability requires knowledge of rock strength, pore pressure and the magnitude and orientation of the three principal stresses. These parameters are often uncertain, making confidence in deterministic predictions of the risks associated with instabilities during drilling and production difficult to assess. This paper demonstrates the use of Quantitative Risk Assessment (QRA) to formally account for the uncertainty in each input parameter to assess the probability of achieving a desired degree of wellbore stability at a given mud weight. We also utilize QRA to assess how the uncertainty in each parameter affects the mud weight calculated to maintain stability. In one case study, we illustrate how this approach allows us to compute optimal mud weight windows and casing set points at a deep-water site. In another case study, we demonstrate how to assess the feasibility of underbalanced drilling and open-hole completion of horizontal wells utilizing a comprehensive stability analysis that includes application of QRA.
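    The essence of QRA here is propagating input uncertainty to a probability of stability. A deliberately simplified Monte Carlo sketch (illustrative distributions, a bare Kirsch-type breakout criterion, and no thermal or poroelastic terms) shows the mechanics:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Uncertain inputs (normal distributions, illustrative values in MPa)
    sh_max = rng.normal(90, 5, n)    # max horizontal stress
    sh_min = rng.normal(60, 4, n)    # min horizontal stress
    p_pore = rng.normal(35, 3, n)    # pore pressure
    ucs    = rng.normal(80, 10, n)   # rock strength

    def p_stable(p_mud):
        """Fraction of realizations with no breakout at a given mud pressure,
        using a simplified Kirsch-type hoop-stress criterion."""
        hoop = 3 * sh_max - sh_min - p_mud - p_pore
        return np.mean(hoop <= ucs)

    for pw in (40, 50, 60):
        print(f"mud pressure {pw} MPa -> P(stable) = {p_stable(pw):.2f}")
    ```

    Sweeping the mud pressure and reading off the probability of stability is how mud weight windows and casing set points emerge from such an analysis; varying one input distribution at a time shows which uncertainty dominates.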

  15. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  16. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the most widely consumed beverages in the world, and the 'cafezinho' is normally prepared from a blend of roasted powder of two species, Coffea arabica and Coffea canephora. The two species differ in taste and in chemical composition, especially in caffeine content. Several procedures have been proposed in the literature for caffeine determination in samples such as soft drinks, coffee and medicines, but most of them require a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)

  17. [Quantitative Analysis of Power Doppler Images in Lateral Humeral Enthesopathy].

    Science.gov (United States)

    Walder, P; Paša, L; Pavliska, L

    2016-01-01

    PURPOSE OF THE STUDY To evaluate the efficiency of power Doppler sonography in the diagnosis of lateral humeral enthesopathy, the role of correct assessment of Doppler sonographic images by quantitative analysis, the statistical differences between a group of patients with lateral humeral enthesopathy and a control group of healthy subjects, and the diagnostic power of this test; in addition, to consider the relevance of each area of the lateral compartment for assessment and diagnosis of lateral humeral enthesopathy. MATERIAL AND METHODS A total of 41 subjects, aged 18 to 60 years, entered the study. Thirteen patients were diagnosed with lateral humeral enthesopathy on the basis of clinical tests and a positive reaction of the lateral humeral epicondyle to administration of a local anaesthetic. The control group consisted of 28 subjects without clinical signs of lateral humeral enthesopathy or subjective complaints. Power Doppler activity was evaluated in the whole region studied and in sub-regions involving the enthesis of the common extensor tendon and the periosteum of the lateral epicondyle with the area distal to it. The evaluation was based on calculating the overall surface with power Doppler activity using quantitative image analysis. Each patient was measured on three occasions and the median of the values obtained was used in the calculations. To assess the diagnostic power of the test, all values obtained from the whole power Doppler region measured were used, and the optimal dividing criterion at which the method had maximum sensitivity and specificity was determined. RESULTS The most evident, statistically significant difference between the patient and control groups was recorded in the whole "Range of Interest" (ROI) region (p = 1.34×10⁻⁶). A significant difference was also found in sub-regions corresponding chiefly to the tendon of the extensor carpi radialis brevis muscle and to the

  18. Physical aspects of quantitative particle analysis by X-ray fluorescence and electron microprobe techniques

    International Nuclear Information System (INIS)

    Markowicz, A.

    1986-01-01

    The aim of this work is to present both the physical fundamentals and recent advances in quantitative particle analysis by X-ray fluorescence (XRF) and electron microprobe (EPXMA) techniques. A method of correction for the particle-size effect in XRF analysis is described and theoretically evaluated. New atomic-number and absorption correction procedures for EPXMA of individual particles are proposed. The applicability of these two correction methods is evaluated for a wide range of elemental compositions, X-ray energies and sample thicknesses. A theoretical model for the composition and thickness dependence of the Bremsstrahlung background generated in multielement bulk specimens, thin films and particles is also presented and experimentally evaluated. Finally, the limitations and possible further improvements in quantitative particle analysis by XRF and EPXMA are discussed. 109 refs. (author)

  19. Quantitative Modelling of Trace Elements in Hard Coal.

    Science.gov (United States)

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades, and coal is expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method to modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal and coal ash components. The study focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The data comprised 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three models. The study is of both cognitive and applicative importance. It presents a unique application of chemometric methods of data exploration to modeling the content of trace elements in coal, and in this way contributes to the development of useful tools for coal quality assessment.

  20. [Feasibility of the extended application of near infrared universal quantitative models].

    Science.gov (United States)

    Lei, De-Qing; Hu, Chang-Qin; Feng, Yan-Chun; Feng, Fang

    2010-11-01

    Construction of a successful near-infrared analysis model is a complex task. It consumes considerable manpower and material resources, and is constrained by sample collection and model optimization. It is therefore important to study the extended application of existing near-infrared (NIR) models. In this paper, a cephradine capsules universal quantitative model was used as an example to study the feasibility of such extended application. Slope/bias correction and piecewise direct standardization were used to adapt the universal model to predict intermediates in the manufacturing process of cephradine capsules, such as the content of the powder blend or granules. The results showed that the corrected NIR universal quantitative model can be used for process control, although the results of model correction by slope/bias or piecewise direct standardization were not as good as those of model updating. They also indicated that the model corrected by slope/bias is better than that corrected by piecewise direct standardization. Model correction thus provides a new application for NIR universal models in process control.
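    Slope/bias correction itself is a one-line fit: regress reference values on the universal model's predictions for a small standardization set, then apply the fitted line to future predictions. A minimal sketch with made-up numbers:

    ```python
    import numpy as np

    def slope_bias_correct(y_pred_std, y_ref_std):
        """Fit slope/bias on a standardization set; return a corrector."""
        slope, bias = np.polyfit(y_pred_std, y_ref_std, 1)
        return lambda y: slope * np.asarray(y) + bias

    # made-up standardization data: NIR model predictions vs. reference assay
    correct = slope_bias_correct([95.1, 98.0, 101.2], [96.0, 99.2, 102.9])
    print(correct([97.5]))   # corrected prediction for a new sample
    ```

    Piecewise direct standardization instead maps each spectral window of the new instrument or matrix onto the calibration space, which is more flexible but needs more standardization samples.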

  1. Quantitative analysis of multiple components based on liquid chromatography with mass spectrometry in full scan mode.

    Science.gov (United States)

    Xu, Min Li; Li, Bao Qiong; Wang, Xue; Chen, Jing; Zhai, Hong Lin

    2016-08-01

    Although liquid chromatography with mass spectrometry in full scan mode can obtain all the signals simultaneously in a large range and low cost, it is rarely used in quantitative analysis due to several problems such as chromatographic drifts and peak overlap. In this paper, we propose a Tchebichef moment method for the simultaneous quantitative analysis of three active compounds in Qingrejiedu oral liquid based on three-dimensional spectra in full scan mode of liquid chromatography with mass spectrometry. After the Tchebichef moments were calculated directly from the spectra, the quantitative linear models for three active compounds were established by stepwise regression. All the correlation coefficients were more than 0.9978. The limits of detection and limits of quantitation were less than 0.11 and 0.49 μg/mL, respectively. The intra- and interday precisions were less than 6.54 and 9.47%, while the recovery ranged from 102.56 to 112.15%. Owing to the advantages of multi-resolution and inherent invariance properties, Tchebichef moments could provide favorable results even in the situation of peaks shifting and overlapping, unknown interferences and noise signals, so it could be applied to the analysis of three-dimensional spectra in full scan mode of liquid chromatography with mass spectrometry. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Quantitative TEM analysis of a hexagonal mesoporous silicate structure.

    Science.gov (United States)

    Hudson, S; Tanner, D A; Redington, W; Magner, E; Hodnett, K; Nakahara, S

    2006-08-07

    TEM analysis of mesoporous materials is generally undertaken to give qualitative results; accurate quantitative analysis is demonstrated in this study. A systematic image analysis of a powder form of a hexagonal mesoporous material known as KIT-6 was conducted using a transmission electron microscope (TEM). Three types of image contrast typically appear in this material: a hexagonal honeycomb structure, and wide and narrow parallel lines. The honeycomb face is used to characterise the material in terms of a conventional 2-D hexagonal structure, and the d-spacings for the (100) and (110) planes are measured experimentally under varying focus conditions. A tilting experiment was conducted to determine how the angle of tilt affects the line spacings and their visibility. Tilting has very little effect on the line spacing, whereas it limits the angular range over which both the wide and narrow lines are visible. The hexagonal lattice parameter determined by the TEM method is found to be approximately 7% lower than that calculated from low-angle X-ray diffraction. We conclude that TEM data can be used to determine the geometry and dimensions of hexagonal mesoporous silica materials, with a small error in the hexagonal lattice parameter.
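    The lattice parameter comparison rests on elementary 2-D hexagonal geometry. For a plane with indices (hk) the spacing is

    \[
    d_{hk} \;=\; \frac{a}{\sqrt{\tfrac{4}{3}\,\bigl(h^{2} + hk + k^{2}\bigr)}},
    \qquad\text{so}\qquad
    a \;=\; \frac{2\,d_{100}}{\sqrt{3}}, \quad d_{110} \;=\; \frac{a}{2},
    \]

    and a lattice parameter measured from the honeycomb images can be compared directly with the value obtained from the low-angle X-ray d(100) reflection.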

  3. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies, while avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually included in analyses of clinical practice variability. The objective was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed of drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from €0.97/inhabitant to €8.90/inhabitant and from €208.45/HPU to €1,471.95/HPU. The Pearson correlation between drug expenditure and population was 0.25, and that between expenditure and homogeneous production was 0.35 (p = 0.32 and p = 0.15, respectively); both coefficients confirm the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.

  4. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    Full Text Available Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years, a result of the publicly funded mass media campaigns that began in the early 1980s, mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
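    To make the approach concrete, the sketch below runs a four-state Markov cohort model with hypothetical yearly transition probabilities; the states and rates are invented for illustration and are not the paper's calibrated Australian values. Raising the undetected-to-detected rate is how increased surveillance would enter such a model.

    ```python
    import numpy as np

    # Hypothetical yearly transition matrix over four states:
    # 0 healthy, 1 undetected melanoma, 2 detected/treated, 3 dead.
    P = np.array([
        [0.9985, 0.0010, 0.0000, 0.0005],
        [0.0000, 0.9300, 0.0600, 0.0100],
        [0.0000, 0.0000, 0.9950, 0.0050],
        [0.0000, 0.0000, 0.0000, 1.0000],
    ])

    state = np.array([1.0, 0.0, 0.0, 0.0])   # start as a healthy cohort
    for year in range(30):
        state = state @ P                     # propagate one year
    print(dict(zip(['healthy', 'undetected', 'detected', 'dead'],
                   state.round(4))))
    ```

    Attaching costs to screening and treatment states then yields the cost-per-life-saved trade-offs the abstract describes.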

  5. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    problem was the basis for an inverse problem, i.e. revealing the depth of a body's occurrence and its location in space, as well as determining its physical properties. At the same time, the method has not received wide practical application, given the complexity of real geological media. Careful analysis of piezo- and seismoelectric anomalies shows that advanced methodologies developed in magnetic prospecting for complex physical-geological conditions can be applied to the quantitative analysis of these effects (Eppelbaum et al., 2000, 2001, 2010; Eppelbaum, 2010; 2011, 2015). Employment of these methodologies (improved modifications of the tangent, characteristic point and areal methods) for obtaining the quantitative characteristics of ore bodies, environmental features and archaeological targets (models of the horizontal circular cylinder, sphere, thin bed, thick bed and thin horizontal plate were utilized) has demonstrated their effectiveness. Case study at the archaeological site Tel Kara Hadid: field piezoelectric observations were conducted at the ancient archaeological site Tel Kara Hadid, with gold-quartz mineralization, in southern Israel within the Precambrian terrain at the northern extension of the Arabian-Nubian Shield (Neishtadt et al., 2006). The site lies eight kilometers north of the town of Eilat, in an area of strong industrial noise. Ancient alluvial river terraces (extremely heterogeneous at a local scale, varying from boulders to silt) cover the quartz veins and complicate their identification. Piezoelectric measurements conducted over a quartz vein covered by surface sediments (approximately 0.4 m thick) produced a sharp (500 μV) piezoelectric anomaly. Values recorded over the host rocks (clays and shales of basic composition) were close to zero. The observed piezoelectric anomaly was successfully interpreted using the methodologies developed in magnetic prospecting. For effective integration of piezo- and

  6. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of the educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  7. Fumonisin B1 Toxicity in Grower-Finisher Pigs: A Comparative Analysis of Genetically Engineered Bt Corn and non-Bt Corn by Using Quantitative Dietary Exposure Assessment Modeling

    OpenAIRE

    Delgado, James E.; Wolt, Jeffrey D.

    2011-01-01

    In this study, we investigate the long-term exposure (20 weeks) to fumonisin B1 (FB1) in grower-finisher pigs by conducting a quantitative exposure assessment (QEA). Our analytical approach involved both deterministic and semi-stochastic modeling for dietary comparative analyses of FB1 exposures originating from genetically engineered Bacillus thuringiensis (Bt)-corn, conventional non-Bt corn and distiller’s dried grains with solubles (DDGS) derived from Bt and/or non-Bt corn. Results from bo...

  8. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  9. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in the DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  10. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Science.gov (United States)

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
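    The reported gain in discrimination corresponds to fitting a classifier on methylation indices plus HPV and age and comparing areas under the ROC curve. A stand-in sketch with simulated data follows; the feature values are random, and only the 49/22 case-control split matches the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    y = np.r_[np.ones(49), np.zeros(22)]          # 49 cases, 22 controls
    # stand-ins for methylation indices of DAPK1, SLIT2, WIF1, RARB + HPV, age
    X = rng.normal(size=(71, 6)) + y[:, None] * 0.8

    clf = LogisticRegression().fit(X, y)
    auc = roc_auc_score(y, clf.predict_proba(X)[:, 1])
    print(f"in-sample AUC = {auc:.2f}")
    ```

    With real data, comparing this AUC against a model restricted to HPV and age alone is the test of whether the methylation panel adds specificity.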

  11. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    Science.gov (United States)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity, and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range of about -7.5, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range of about -7.8, and stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2) the use of a narrow frequency band for acquisition, (3) the inability to identify P and S waves given the small

  12. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing proportions of missingness. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
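    The AFT approach for left-censored intensities can be sketched in Python with lifelines rather than R; the sketch below uses simulated intensities, an assumed detection limit, and a single two-level condition as the covariate.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import WeibullAFTFitter

    rng = np.random.default_rng(7)
    n = 200
    group = rng.integers(0, 2, n)                      # two biological conditions
    intensity = rng.lognormal(10 + 0.5 * group, 1, n)  # true peak intensities
    lod = np.quantile(intensity, 0.2)                  # assumed detection limit
    observed = intensity > lod                         # below LOD => left-censored
    df = pd.DataFrame({'intensity': np.maximum(intensity, lod),
                       'observed': observed.astype(int),
                       'group': group})

    aft = WeibullAFTFitter()
    aft.fit_left_censoring(df, duration_col='intensity', event_col='observed')
    print(aft.summary.loc[('lambda_', 'group')])       # effect of condition
    ```

    A significant coefficient on the condition covariate plays the role of a differential-expression call, with censoring handled in the likelihood rather than by imputation.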

  13. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage, or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild change, separated by shorter periods of more rapid alteration. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear, but exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.
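
    A minimal sketch of the correlation step described above, with random stand-ins for the image content descriptors (the paper's actual low-level feature set is not reproduced here):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_images, n_features = 463, 200
age = rng.uniform(20, 90, n_images)
features = rng.normal(size=(n_images, n_features))
features[:, 0] += 0.05 * age        # make one feature weakly age-dependent

# correlate each descriptor with chronological age, rank by |r|
r = np.array([pearsonr(features[:, j], age)[0] for j in range(n_features)])
ranking = np.argsort(-np.abs(r))    # most age-informative descriptors first
print(ranking[:5], r[ranking[:5]])
```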

  14. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopic analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
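
    The branching index lends itself to a compact sketch. Assuming a binary vessel mask as input, one plausible implementation skeletonizes the mask and counts skeleton pixels with three or more neighbours; this approximates the metric described above and is not AngioTool's actual code:

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize

mask = np.zeros((200, 200), dtype=bool)
mask[100, 20:180] = True            # a horizontal vessel ...
mask[40:160, 100] = True            # ... crossed by a vertical one
mask = ndimage.binary_dilation(mask, iterations=3)

skel = skeletonize(mask)
# count neighbours of each skeleton pixel (convolution counts self, subtract it)
neighbours = ndimage.convolve(skel.astype(int), np.ones((3, 3), int),
                              mode="constant") - skel
branch_points = skel & (neighbours >= 3)
branching_index = branch_points.sum() / mask.size   # branch points per unit area
print(branch_points.sum(), branching_index)
```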

  15. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In a previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of 99mTc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute over 60 frames of the sequential image. Sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, i.e. (liver counts)/(total counts of the field). Our analysis method automatically recorded the disappearance and uptake curves on the basis of the heart and whole-liver counts, respectively, and computed the results in BASIC. This method makes it possible to obtain an image of the initial uptake of 99mTc-Sn-colloid into the liver with a small dose. (author)
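
    The arithmetic described above is simple enough to sketch. The following is an assumed reconstruction (the original ran in BASIC), with invented counts:

```python
import numpy as np

t = np.arange(0, 5.0, 0.25)                    # minutes, quarter-minute frames
heart = 1000 * np.exp(-0.35 * t) + np.random.default_rng(2).normal(0, 5, t.size)
liver_counts, field_counts = 6.5e5, 1.0e6      # illustrative totals

# disappearance rate from a log-linear fit to the heart (blood) curve
k = -np.polyfit(t, np.log(heart), 1)[0]        # 1/min
uptake_fraction = liver_counts / field_counts  # (liver counts)/(total counts)
effective_hepatic_blood_flow = k * uptake_fraction
print(k, effective_hepatic_blood_flow)
```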

  16. Highly Multiplexed Quantitative Mass Spectrometry Analysis of Ubiquitylomes.

    Science.gov (United States)

    Rose, Christopher M; Isasa, Marta; Ordureau, Alban; Prado, Miguel A; Beausoleil, Sean A; Jedrychowski, Mark P; Finley, Daniel J; Harper, J Wade; Gygi, Steven P

    2016-10-26

    System-wide quantitative analysis of ubiquitylomes has proven to be a valuable tool for elucidating targets and mechanisms of the ubiquitin-driven signaling systems, as well as gaining insights into neurodegenerative diseases and cancer. Current mass spectrometry methods for ubiquitylome detection require large amounts of starting material and rely on stochastic data collection to increase replicate analyses. We describe a method compatible with cell line and tissue samples for large-scale quantification of 5,000-9,000 ubiquitylation forms across ten samples simultaneously. Using this method, we reveal site-specific ubiquitylation in mammalian brain and liver tissues, as well as in cancer cells undergoing proteasome inhibition. To demonstrate the power of the approach for signal-dependent ubiquitylation, we examined protein and ubiquitylation dynamics for mitochondria undergoing PARKIN- and PINK1-dependent mitophagy. This analysis revealed the largest collection of PARKIN- and PINK1-dependent ubiquitylation targets to date in a single experiment, and it also revealed a subset of proteins recruited to the mitochondria during mitophagy. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  18. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominantly on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
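
    For two predictors, the commonality partition can be written in a few lines. The sketch below uses simulated data; `unique1`, `unique2` and `common` are the standard decomposition terms, not code from the article:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 300
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)            # correlated predictors
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=n)

def r2(X):
    return LinearRegression().fit(X, y).score(X, y)

r2_full, r2_1, r2_2 = r2(np.c_[x1, x2]), r2(x1[:, None]), r2(x2[:, None])
unique1 = r2_full - r2_2                      # variance unique to x1
unique2 = r2_full - r2_1                      # variance unique to x2
common = r2_1 + r2_2 - r2_full                # shared (common) variance
print(unique1, unique2, common)
```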

  19. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    Science.gov (United States)

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images intended for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was constructed. Two kinds of quantitative methods, based on geometric features and morphological features, were then proposed. The paper puts forward a retinal abnormality grading decision-making method, which was used in the analysis and evaluation of multiple OCT images; the detailed analysis process is illustrated with four retinal OCT images of differing degrees of abnormality. The final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In simulations on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status: it obtains parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, enables abnormality judgment of the target image and provides a reference for disease diagnosis.

  20. Variable selection in near infrared spectroscopy for quantitative models of homologous analogs of cephalosporins

    Directory of Open Access Journals (Sweden)

    Yan-Chun Feng

    2014-07-01

    Full Text Available Two universal spectral ranges (4550–4100 cm-1 and 6190–5510 cm-1) for the construction of quantitative models of homologous analogs of cephalosporins were proposed by evaluating the performance of five spectral ranges and their combinations, using three data sets of cephalosporins for injection, i.e., cefuroxime sodium, ceftriaxone sodium and cefoperazone sodium. Subsequently, the proposed ranges were validated using eight calibration sets of other homologous analogs of cephalosporins for injection, namely cefmenoxime hydrochloride, ceftezole sodium, cefmetazole, cefoxitin sodium, cefotaxime sodium, cefradine, cephazolin sodium and ceftizoxime sodium. All the quantitative models constructed for the eight kinds of cephalosporins using these universal ranges could fulfill the requirements for quick quantification. After that, the competitive adaptive reweighted sampling (CARS) algorithm and infrared (IR)–near infrared (NIR) two-dimensional (2D) correlation spectral analysis were used to determine the scientific basis of these two spectral ranges as universal regions for the construction of quantitative models of cephalosporins. The CARS algorithm demonstrated that the ranges of 4550–4100 cm-1 and 6190–5510 cm-1 include key wavenumbers that can be attributed to content changes of cephalosporins. The IR–NIR 2D spectral analysis showed that certain wavenumbers in these two regions have strong correlations to the structures of those cephalosporins that are easy to degrade.
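
    The modelling step lends itself to a short sketch: restrict the spectra to the two proposed ranges and fit a PLS regression. The spectra, reference values and wavenumber grid below are simulated stand-ins:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
wn = np.arange(4000, 10000, 4.0)               # wavenumber grid, cm-1 (assumed)
X = rng.normal(size=(60, wn.size))             # absorbance spectra
y = rng.uniform(90, 110, 60)                   # reference content (%)

# keep only the two proposed universal ranges
keep = ((wn >= 4100) & (wn <= 4550)) | ((wn >= 5510) & (wn <= 6190))
pls = PLSRegression(n_components=5).fit(X[:, keep], y)
print(pls.score(X[:, keep], y))                # calibration R^2
```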

  1. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
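
    A hedged sketch of the two-level pattern described above (illustrative only, not CaPTk code): level one extracts a feature panel from each image, level two feeds the panel to a multivariate classifier:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(13)

def extract_features(image):
    # stand-in for intensity/texture/shape panels
    return np.array([image.mean(), image.std(),
                     np.percentile(image, 90), (image > image.mean()).mean()])

images = rng.normal(size=(40, 64, 64)) + rng.uniform(0, 1, (40, 1, 1))
X = np.array([extract_features(im) for im in images])
y = rng.integers(0, 2, 40)          # e.g. responder / non-responder labels

clf = make_pipeline(StandardScaler(), SVC(probability=True)).fit(X, y)
print(clf.predict_proba(X[:3]))     # predictive "signature" output
```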

  2. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Wu

    Full Text Available ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  3. From classical genetics to quantitative genetics to systems biology: modeling epistasis.

    Directory of Open Access Journals (Sweden)

    David L Aylor

    2008-03-01

    Full Text Available Gene expression data has been used in lieu of phenotype in both classical and quantitative genetic settings. These two disciplines have separate approaches to measuring and interpreting epistasis, which is the interaction between alleles at different loci. We propose a framework for estimating and interpreting epistasis from a classical experiment that combines the strengths of each approach. A regression analysis step accommodates the quantitative nature of expression measurements by estimating the effect of gene deletions plus any interaction. Effects are selected by significance such that a reduced model describes each expression trait. We show how the resulting models correspond to specific hierarchical relationships between two regulator genes and a target gene. These relationships are the basic units of genetic pathways and genomic system diagrams. Our approach can be extended to analyze data from a variety of experiments, multiple loci, and multiple environments.
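
    The regression step maps naturally onto an ordinary least-squares model with an interaction term; a significant interaction coefficient is the quantitative signature of epistasis. The data below are simulated:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
g1 = rng.integers(0, 2, 200)        # deletion of regulator 1 (0/1)
g2 = rng.integers(0, 2, 200)        # deletion of regulator 2 (0/1)
expr = 5 - 1.2 * g1 - 0.8 * g2 + 1.5 * g1 * g2 + rng.normal(0, 0.5, 200)
df = pd.DataFrame({"expr": expr, "g1": g1, "g2": g2})

model = smf.ols("expr ~ g1 + g2 + g1:g2", data=df).fit()
print(model.params)                 # main effects and interaction
print(model.pvalues["g1:g2"])       # test for epistasis
```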

  4. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten people's health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used here for the quantitative determination of AO, combined with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz time-domain spectroscopy (THz-TDS) in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for selecting the input interval, and improved the accuracy of the detection result. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an effective quantitative analysis method.
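
    The 2DCOS ingredient can be sketched directly: the synchronous 2D correlation spectrum is the covariance of mean-centred spectra across the concentration series, and its strong peaks suggest which bands to feed into the PLSR model. The AO band position below is invented:

```python
import numpy as np

rng = np.random.default_rng(6)
freqs = np.linspace(0.2, 1.6, 141)             # THz axis
conc = np.linspace(0.0, 1.0, 11)               # AO concentration series
peak = np.exp(-(freqs - 1.0) ** 2 / 0.005)     # hypothetical AO band at 1 THz
spectra = conc[:, None] * peak + rng.normal(0, 0.01, (conc.size, freqs.size))

centred = spectra - spectra.mean(axis=0)
sync = centred.T @ centred / (conc.size - 1)   # synchronous 2D spectrum
i, j = np.unravel_index(np.abs(sync).argmax(), sync.shape)
print(freqs[i], freqs[j])                      # strongest correlated band pair
```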

  5. An efficient approach to the quantitative analysis of humic acid in water.

    Science.gov (United States)

    Wang, Xue; Li, Bao Qiong; Zhai, Hong Lin; Xiong, Meng Yi; Liu, Ying

    2016-01-01

    Rayleigh and Raman scattering inevitably appear in fluorescence measurements, making quantitative analysis more difficult, especially when target signals and scattering signals overlap. Based on grayscale images of three-dimensional fluorescence spectra, a linear model with two selected Zernike moments was established for the determination of humic acid and applied to the quantitative analysis of a real sample taken from the Yellow River. The correlation coefficient (R²) and leave-one-out cross-validation correlation coefficient (R²cv) were up to 0.9994 and 0.9987, respectively, and the average recovery reached 96.28%. Compared with N-way partial least squares and alternating trilinear decomposition methods, our approach was immune to the scattering and noise signals owing to its powerful multi-resolution characteristic, and the obtained results were more reliable and accurate; the approach could also be applied in food analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Qualitative and quantitative analysis of atmospheric organosulfates in Centreville, Alabama

    Science.gov (United States)

    Hettiyadura, Anusha P. S.; Jayarathne, Thilina; Baumann, Karsten; Goldstein, Allen H.; de Gouw, Joost A.; Koss, Abigail; Keutsch, Frank N.; Skog, Kate; Stone, Elizabeth A.

    2017-01-01

    Organosulfates are components of secondary organic aerosol (SOA) that form from the oxidation of volatile organic compounds (VOCs) in the presence of sulfate. In this study, the composition and abundance of organosulfates were determined in fine particulate matter (PM2.5) collected at Centreville, AL, during the Southern Oxidant and Aerosol Study (SOAS) in summer 2013. Six organosulfates were quantified using hydrophilic interaction liquid chromatography (HILIC) with triple quadrupole mass spectrometry (TQD) against authentic standards. Among these, the three most abundant species were glycolic acid sulfate (0.5-52.5 ng m-3), lactic acid sulfate (0.5-36.7 ng m-3), and hydroxyacetone sulfate (0.5-14.3 ng m-3). These three species were strongly inter-correlated, suggesting similar precursors and/or formation pathways. Further correlations with sulfate, isoprene, and isoprene oxidation products indicate important roles for these precursors in organosulfate formation in Centreville. Positive filter sampling artifacts associated with these organosulfates, due to gas adsorption or reaction of gas-phase precursors with sulfuric acid, were assessed for a subset of samples and were less than 7.8% of their PM2.5 concentrations. Together, the quantified organosulfates accounted for only a small portion of the organosulfates present. Semi-quantitative analysis was employed by monitoring characteristic product ions of organosulfates (HSO4− at m/z 97 and SO4−• at m/z 96) and evaluating relative signal strength by HILIC-TQD. Molecular formulas of organosulfates were determined by high-resolution time-of-flight (TOF) mass spectrometry. The major organosulfate signal across all samples corresponded to 2-methyltetrol sulfates, which accounted for 42-62% of the total bisulfate ion signal. Conversely, glycolic acid sulfate, the most abundant organosulfate quantified in this study, was 0.13-0.57% of the total bisulfate ion signal. Precursors of m/z 96 mainly consisted of nitro-oxy organosulfates. Organosulfates

  7. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available Although the term collocation is rather common in English grammar, it is not a well-known or commonly used term in textbooks and scientific papers written in the Serbian language. Collocating is usually defined as the natural co-occurrence of two (or more) words, usually adjacent although they can be separated in the text, while collocations are defined as words with natural semantic and/or syntactic relations joined together in a sentence. Collocations occur naturally in all English texts, including scientific texts and papers. Using two English for Specific Purposes (ESP) textbooks for intermediate students' courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between lexical and grammatical collocations in the ESP texts and the reasons for their presence, and gives an overview of the most used subtypes of lexical collocations. Furthermore, applying basic quantitative corpus analysis, the paper presents the number of open, restricted and bound collocations in the ESP texts, drawing conclusions on their frequency and hence on the modes for their learning. A section is also devoted to the number and usage of scientific collocations, both common scientific and narrow-professional ones. The conclusion is that the number of collocations present in the two selected textbooks calls for further analysis of these lexical connections, as well as for new modes of teaching and presenting them to students learning English.

  8. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for the application of nanotechnology to the automobile sector across the world, using keyword-based patent search, and to provide an overview of the patents related to nanotechnology in the automobile industry. The work starts from a worldwide patent search to find patents on nanotechnology in the automobile industry, and classifies them according to the automobile parts to which they relate and the solutions they provide. Various graphs are then produced to give insight into trends, followed by an analysis of the patents within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to patentability. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, together with a suggested strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification; it was deduced that the patent solves the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two categories, reduction in friction and engine, were created. Similarly, after studying all the patents, a full matrix was created.

  9. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    Science.gov (United States)

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination on Twitter. The results highlight the higher impact of stories contributed by news organizations, compared to direct tweets by health organizations, in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. The analysis also revealed the relationship between community engagement on Twitter and state policies regarding child vaccination: residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  10. Effects of Noninhibitory Serpin Maspin on the Actin Cytoskeleton: A Quantitative Image Modeling Approach.

    Science.gov (United States)

    Al-Mamun, Mohammed; Ravenhill, Lorna; Srisukkham, Worawut; Hossain, Alamgir; Fall, Charles; Ellis, Vincent; Bass, Rosemary

    2016-04-01

    Recent developments in quantitative image analysis allow us to interrogate confocal microscopy images to answer biological questions. Clumped and layered cell nuclei and cytoplasm in confocal images challenge the ability to identify subcellular compartments, and to date there is no perfect image analysis method for identifying cytoskeletal changes in confocal images. Here, we present a multidisciplinary study in which an image analysis model was developed to allow quantitative measurement of changes in the cytoskeleton of cells with different maspin exposure. Maspin, a noninhibitory serpin, influences cell migration, adhesion, invasion, proliferation, and apoptosis in ways that are consistent with its identification as a tumor metastasis suppressor. Using different cell types, we tested the hypothesis that the reduction in cell migration caused by maspin would be reflected in the architecture of the actin cytoskeleton. A hybrid marker-controlled watershed segmentation technique was used to segment the nuclei, cytoplasm, and ruffling regions before measuring cytoskeletal changes. This was informed by immunohistochemical staining of cells transfected stably or transiently with maspin proteins, or with added bioactive peptides or protein. The image analysis results showed that the effects of maspin were mirrored by effects on cell architecture, in a way that could be described quantitatively.
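
    A rough sketch of marker-controlled watershed segmentation of nuclei, the kind of step used here before measuring cytoskeletal regions (synthetic image; not the authors' hybrid method):

```python
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

rng = np.random.default_rng(7)
yy, xx = np.mgrid[0:128, 0:128]
img = np.zeros((128, 128))
for cy, cx in [(40, 40), (52, 60), (90, 90)]:  # three overlapping "nuclei"
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 150)
img += rng.normal(0, 0.02, img.shape)

mask = img > threshold_otsu(img)
dist = ndimage.distance_transform_edt(mask)
peaks = peak_local_max(dist, labels=mask, min_distance=10)
markers = np.zeros_like(img, dtype=int)
markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
labels = watershed(-dist, markers, mask=mask)  # one label per nucleus
print(labels.max(), "objects")
```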

  11. Software system for X-ray diffraction quantitative phase analysis

    International Nuclear Information System (INIS)

    Fuentes, L.; Herrera, V.; Rubio, E.

    1989-01-01

    A system of experimental methods and computer programs for X-ray diffraction analysis is presented, covering the most important cases occurring in practice. The program MARIA for spectral analysis is described. The external standard method for powder analysis is presented: the program STANDEXT calculates the sample absorption coefficient, the concentrations and the standard deviations by a least squares method. For the case of partly identified samples, the internal standard method is developed; all measured peaks are considered in the calculations, and the program STANDINT solves for the concentrations of the identified phases, their errors and the sample's absorption coefficient. A modification is introduced in the so-called direct method for bulk sample analysis, in which the effect of texture is characterized by a model representation of the inverse pole figure associated with the sample's diffracting surface. The program DIREC is proposed for fitting texture-modulated theoretical diffraction patterns to experimental ones, thus calculating phase concentrations and corresponding errors. Examples of applications are given.

  12. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes

    NARCIS (Netherlands)

    Xu, L.F.; Henke, M.; Zhu, J.; Kurth, W.; Buck-Sorlin, G.H.

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a

  13. Quantitative trait locus (QTL) analysis of percentage grains ...

    African Journals Online (AJOL)

    user

    2011-03-28

    Chalkiness is a major concern in rice (Oryza sativa L.) breeding because it is one of the key factors determining quality and price; it is a complicated quantitative trait controlled by maternal, endosperm and cytoplasmic effects. In this study, we conducted grain chalkiness percentage quantitative ...

  14. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers, in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, as well as more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. Six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g with methylene chloride, considering that in its extracting ability this solvent is close to liquefied difluorochloromethane (freon R22), which we used for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, the quantity of lipophilic extractives then being determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP7, 2.8.12, with a sample weight of 200 g, a distillation rate of 2.5-3.5 ml/min, 500 ml of distillation liquid (water), and 0.50 ml of xylene in the graduated tube. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  15. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

    Full Text Available Currently, the intensity of enterprise competition has increased as a result of greater diversity in customer needs and the persistence of a long-term recession. The results of competition are becoming severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives for achieving operational excellence, and its importance has become ever more apparent. However, few companies manage their supply chains effectively, and the greatest difficulty lies in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods for increasing supply chain visibility remain ambiguous. Building on the existing research on supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score from Six Sigma methodology to evaluate and improve the level of supply chain visibility.
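
    The process-capability idea reduces to a short computation: treat a visibility metric as a process and express capability as a Z score against a specification limit. The metric and numbers below are invented:

```python
import numpy as np

# e.g. hours between a supply chain event and its visibility to the planner
latency = np.random.default_rng(8).normal(4.0, 1.5, 1000)
usl = 8.0                                 # upper specification limit (hours)
z_score = (usl - latency.mean()) / latency.std(ddof=1)
print(f"Z = {z_score:.2f}")               # higher Z -> more capable process
```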

  16. Photographers’ Nomenclature Units: A Structural and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Margarita A. Mihailova

    2017-11-01

    Full Text Available Addressing the needs of cross- and intercultural communication as well as the methodology of contrastive research, the paper presents the results of a complex analysis conducted to describe the semantic and pragmatic parameters of nomenclature units denoting photography equipment in the modern Russian informal discourse of professional photographers. The research draws on 34 original nomenclature units and their 34 Russian equivalents used in 6871 comments posted on the “Клуб.Foto.ru” website in 2015. The structural and quantitative analyses of photographers' nomenclature demonstrate the users' morphological and graphic preferences and indirectly reflect their social and professional values. The corpus-based approach developed by Kast-Aigner (2009: 141) was applied with the aim of identifying the nomenclature units denoting photography equipment and of validating and elaborating the data of the existing corpus. The research also throws light on problems of professional language development and derivational processes. A perspective for further study lies in researching the broader context of professional nomenclature.

  17. Quantitative produced water analysis using mobile 1H NMR

    Science.gov (United States)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided comparable 1H NMR signal intensities for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.

  18. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work, the quantitative risk analysis for the external public of the Cabiunas–REDUC pipeline (GASDUC III), 180 km long and linking the municipalities of Macae and Duque de Caxias, RJ, was performed by PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), and transports natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, yielding an individual risk contour with frequencies of 1×10⁻⁶ per year involving sensitive occupations, and therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas; the respective FN-curves were situated below the advised limit established by INEA, except for two areas that required additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN-curve should lie below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating these mitigating measures, the results fell below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)

  19. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
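
    The log-space correction lends itself to a small sketch. Assuming a digitized mammogram and a constant-attenuation phantom image sharing the same field nonuniformity, adding the normalized phantom image in log space cancels the field term:

```python
import numpy as np

rng = np.random.default_rng(9)
yy, xx = np.mgrid[0:256, 0:256]
field = 1.0 - 3e-4 * yy                    # smooth nonuniformity (heel-like)
tissue = 0.5 + 0.1 * rng.random((256, 256))
mammogram = tissue * field
phantom = 0.7 * field                      # constant-attenuation phantom image

# normalized phantom image added in log space removes the shared field term
log_corr = np.log(phantom.mean()) - np.log(phantom)
corrected = np.exp(np.log(mammogram) + log_corr)
print(np.ptp(mammogram.mean(axis=1)), np.ptp(corrected.mean(axis=1)))
```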

  20. Quantitative analysis of time-resolved microwave conductivity data

    Science.gov (United States)

    Reid, Obadiah G.; Moore, David T.; Li, Zhen; Zhao, Dewei; Yan, Yanfa; Zhu, Kai; Rumbles, Garry

    2017-12-01

    Flash-photolysis time-resolved microwave conductivity (fp-TRMC) is a versatile, highly sensitive technique for studying the complex photoconductivity of solution, solid, and gas-phase samples. The purpose of this paper is to provide a standard reference work for experimentalists interested in using microwave conductivity methods to study functional electronic materials, describing how to conduct and calibrate these experiments in order to obtain quantitative results. The main focus of the paper is on calculating the calibration factor, K, which is used to connect the measured change in microwave power absorption to the conductance of the sample. We describe the standard analytical formulae that have been used in the past, and compare them to numerical simulations. This comparison shows that the most widely used analytical analysis of fp-TRMC data systematically under-estimates the transient conductivity by ~60%. We suggest a more accurate semi-empirical way of calibrating these experiments. However, we emphasize that the full numerical calculation is necessary to quantify both transient and steady-state conductance for arbitrary sample properties and geometry.
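
    The calibration relation described above can be applied in one line: the fractional change in absorbed microwave power relates to the photoinduced conductance change via ΔP/P = −K·ΔG, so conductance follows from the measured power transient once K is known. The values below are illustrative:

```python
K = 2.0e4            # calibration factor (ohm), assumed for illustration
dP_over_P = -1.0e-4  # measured fractional power change at the transient peak
delta_G = -dP_over_P / K
print(delta_G)       # photoconductance change, in siemens
```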

  1. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    Science.gov (United States)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data can potentially be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences between the reflectance properties of cancerous and normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  2. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method of reducing this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. Consequently, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules, finding that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
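
    The resistive-loss argument can be checked with back-of-envelope arithmetic: halving a cell halves its current, so the loss per ribbon drops by a factor of four, and even with twice as many interconnections the net ribbon loss halves. The numbers are illustrative:

```python
I_full, R = 9.0, 0.005        # illustrative cell current (A), ribbon R (ohm)
loss_full = I_full ** 2 * R               # per interconnection, full cells
loss_half = 2 * (I_full / 2) ** 2 * R     # twice the connections, half current
print(loss_full, loss_half, loss_half / loss_full)   # ratio -> 0.5
```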

  3. Quantitative analysis of histopathological findings using image processing software.

    Science.gov (United States)

    Horai, Yasushi; Kakimoto, Tetsuhiro; Takemoto, Kana; Tanaka, Masaharu

    2017-10-01

    In evaluating pathological changes in drug efficacy and toxicity studies, morphometric analysis can be quite robust. In this experiment, we examined whether morphometric changes of major pathological findings in various tissue specimens stained with hematoxylin and eosin could be recognized and quantified using image processing software. Using Tissue Studio, hypertrophy of hepatocytes and adrenocortical cells could be quantified based on the method of a previous report, but the regions of red pulp, white pulp, and marginal zones in the spleen could not be recognized under a single setting condition. Using Image-Pro Plus, lipid-derived vacuoles in the liver and mucin-derived vacuoles in the intestinal mucosa could be quantified using two criteria (area and/or roundness). Vacuoles derived from phospholipid could not be quantified when small lipid deposits coexisted in the liver and adrenal cortex. Mononuclear inflammatory cell infiltration in the liver could be quantified to some extent, except in specimens with many clustered infiltrating cells. Adipocyte size and the mean linear intercept could be quantified easily and efficiently using the morphological processing and macro tools provided in Image-Pro Plus. These methodologies are expected to form the basis of a system that can recognize morphometric features and quantitatively analyze pathological findings through the use of information technology.

  4. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to the movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method in which an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets; it can therefore be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects the association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method against which it was compared. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies of fixed cells and performs favorably in association studies of live cells.
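
    One way to realize point-pattern matching (a sketch under assumptions, not the authors' algorithm) is to pair centroids across channels by minimum-cost assignment, robustly estimate the inter-channel shift, and call pairs within a tolerance associated:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

rng = np.random.default_rng(10)
ch1 = rng.uniform(0, 100, (30, 2))                  # channel-1 centroids
shift = np.array([1.5, -0.8])                       # inter-channel shift
ch2 = ch1 + shift + rng.normal(0, 0.3, ch1.shape)   # channel-2 counterparts

d = cdist(ch1, ch2)
rows, cols = linear_sum_assignment(d)               # optimal pairing
disp = ch2[cols] - ch1[rows]
shift_hat = np.median(disp, axis=0)                 # robust shift estimate
resid = np.linalg.norm(disp - shift_hat, axis=1)    # shift-corrected residuals
associated = resid < 1.0                            # tolerance in pixels
print(associated.mean())                            # fraction associated
```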

  5. Social media in epilepsy: A quantitative and qualitative analysis.

    Science.gov (United States)

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Automated quantitative analysis of coordinated locomotor behaviour in rats.

    Science.gov (United States)

    Tanger, H J; Vanwersch, R A; Wolthuis, O L

    1984-03-01

    Disturbances of motor coordination are usually difficult to quantify. Therefore, a method was developed for the automated quantitative analysis of the movements of the dyed paws of stepping rats, registered by a colour TV camera. The signals from the TV-video system were converted by an electronic interface into voltages proportional to the X- and Y-coordinates of the paws, from which a desktop computer calculated the movements of these paws in time and distance. Application 1 analysed the steps of a rat walking in a hollow rotating wheel. The results showed low variability of the walking pattern, the method was insensitive to low doses of alcohol, but was suitable to quantify overt, e.g. neurotoxic, locomotor disturbances or recovery thereof. In application 2 hurdles were placed in a similar hollow wheel and the rats were trained to step from the top of one hurdle to another. Physostigmine-induced disturbances of this acquired complex motor task could be detected at doses far below those that cause overt symptoms.

  7. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and of extracts from the hair and dander of goat, sheep, swine, horse, dog, cat and guinea pig, and to antigens of house dust, was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander, selected on the basis of case history, RAST, skin and provocation tests, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE but without allergy to cow hair and dander, and sera from five normal individuals, served as controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the major allergens, obtained by gel chromatography, were 2.4 × 10⁴, 2 × 10⁴ and 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were shown to contain groups with affinity for Con-A. (author)

  8. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...

  9. B1 -sensitivity analysis of quantitative magnetization transfer imaging.

    Science.gov (United States)

    Boudreau, Mathieu; Stikov, Nikola; Pike, G Bruce

    2018-01-01

    To evaluate the sensitivity of quantitative magnetization transfer (qMT) fitted parameters to B1 inaccuracies, focusing on the difference between two categories of T1 mapping techniques: B1-independent and B1-dependent. The B1-sensitivity of qMT was investigated and compared using two T1 measurement methods: inversion recovery (IR) (B1-independent) and variable flip angle (VFA) (B1-dependent). The study was separated into four stages: 1) numerical simulations, 2) sensitivity analysis of the Z-spectra, 3) healthy subjects at 3T, and 4) comparison using three different B1 imaging techniques. For typical B1 variations in the brain at 3T (±30%), the simulations resulted in errors of the pool-size ratio (F) ranging from -3% to 7% for VFA, and -40% to >100% for IR, agreeing with the Z-spectra sensitivity analysis. In healthy subjects, pooled whole-brain Pearson correlation coefficients for F (comparing measured double-angle and nominal flip angle B1 maps) were ρ = 0.97/0.81 for VFA/IR. This work describes the B1-sensitivity characteristics of qMT, demonstrating that the sensitivity depends substantially on the B1-dependency of the T1 mapping method. In particular, the pool-size ratio is more robust against B1 inaccuracies if VFA T1 mapping is used, so much so that B1 mapping could be omitted without substantially biasing F. Magn Reson Med 79:276-285, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  10. QSAR DataBank repository: open and linked qualitative and quantitative structure-activity relationship models.

    Science.gov (United States)

    Ruusmann, V; Sild, S; Maran, U

    2015-01-01

    Structure-activity relationship models have been used to gain insight into chemical and physical processes in biomedicine, toxicology, biotechnology, etc. for almost a century. They have been recognized as valuable tools in decision support workflows for qualitative and quantitative predictions. The main obstacle preventing broader adoption of quantitative structure-activity relationships [(Q)SARs] is that published models are still relatively difficult to discover, retrieve and redeploy in a modern computer-oriented environment. This publication describes a digital repository that makes in silico (Q)SAR-type descriptive and predictive models archivable, citable and usable in a novel way for most common research and applied science purposes. The QSAR DataBank (QsarDB) repository aims to make the processes and outcomes of in silico modelling work transparent, reproducible and accessible. Briefly, the models are represented in the QsarDB data format and stored in a content-aware repository (a.k.a. smart repository). Content awareness has two dimensions. First, models are organized into collections and then into collection hierarchies based on their metadata. Second, the repository is not only an environment for browsing and downloading models (the QDB archive) but also offers integrated services, such as model analysis and visualization and prediction making. The QsarDB repository unlocks the potential of descriptive and predictive in silico (Q)SAR-type models by allowing new and different types of collaboration between model developers and model users. The key enabling factor is the representation of (Q)SAR models in the QsarDB data format, which makes it easy to preserve and share all relevant data, information and knowledge. Model developers can become more productive by effectively reusing prior art. Model users can make more confident decisions by relying on supporting information that is larger and more diverse than before. Furthermore, the smart repository

  11. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.
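
    The mean segmented Q-value idea lends itself to a compact sketch. The following Python snippet is a simplified, assumption-laden version: Mandel's Q computed per segment and averaged. The actual MSQ estimator in the paper includes corrections (e.g., for photobleaching and partial PSF overlap) that are omitted here, and the photon counts are synthetic.

```python
import numpy as np

def mandel_q(counts):
    """Mandel's Q: excess variance over Poisson shot noise."""
    return np.var(counts) / np.mean(counts) - 1.0

def mean_segmented_q(counts, n_segments):
    """Mean Q over equal-length segments, limiting slow-drift bias."""
    segments = np.array_split(counts, n_segments)
    return np.mean([mandel_q(s) for s in segments])

rng = np.random.default_rng(0)
# Synthetic photon counts: Poisson noise on a fluctuating intensity.
intensity = 50 * (1 + 0.2 * rng.standard_normal(100_000)).clip(min=0)
counts = rng.poisson(intensity)
print("mean segmented Q:", mean_segmented_q(counts, 100))
```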

  12. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
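
    The modeling core of this workflow is ordinary multiple linear regression of a biological condition metric on landscape stressors, which then doubles as the scenario-analysis engine. A hedged Python sketch with entirely synthetic stressor data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Hypothetical per-site stressors: % mined area, % residential, road density.
X = rng.uniform(0, 1, size=(40, 3))
# Hypothetical biological condition index responding additively to stressors.
y = 80 - 25 * X[:, 0] - 10 * X[:, 1] - 5 * X[:, 2] + rng.normal(0, 3, 40)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "R^2:", round(model.score(X, y), 3))

# Scenario analysis: predict condition under a proposed future land-use mix.
future = np.array([[0.6, 0.3, 0.4]])
print("predicted condition under scenario:", model.predict(future))
```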

  13. Quantitative analysis of uncertainty from pebble flow in HTR

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hao, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Fu, Li; Jiong, Guo; Ximing, Sun; Lidong, Wang [Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China)

    2015-12-15

    Highlights: • An uncertainty and sensitivity analysis model for pebble flow has been built. • Experiment and random walk theory are used to identify uncertainty of pebble flow. • Effects of pebble flow on the core parameters are identified by sensitivity analysis. • Uncertainty of core parameters due to pebble flow is quantified for the first time. - Abstract: In pebble bed HTR, randomness exists in the flow of pebbles along the deterministic average flow lines. This randomness cannot be simulated with current reactor design codes for HTR, such as VSOP, due to the limitation of current computer capability. In order to study how the randomness of pebble flow affects the key parameters of HTR, a new pebble flow model was set up and successfully transplanted into the VSOP code. In the new pebble flow model, mixing coefficients were introduced into the fixed flow lines to simulate the randomness of pebble flow. Numerical simulations and pebble flow experiments were used to determine the mixing coefficients. Sensitivity analysis led to the conclusion that the key parameters of pebble bed HTR are not sensitive to the randomness in pebble flow. The uncertainty of maximum power density and power distribution caused by the randomness in pebble flow is very small, especially for the “multi-pass” scheme of fuel circulation adopted in the pebble bed HTR.
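
    The mixing-coefficient idea can be illustrated as a random walk superimposed on the deterministic flow line, as in the following Python sketch; the coefficient value and geometry are invented, not the calibrated values from the experiments.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pebbles, n_steps = 10_000, 200
mixing_coeff = 0.02        # hypothetical radial mixing per axial step

r0 = rng.uniform(0, 1, n_pebbles)   # normalized radial positions at loading
r = r0.copy()
for _ in range(n_steps):            # each step: one axial slab of the core
    # Random radial displacement superimposed on the fixed flow line.
    r = np.clip(r + mixing_coeff * rng.standard_normal(n_pebbles), 0, 1)

# Spread of discharge radius around the loading radius quantifies randomness.
print("std of radial displacement at discharge:", np.std(r - r0).round(3))
```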

  14. [Multiple dependent variables LS-SVM regression algorithm and its application in NIR spectral quantitative analysis].

    Science.gov (United States)

    An, Xin; Xu, Shuo; Zhang, Lu-Da; Su, Shi-Guang

    2009-01-01

    In the present paper, on the basis of the LS-SVM algorithm, we built a multiple dependent variables LS-SVM (MLS-SVM) regression model whose weights can be optimized, and gave the corresponding algorithm. Furthermore, we theoretically explained the relationship between MLS-SVM and LS-SVM. Sixty-four broomcorn samples were taken as experimental material, with a modeling set to predicting set ratio of 51:13. We first selected five weight groups randomly and uniformly in the interval [0, 1], and then used the leave-one-out (LOO) rule to determine an appropriate weight group and the model parameters (penalty and kernel parameters) according to the criterion of minimum average relative error. A multiple dependent variables quantitative analysis model was then built from the NIR spectra to analyze three chemical constituents (protein, lysine and starch) simultaneously. Finally, the average relative errors between actual values and those predicted by the model for the three components in the predicting set were 1.65%, 6.47% and 1.37%, respectively, and the correlation coefficients were 0.9940, 0.8392 and 0.8825, respectively. For comparison, LS-SVM was also utilized, for which the average relative errors were 1.68%, 6.25% and 1.47%, respectively, and the correlation coefficients were 0.9941, 0.8310 and 0.8800, respectively. MLS-SVM is thus comparable to LS-SVM in modeling performance, and both give satisfactory results. The results show that the MLS-SVM model is capable of performing multi-component NIR quantitative analysis synchronously, and thus offers a new multiple dependent variables quantitative analysis approach for chemometrics. In addition, the weights have a certain effect on the prediction performance of the MLS-SVM model, which is consistent with intuition and is validated in this study. Therefore, it is necessary to optimize
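
    The MLS-SVM weighting scheme is specific to the paper, but the single-output LS-SVM regression it builds on reduces to one linear solve of the standard dual system. A minimal Python sketch, assuming an RBF kernel and invented data:

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual: [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, dual weights alpha

def lssvm_predict(X_new, X, alpha, b, sigma=1.0):
    return rbf_kernel(X_new, X, sigma) @ alpha + b

X = np.linspace(0, 1, 30)[:, None]
y = np.sin(4 * X[:, 0]) + 0.05 * np.random.default_rng(3).standard_normal(30)
b, alpha = lssvm_fit(X, y)
rmse = np.sqrt(np.mean((lssvm_predict(X, X, alpha, b) - y) ** 2))
print("train RMSE:", round(rmse, 4))
```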

  15. Incorporation of caffeine into a quantitative model of fatigue and sleep.

    Science.gov (United States)

    Puckeridge, M; Fulcher, B D; Phillips, A J K; Robinson, P A

    2011-03-21

    A recent physiologically based model of human sleep is extended to incorporate the effects of caffeine on sleep-wake timing and fatigue. The model includes the sleep-active neurons of the hypothalamic ventrolateral preoptic area (VLPO), the wake-active monoaminergic brainstem populations (MA), their interactions with cholinergic/orexinergic (ACh/Orx) input to MA, and circadian and homeostatic drives. We model two effects of caffeine on the brain due to competitive antagonism of adenosine (Ad): (i) a reduction in the homeostatic drive and (ii) an increase in cholinergic activity. By comparing the model output to experimental data, constraints are determined on the parameters that describe the action of caffeine on the brain. In accord with experiment, the ranges of these parameters imply significant variability in caffeine sensitivity between individuals, with caffeine's effectiveness in reducing fatigue being highly dependent on an individual's tolerance, and past caffeine and sleep history. Although there are wide individual differences in caffeine sensitivity and thus in parameter values, once the model is calibrated for an individual it can be used to make quantitative predictions for that individual. A number of applications of the model are examined, using exemplar parameter values, including: (i) quantitative estimation of the sleep loss and the delay to sleep onset after taking caffeine for various doses and times; (ii) an analysis of the system's stable states showing that the wake state during sleep deprivation is stabilized after taking caffeine; and (iii) comparing model output successfully to experimental values of subjective fatigue reported in a total sleep deprivation study examining the reduction of fatigue with caffeine. This model provides a framework for quantitatively assessing optimal strategies for using caffeine, on an individual basis, to maintain performance during sleep deprivation. Copyright © 2011 Elsevier Ltd. All rights reserved.
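
    The full physiologically based model is beyond the scope of an abstract, but effect (i), caffeine transiently scaling down the effective homeostatic drive with first-order pharmacokinetics, can be caricatured in a few lines of Python. All parameter values below are invented for illustration; they are not the constrained values from the paper.

```python
import numpy as np

dt, hours = 0.01, 48.0
t = np.arange(0, hours, dt)
tau_rise, tau_fall = 18.0, 4.5      # illustrative homeostatic time constants (h)
k_caf = np.log(2) / 5.0             # ~5 h caffeine half-life (illustrative)

caffeine = np.zeros_like(t)
h = np.zeros_like(t)                # homeostatic pressure, normalized
awake = (t % 24) < 16               # simple 16 h wake / 8 h sleep schedule

for i in range(1, len(t)):
    dose = 1.0 if i == int(30.0 / dt) else 0.0      # single dose at hour 30
    caffeine[i] = caffeine[i - 1] * (1 - k_caf * dt) + dose
    target, tau = (1.0, tau_rise) if awake[i] else (0.0, tau_fall)
    h[i] = h[i - 1] + dt * (target - h[i - 1]) / tau

# Effect (i): adenosine antagonism reduces the *effective* drive.
effective = h / (1.0 + 0.8 * caffeine)
print("peak reduction in effective homeostatic drive:", (h - effective).max())
```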

  16. Quantitative analysis of LISA pathfinder test-mass noise

    International Nuclear Information System (INIS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-01-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10^-14 m s^-2/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise
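
    The Kolmogorov-Smirnov estimator mentioned above rests on the fact that, for Gaussian noise, periodogram ordinates are approximately exponentially distributed. A hedged Python sketch of the independent-data case (the paper's modified statistic for correlated spectral data is not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
x = rng.standard_normal(2 ** 14)     # synthetic stationary noise series

# One-sided periodogram; drop the DC and Nyquist bins.
p = (np.abs(np.fft.rfft(x)) ** 2 / len(x))[1:-1]

# KS test of the normalized periodogram against a unit exponential.
# Normalizing by the sample mean slightly biases the test; fine for a sketch.
d, pval = stats.kstest(p / p.mean(), "expon")
print(f"KS statistic = {d:.4f}, p-value = {pval:.3f}")
```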

  17. Patient-specific coronary blood supply territories for quantitative perfusion analysis.

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D; Greenwood, John P; Plein, Sven; Boyle, Roger D; Radjenovic, Aleksandra; Magee, Derek R

    2018-01-01

    Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contribution of this work is a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential of increasing the accuracy of perfusion analysis. Quantitative perfusion analysis diagnostic accuracy evaluation with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis.

  18. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Abstract Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contribution of this work is a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential of increasing the accuracy of perfusion analysis. Quantitative perfusion analysis diagnostic accuracy evaluation with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  19. A Transformative Model for Undergraduate Quantitative Biology Education

    OpenAIRE

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematic...

  20. A theoretical quantitative model for evolution of cancer chemotherapy resistance

    Directory of Open Access Journals (Sweden)

    Gatenby Robert A

    2010-04-01

    Full Text Available Abstract Background Disseminated cancer remains a nearly uniformly fatal disease. While a number of effective chemotherapies are available, tumors inevitably evolve resistance to these drugs ultimately resulting in treatment failure and cancer progression. Causes for chemotherapy failure in cancer treatment reside in multiple levels: poor vascularization, hypoxia, intratumoral high interstitial fluid pressure, and phenotypic resistance to drug-induced toxicity through upregulated xenobiotic metabolism or DNA repair mechanisms and silencing of apoptotic pathways. We propose that in order to understand the evolutionary dynamics that allow tumors to develop chemoresistance, a comprehensive quantitative model must be used to describe the interactions of cell resistance mechanisms and tumor microenvironment during chemotherapy. Ultimately, the purpose of this model is to identify the best strategies to treat different types of tumor (tumor microenvironment, genetic/phenotypic tumor heterogeneity, tumor growth rate, etc.). We predict that the most promising strategies are those that are both cytotoxic and apply a selective pressure for a phenotype that is less fit than that of the original cancer population. This strategy, known as double bind, is different from the selection process imposed by standard chemotherapy, which tends to produce a resistant population that simply upregulates xenobiotic metabolism. In order to achieve this goal we propose to simulate different tumor progression and therapy strategies (chemotherapy and glucose restriction) targeting stabilization of tumor size and minimization of chemoresistance. Results This work confirms the prediction of previous mathematical models and simulations that suggested that administration of chemotherapy with the goal of tumor stabilization instead of eradication would yield better results (longer subject survival) than the use of maximum tolerated doses. Our simulations also indicate that the
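
    The stabilization-versus-eradication argument can be caricatured with a two-compartment growth model of drug-sensitive and resistant cells under two dosing policies. The rates below are invented and the model is far simpler than the simulations described above; it is only meant to make the comparison concrete.

```python
import numpy as np

def simulate(dose_policy, days=300.0, dt=0.1):
    """Logistic competition: sensitive (S) killed by drug, resistant (R) not."""
    S, R, K = 0.9, 0.01, 1.0
    for t in np.arange(0.0, days, dt):
        dose = dose_policy(t, S + R)
        room = 1 - (S + R) / K                  # shared carrying capacity
        S += dt * S * (0.30 * room - 1.0 * dose)
        R += dt * R * 0.05 * room               # slower-growing but immune
    return S + R, R / (S + R)

mtd = lambda t, burden: 1.0 if (t % 28) < 5 else 0.0        # cyclic max dose
stabilize = lambda t, burden: 1.0 if burden > 0.5 else 0.0  # hold burden

for name, policy in [("MTD", mtd), ("stabilization", stabilize)]:
    burden, resistant = simulate(policy)
    print(f"{name}: final burden {burden:.2f}, resistant fraction {resistant:.2f}")
```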

  1. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  2. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and ovariectomy (OVX) group whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 onwards. Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter Ktrans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
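
    The quantitative parameters reported here (Ktrans, Ve, Kep, with Ve = Ktrans/Kep) are those of the standard Tofts pharmacokinetic model, although the abstract does not spell out the fitting procedure. A hedged Python sketch with a toy arterial input function and synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 120)                 # minutes
aif = 5.0 * t * np.exp(-t / 0.3)           # toy arterial input function Cp(t)

def tofts(t, ktrans, kep):
    """Standard Tofts model: Ct = Ktrans * (Cp convolved with exp(-Kep t))."""
    dt = t[1] - t[0]
    return ktrans * np.convolve(aif, np.exp(-kep * t))[:len(t)] * dt

rng = np.random.default_rng(5)
ct = tofts(t, 0.12, 0.60) + 0.002 * rng.standard_normal(len(t))

(ktrans, kep), _ = curve_fit(tofts, t, ct, p0=(0.05, 0.5))
print(f"Ktrans = {ktrans:.3f} /min, Kep = {kep:.3f} /min, "
      f"Ve = {ktrans / kep:.3f}")
```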

  3. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative psychological research on personality carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of the psychological personality assessment of Chinese nuclear power plant (NPP) operators, based on an MMPI survey, outlines the main contents of quantitative personality research in Chinese NPPs, and emphasizes the need to carry out psychological selection and training in the nuclear industry

  4. Quantitative analysis of the individual dynamics of Psychology theses

    Directory of Open Access Journals (Sweden)

    Robles, Jaime R.

    2009-12-01

    Full Text Available Three cohorts of undergraduate psychology theses (n = 57) performed by final-year undergraduate psychology students from Universidad Católica Andrés Bello were monitored using 5 longitudinal measurements of progression. A Generalized Additive Model, predicting the completion time of the theses, is tested against two completion times: early and delayed. Effect size measures favor a multiple-dimension model over a global progress model. The trajectory of the indicators through the 5 measurements allows the differentiation between early and delayed completion. The completion probabilities estimated by the dimensional model allow the identification of differential oscillation levels for the distinct completion times. The initial progression indicators allow the prediction of early completion with a 71% success rate, while the final measurement shows a success rate of 89%. The results support the effectiveness of the supervisory system and the analysis of the progression dynamics of the theses from a task-delay model, focused on the relationship between the amount of task completion and the deadlines.

  5. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of vegetable sale prices that are constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (from low to high daily consumption); and (2) vegetable age (how long it lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group, with product age a+1 and shelf life t+1. A vegetable age of (a) = 0 means the vegetables last only one day after they are ordered and must then be discarded, whereas a+1 means they last longer than a day, as for beet, white radish, and string beans. The shelf life refers to how long a vegetable is kept on a supermarket shelf, in line with its age. Under the cost-plus pricing method with a full costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of the non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is related to the vegetable characteristics.
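
    The cost-plus rule described above amounts to a one-line formula, sketched here in Python with invented cost figures; only the non-leafy category carries a holding cost.

```python
def sale_price(production, non_production, markup, holding_cost=0.0):
    """Cost-plus (full costing) price: all unit costs plus a markup fraction."""
    return (production + non_production + holding_cost) * (1 + markup)

# Leafy vegetable (age a = 0): one-day shelf life, no holding cost.
print("spinach:", sale_price(production=1.00, non_production=0.30, markup=0.25))

# Non-leafy vegetable (age a+1): longer shelf life, holding cost applies.
print("beet:   ", sale_price(1.00, 0.30, 0.25, holding_cost=0.10))
```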

  6. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  7. Herd immunity and pneumococcal conjugate vaccine: a quantitative model.

    Science.gov (United States)

    Haber, Michael; Barskey, Albert; Baughman, Wendy; Barker, Lawrence; Whitney, Cynthia G; Shaw, Kate M; Orenstein, Walter; Stephens, David S

    2007-07-20

    Invasive pneumococcal disease in older children and adults declined markedly after introduction in 2000 of the pneumococcal conjugate vaccine for young children. An empirical quantitative model was developed to estimate the herd (indirect) effects on the incidence of invasive disease among persons ≥5 years of age induced by vaccination of young children with 1, 2, or ≥3 doses of the pneumococcal conjugate vaccine, Prevnar (PCV7), containing serotypes 4, 6B, 9V, 14, 18C, 19F and 23F. From 1994 to 2003, cases of invasive pneumococcal disease were prospectively identified in Georgia Health District-3 (eight metropolitan Atlanta counties) by Active Bacterial Core surveillance (ABCs). From 2000 to 2003, vaccine coverage levels of PCV7 for children aged 19-35 months in Fulton and DeKalb counties (of Atlanta) were estimated from the National Immunization Survey (NIS). Based on incidence data and the estimated average number of doses received by 15 months of age, a Poisson regression model was fit, describing the trend in invasive pneumococcal disease in groups not targeted for vaccination (i.e., adults and older children) before and after the introduction of PCV7. Highly significant declines in all the serotypes contained in PCV7 in all unvaccinated populations (5-19, 20-39, 40-64, and >64 years) from 2000 to 2003 were found under the model. No significant change in incidence was seen from 1994 to 1999, indicating rates were stable prior to vaccine introduction. Among unvaccinated persons 5+ years of age, the modeled incidence of disease caused by PCV7 serotypes as a group dropped 38.4%, 62.0%, and 76.6% for 1, 2, and 3 doses, respectively, received on average by the population of children by the time they are 15 months of age. Incidence of serotypes 14 and 23F had consistent significant declines in all unvaccinated age groups. In contrast, the herd immunity effects on vaccine-related serotype 6A incidence were inconsistent. Increasing trends of non
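
    The core of the herd-effect estimate is a Poisson regression of unvaccinated-group incidence on the average number of doses received by young children, so that exp(beta) is a rate ratio per dose. A hedged Python sketch with invented counts (not the ABCs surveillance data):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical yearly data: cases among unvaccinated adults, population at
# risk, and average PCV7 doses received by 15 months of age in that year.
cases = np.array([120, 118, 122, 119, 95, 75, 60, 50])
pop = np.full(8, 1_000_000.0)
doses = np.array([0.0, 0.0, 0.0, 0.0, 0.8, 1.6, 2.2, 2.6])

X = sm.add_constant(doses)
fit = sm.GLM(cases, X, family=sm.families.Poisson(),
             offset=np.log(pop)).fit()
decline = 100 * (1 - np.exp(fit.params[1]))
print(f"incidence decline per average dose: {decline:.1f}%")
```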

  8. Quantitative genetic analysis of anxiety trait in bipolar disorder.

    Science.gov (United States)

    Contreras, J; Hare, E; Chavarría, G; Raventós, H

    2018-01-01

    Bipolar disorder type I (BPI) affects approximately 1% of the world population. Although genetic influences on bipolar disorder are well established, identification of genes that predispose to the illness has been difficult. Most genetic studies are based on categorical diagnosis. One strategy to overcome this obstacle is the use of quantitative endophenotypes, as has been done for other medical disorders. We studied 619 individuals, 568 participants from 61 extended families and 51 unrelated healthy controls. The sample was 55% female and had a mean age of 43.25 (SD 13.90; range 18-78). Heritability and genetic correlation of the trait scale from the Anxiety State and Trait Inventory (STAI) was computed by using the general linear model (SOLAR package software). We observed that anxiety trait meets the following criteria for an endophenotype of bipolar disorder type I (BPI): 1) association with BPI (individuals with BPI showed the highest trait score: F = 15.20 [5,24], p = 0.009); 2) state-independence, confirmed after conducting a test-retest in 321 subjects; 3) co-segregation within families; 4) heritability of 0.70 (SE 0.060, p = 2.33 × 10^-14); and 5) genetic correlation with BPI of 0.20 (SE = 0.17, p = 3.12 × 10^-5). Confounding factors such as comorbid disorders and pharmacological treatment could affect the clinical relationship between BPI and anxiety trait. Further research is needed to evaluate if anxiety traits are specially related to BPI in comparison with other traits such as anger, attention or response inhibition deficit, pathological impulsivity or low self-directedness. Anxiety trait is a heritable phenotype that follows a normal distribution when measured not only in subjects with BPI but also in unrelated healthy controls. It could be used as an endophenotype in BPI for the identification of genomic regions with susceptibility genes for this disorder. Published by Elsevier B.V.

  9. [Quantitative analysis of thiram by surface-enhanced raman spectroscopy combined with feature extraction Algorithms].

    Science.gov (United States)

    Zhang, Bao-hua; Jiang, Yong-cheng; Sha, Wen; Zhang, Xian-yi; Cui, Zhi-feng

    2015-02-01

    Three feature extraction algorithms, principal component analysis (PCA), the discrete cosine transform (DCT), and non-negative matrix factorization (NMF), were used to extract the main information of the spectral data in order to weaken the influence of spectral fluctuation on the subsequent quantitative analysis results, based on the SERS spectra of the pesticide thiram. The extracted components were then combined with a linear regression algorithm, partial least squares regression (PLSR), and a non-linear regression algorithm, support vector machine regression (SVR), to develop the quantitative analysis models. Finally, the effect of the different feature extraction algorithms on the two kinds of regression algorithm was evaluated using 5-fold cross-validation. The experiments demonstrate that the analysis results of SVR are better than those of PLSR, owing to the non-linear relationship between the intensity of the SERS spectrum and the concentration of the analyte. Further, the feature extraction algorithms significantly improve the analysis results regardless of the regression algorithm, mainly because they extract the main information of the source spectral data and eliminate the fluctuation. Additionally, PCA performs best with the linear regression model and NMF best with the non-linear model, and the predictive error can be reduced by nearly a factor of three in the best case. The root mean square error of cross-validation of the best regression model (NMF+SVR) is 0.0455 μmol·L^-1 (i.e., 10^-6 mol·L^-1 scale), which attains the national detection limit for thiram, so the method in this study provides a novel approach for the fast detection of thiram. In conclusion, the study provides experimental references for selecting feature extraction algorithms in the analysis of SERS spectra, and some common findings on feature extraction can also help in the processing of other kinds of spectroscopy data.
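
    The feature-extraction-plus-regression pipelines compared above map directly onto scikit-learn components. A hedged sketch with synthetic stand-in spectra (the real study used measured SERS spectra of thiram, and DCT is omitted here for brevity):

```python
import numpy as np
from sklearn.decomposition import PCA, NMF
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(6)
# Synthetic stand-in spectra: 60 samples x 500 channels, one concentration-
# dependent peak plus fluctuation noise; forced non-negative for NMF.
conc = rng.uniform(0.1, 5.0, 60)
axis = np.linspace(0, 1, 500)
peak = np.exp(-((axis - 0.4) ** 2) / 0.001)
spectra = np.abs(conc[:, None] * peak + 0.05 * rng.standard_normal((60, 500)))

cv = KFold(5, shuffle=True, random_state=0)
for name, model in [("PCA+SVR", make_pipeline(PCA(10), SVR())),
                    ("NMF+SVR", make_pipeline(NMF(10, max_iter=1000), SVR())),
                    ("PLSR", PLSRegression(10))]:
    rmse = -cross_val_score(model, spectra, conc, cv=cv,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"{name}: 5-fold CV RMSE = {rmse:.3f}")
```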

  10. Risco privado em infra-estrutura pública: uma análise quantitativa de risco como ferramenta de modelagem de contratos Private risk in public infrastructure: a quantitative risk analysis as a contract modeling tool

    Directory of Open Access Journals (Sweden)

    Luiz E. T. Brandão

    2007-12-01

    Full Text Available Public private partnerships (PPP) are contractual arrangements in which the government assumes future obligations by providing project guarantees. They are considered a way of increasing government efficiency through a more efficient allocation of risks and incentives. On the other hand, the assessment and determination of the optimal level of these guarantees is usually subjective, exposing the government to potentially high future liabilities. This article proposes a quantitative model for the evaluation of government guarantees in PPP projects under the real options approach, and applies this model to a toll highway concession with a minimum revenue guarantee. It studies the impact of different guarantee levels on the value and the risk of the project, as well as the expected level of future cash payments to be made by the government in
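
    Under the real options approach, a minimum revenue guarantee is a strip of put options on yearly revenue and can be valued by Monte Carlo simulation. A hedged Python sketch with invented parameters (the paper's calibration and risk-neutral adjustments are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(7)
r, sigma, years, n_paths = 0.08, 0.25, 15, 20_000
revenue0, floor = 100.0, 80.0     # hypothetical yearly toll revenue and floor

# Geometric Brownian motion revenue paths, one step per year.
z = rng.standard_normal((n_paths, years))
revenue = revenue0 * np.cumprod(np.exp((r - 0.5 * sigma ** 2) + sigma * z),
                                axis=1)

# Guarantee payoff each year: the government tops revenue up to the floor.
payoff = np.maximum(floor - revenue, 0.0)
discount = np.exp(-r * np.arange(1, years + 1))
print("guarantee value:", (payoff * discount).sum(axis=1).mean().round(2))
print("undiscounted expected outlay:", payoff.sum(axis=1).mean().round(2))
```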

  11. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    Science.gov (United States)

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T

  12. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Science.gov (United States)

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and

  13. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino eSalminen

    2015-10-01

    Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9±9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e. the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary
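
    The odds ratios in the three records above come from logistic regression of periodontitis status on bacterial levels with confounder adjustment. A hedged Python sketch on synthetic data (effect sizes invented, confounders reduced to age for brevity):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 462
burden = rng.normal(5.0, 1.0, n)      # synthetic log10 salivary burden index
age = rng.normal(63.0, 9.0, n)
lin = -2.0 + 0.9 * (burden - 5.0) + 0.03 * (age - 63.0)   # invented effects
perio = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)

X = sm.add_constant(np.column_stack([burden, age]))
fit = sm.Logit(perio, X).fit(disp=0)
or_burden = np.exp(fit.params[1])
ci = np.exp(fit.conf_int()[1])
print(f"OR per unit burden: {or_burden:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```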

  14. The place of quantitative energy models in a prospective approach

    International Nuclear Information System (INIS)

    Taverdet-Popiolek, N.

    2009-01-01

    Futurology above all depends on having the right mind set. Gaston Berger summarizes the prospective approach in five main thrusts: prepare for the distant future, be open-minded (have a systems and multidisciplinary approach), carry out in-depth analyses (draw out the actors that really determine the future, as well as established trends), take risks (imagine risky but flexible projects) and finally think about humanity, futurology being a technique at the service of man to help him build a desirable future. Forecasting, on the other hand, is based on quantified models from which 'conclusions' about the future are deduced. In the field of energy, models are used to draw up scenarios which allow, for instance, measuring the medium- or long-term effects of energy policies on greenhouse gas emissions or global welfare. Scenarios are shaped by the model's inputs (parameters, sets of assumptions) and outputs. Resorting to a model or projecting by scenario is useful in a prospective approach as it ensures coherence for most of the variables that have been identified through systems analysis and that the mind on its own has difficulty grasping. Interpretation of each scenario must be carried out in the light of the underlying framework of assumptions (the backdrop), developed during the prospective stage. When the horizon is far away (very long term), the worlds imagined by the futurologist contain breaks (technological, behavioural and organizational) which are hard to integrate into the models. Herein lies the main limit of the use of models in futurology. (author)

  15. CSML2SBML: a novel tool for converting quantitative biological pathway models from CSML into SBML.

    Science.gov (United States)

    Li, Chen; Nagasaki, Masao; Ikeda, Emi; Sekiya, Yayoi; Miyano, Satoru

    2014-07-01

    CSML and SBML are XML-based model definition standards developed with the aim of creating exchange formats for modeling, visualizing and simulating biological pathways. In this article we report the release of a format convertor for quantitative pathway models, namely CSML2SBML. It translates models encoded in CSML into SBML without loss of structural and kinetic information. Simulation and parameter estimation of the resulting SBML model can be carried out with the compliant tool CellDesigner for further analysis. The convertor is based on the standards CSML version 3.0 and SBML Level 2 Version 4. In our experiments, 11 out of 15 pathway models in the CSML model repository and 228 models in the Macrophage Pathway Knowledgebase (MACPAK) are successfully converted to SBML models. The consistency of the resulting models is validated by the libSBML Consistency Check of CellDesigner. Furthermore, a converted SBML model, assigned the kinetic parameters translated from the CSML model, reproduces in CellDesigner the same dynamics as the CSML model running on Cell Illustrator. CSML2SBML, along with its instructions and examples for use, is available at http://csml2sbml.csml.org. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various
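
    A few of the quantitative comparison measures alluded to above can be packaged in a handful of lines. The measures chosen here (bias, RMSE, geometric mean bias, correlation) are common in model validation exercises, though the BIOMOVS II guidance itself lists more; the data values are invented.

```python
import numpy as np

def comparison_measures(predicted, observed):
    """Basic quantitative measures for comparing model predictions to data."""
    resid = predicted - observed
    return {
        "mean bias": resid.mean(),
        "RMSE": np.sqrt((resid ** 2).mean()),
        # Geometric mean bias suits concentration-type, log-spread quantities.
        "geometric mean bias": np.exp(np.mean(np.log(predicted / observed))),
        "correlation": np.corrcoef(predicted, observed)[0, 1],
    }

obs = np.array([12.0, 30.0, 55.0, 80.0, 120.0])    # e.g. observed Cs-137
pred = np.array([10.0, 36.0, 50.0, 90.0, 100.0])   # one model's predictions
for name, value in comparison_measures(pred, obs).items():
    print(f"{name}: {value:.3f}")
```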

  17. Quantitative mineralogical analysis of small samples of china clay using X-ray diffractometry.

    OpenAIRE

    Salt, P D

    1985-01-01

    The quantitative mineralogical analysis of small samples (less than 20 mg) of china clay has been investigated using X-ray diffractometry to determine kaolinite, mica, quartz, and feldspar. A method has been developed and applied to the quantitative analysis of airborne dust samples and of other small discrete samples. Determinations were made either on samples after collection on a membrane filter or on samples after deposition from aqueous suspension on to a silver substrate. Quantitative a...

  18. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, the characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  19. Comparison of the quantitative analysis performance between pulsed voltage atom probe and pulsed laser atom probe

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, J., E-mail: takahashi.3ct.jun@jp.nssmc.com [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Kawakami, K. [Advanced Technology Research Laboratories, Nippon Steel & Sumitomo Metal Corporation, 20-1 Shintomi, Futtsu-city, Chiba 293-8511 (Japan); Raabe, D. [Max-Planck Institut für Eisenforschung GmbH, Department for Microstructure Physics and Alloy Design, Max-Planck-Str. 1, 40237 Düsseldorf (Germany)

    2017-04-15

    Highlights: • Quantitative analysis in Fe-Cu alloy was investigated in voltage and laser atom probe. • In voltage-mode, apparent Cu concentration exceeded actual concentration at 20–40 K. • In laser-mode, the concentration never exceeded the actual concentration even at 20 K. • Detection loss was prevented due to the rise in tip surface temperature in laser-mode. • Preferential evaporation of solute Cu was reduced in laser-mode. - Abstract: The difference in quantitative analysis performance between the voltage-mode and laser-mode of a local electrode atom probe (LEAP3000X HR) was investigated using a Fe-Cu binary model alloy. Solute copper atoms in ferritic iron preferentially field evaporate because of their significantly lower evaporation field than the matrix iron, and thus, the apparent concentration of solute copper tends to be lower than the actual concentration. However, in voltage-mode, the apparent concentration was higher than the actual concentration at 40 K or less due to a detection loss of matrix iron, and the concentration decreased with increasing specimen temperature due to the preferential evaporation of solute copper. On the other hand, in laser-mode, the apparent concentration never exceeded the actual concentration, even at lower temperatures (20 K), and this mode showed better quantitative performance over a wide range of specimen temperatures. These results indicate that the pulsed laser atom probe prevents both detection loss and preferential evaporation under a wide range of measurement conditions.

  20. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    Science.gov (United States)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and fabrics. The optical absorption coefficient, at a wavelength of 660 nm, was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
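
    The Beer-Lambert check reported above reduces to verifying that the measured absorption coefficient grows linearly with concentration. A minimal Python sketch with invented data points (the upper value near 220 cm^-1 only echoes the range quoted in the abstract):

```python
import numpy as np

# Hypothetical optical absorption coefficients (cm^-1) at 660 nm for
# increasing pigment concentrations (arbitrary units).
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
alpha = np.array([14.1, 27.5, 56.2, 109.8, 221.3])

# Beer-Lambert: alpha = epsilon * conc, a straight line through the origin.
epsilon = (conc @ alpha) / (conc @ conc)   # least-squares slope, no intercept
deviation = np.max(np.abs(alpha - epsilon * conc) / alpha)
print(f"epsilon = {epsilon:.2f} cm^-1 per concentration unit")
print(f"max deviation from linearity: {100 * deviation:.1f}%")
```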

  1. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Passaro, Bruno Martins

    2011-01-01

    The linear accelerators represent the most important, practical and versatile source of ionizing radiation in radiotherapy. These functional characteristics influence the geometric and dosimetric accuracy of therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures or mechanical breakdowns, or due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on quality control by the institutions to monitor deviations in the beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a quality control program in radiotherapy. The average calibration factors of the accelerators over a period of approximately four years were (0.998 ± 0.012) for the Clinac 600C and (0.996 ± 0.014) for the Clinac 6EX, and (1.008 ± 0.009) and (1.006 ± 0.010) for the Clinac 2100CD at 6 MV and 15 MV, respectively. Statistical analysis of the three linear accelerators found that the coefficients of variation of the calibration factors were below 2%, which shows consistency in the data. From the normal distribution of the calibration factors, for the Clinac 600C and Clinac 2100CD the expected probability that values fall within the acceptable limits of TG-142 exceeds 90%, while for the Clinac 6EX it is around 85%, since this accelerator had several exchanges of components. The values of TPR20,10 of the three accelerators are practically constant and within acceptable limits according to TG-142. It can be concluded that a detailed study of the accelerator calibration factor data and TPR20,10, from a quantitative point of view, is extremely useful in a quality assurance program. (author)
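
    The "expected probability within acceptable limits" figures above follow directly from the fitted normal distributions. Using the means and standard deviations quoted in the abstract and assuming a ±2% tolerance window (an assumption, since the abstract does not state the exact TG-142 limit used), the calculation is a two-line exercise:

```python
from scipy import stats

def fraction_within(mean, std, tol=0.02, target=1.0):
    """P(calibration factor in target +/- tol) under a normal assumption."""
    return (stats.norm.cdf(target + tol, mean, std)
            - stats.norm.cdf(target - tol, mean, std))

for name, mean, std in [("Clinac 600C", 0.998, 0.012),
                        ("Clinac 6EX", 0.996, 0.014),
                        ("Clinac 2100CD 6 MV", 1.008, 0.009)]:
    print(f"{name}: {100 * fraction_within(mean, std):.0f}% within +/-2%")
```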

  2. Development and Validation of Quantitative Structure-Activity Relationship Models for Compounds Acting on Serotoninergic Receptors

    Directory of Open Access Journals (Sweden)

    Grażyna Żydek

    2012-01-01

    Full Text Available A quantitative structure-activity relationship (QSAR) study has been made on 20 compounds with serotonin (5-HT) receptor affinity. Thin-layer chromatographic (TLC) data and physicochemical parameters were applied in this study. RP2 TLC 60F254 plates (silanized), impregnated with solutions of propionic acid, ethylbenzene, 4-ethylphenol, and propionamide (used as analogues of the key receptor amino acids) and their mixtures (denoted as S1–S7), were used in two developing phases as biochromatographic models of drug-5-HT receptor interaction. The semiempirical method AM1 (HyperChem v. 7.0 program) and the ACD/Labs v. 8.0 program were employed to calculate a set of physicochemical parameters for the investigated compounds. Correlation and multiple linear regression analysis were used to search for the best QSAR equations. The correlations obtained for the compounds studied represent their interactions with the proposed biochromatographic models. The good multivariate relationships (R2 = 0.78–0.84) obtained by means of regression analysis can be used for predicting the quantitative effect of biological activity of different compounds with 5-HT receptor affinity. “Leave-one-out” (LOO) and “leave-N-out” (LNO) cross-validation methods were used to judge the predictive power of the final regression equations.
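
    The leave-one-out validation used above is conveniently expressed as a cross-validated Q² statistic. A hedged Python sketch on synthetic descriptors (the study's TLC and AM1 descriptors are not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(9)
# Synthetic stand-in: 20 compounds x 4 descriptors.
X = rng.standard_normal((20, 4))
y = X @ np.array([0.8, -0.5, 0.3, 0.0]) + 0.2 * rng.standard_normal(20)

y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
press = np.sum((y - y_loo) ** 2)
q2 = 1 - press / np.sum((y - y.mean()) ** 2)
print(f"LOO cross-validated Q^2 = {q2:.3f}")
```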

  3. Machine learning methods for quantitative analysis of Raman spectroscopy data

    Science.gov (United States)

    Madden, Michael G.; Ryder, Alan G.

    2003-03-01

    The automated identification and quantification of illicit materials using Raman spectroscopy is of significant importance for law enforcement agencies. This paper explores the use of Machine Learning (ML) methods in comparison with standard statistical regression techniques for developing automated identification methods. In this work, the ML task is broken into two sub-tasks, data reduction and prediction. In well-conditioned data, the number of samples should be much larger than the number of attributes per sample, to limit the degrees of freedom in predictive models. In this spectroscopy data, the opposite is normally true. Predictive models based on such data have a high number of degrees of freedom, which increases the risk of models over-fitting to the sample data and having poor predictive power. In the work described here, an approach to data reduction based on Genetic Algorithms is described. For the prediction sub-task, the objective is to estimate the concentration of a component in a mixture, based on its Raman spectrum and the known concentrations of previously seen mixtures. Here, Neural Networks and k-Nearest Neighbours are used for prediction. Preliminary results are presented for the problem of estimating the concentration of cocaine in solid mixtures, and compared with previously published results in which statistical analysis of the same dataset was performed. Finally, this paper demonstrates how more accurate results may be achieved by using an ensemble of prediction techniques.
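
    The prediction sub-task can be sketched as follows, with placeholder spectra and assuming the Genetic Algorithm reduction step has already selected the wavenumbers; k-Nearest Neighbours is one of the two predictors compared in the paper:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X_train = rng.random((40, 50))  # placeholder: 40 mixtures x 50 GA-selected wavenumbers
y_train = rng.random(40)        # placeholder: known cocaine concentrations

# k-NN regression: a new spectrum is assigned the average concentration
# of its k most similar training spectra
knn = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=3))
knn.fit(X_train, y_train)

X_new = rng.random((1, 50))     # spectrum of an unseen mixture
print("predicted concentration:", knn.predict(X_new)[0])
```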

  4. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
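
    A sketch of the chemometric steps (PCA scores plus a hierarchical clustering cut into three groups, matching the grouping reported above) on a placeholder peak-area matrix; Ward linkage is an assumption, as the record does not state the linkage used:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
peaks = rng.random((19, 18))              # placeholder: 19 batches x 18 common peaks
X = StandardScaler().fit_transform(peaks)

scores = PCA(n_components=2).fit_transform(X)  # PCA projection of the batches

# HCA: cluster the batches and cut the dendrogram into three groups
groups = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
print(groups)   # cluster label (1-3) per batch, cf. the three groups found
```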

  5. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    forecasting of quantitative snowfall at 10 meteorological stations in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. At these stations of the Snow and Avalanche Study Establishment (SASE), snow and meteorological data have been recorded twice daily at 08:30 and 17:30 hrs for more than the last four decades ...

  6. A Transformative Model for Undergraduate Quantitative Biology Education

    Science.gov (United States)

    Usher, David C.; Driscoll, Tobin A.; Dhurjati, Prasad; Pelesko, John A.; Rossi, Louis F.; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B.

    2010-01-01

    The "BIO2010" report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3)…

  7. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  8. Quantitative Analysis and Design of a Rudder Roll Damping Controller

    DEFF Research Database (Denmark)

    Hearns, G.; Blanke, M.

    1998-01-01

    A rudder roll damping controller is designed using Quantitative Feedback Theory to be robust to changes in the ship's metacentric height. The analytical constraint due to the non-minimum phase behaviour of the rudder-to-roll response is analysed using the Poisson Integral Formula and it is shown how...

  9. Quantitative trait loci analysis of individual and total isoflavone ...

    Indian Academy of Sciences (India)

    Soybean isoflavones play diverse roles in human health, including cancers, osteoporosis, heart disease, menopausal symptoms and pabulums. The objective of this study was to identify the quantitative trait loci (QTL) associated with the isoflavones daidzein (DC), genistein (GeC), glycitein (GlC) and total isoflavone ...

  10. Quantitative analysis of soluble elements in environmental waters by PIXE

    International Nuclear Information System (INIS)

    Niizeki, T.; Kawasaki, K.; Adachi, M.; Tsuji, M.; Hattori, T.

    1999-01-01

    We have started PIXE research for environmental science at the Van de Graaff accelerator facility of the Tokyo Institute of Technology. Quantitative measurements of soluble fractions in river waters have been carried out using the preconcentration method developed at Tohoku University. We show that this PIXE target preparation can also be applied to waste water samples. (author)

  11. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    2012-04-17

    these techniques can examine epistasis and interactions of QTL with genetic background, which underlie quantitative traits. Second-generation mapping resources propose to address many of the limitations associated with conventional mapping populations.

  12. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  13. Quantitative morphometrical analysis of a North African population ...

    Indian Academy of Sciences (India)

    2008-12-23

    Genetic variability of quantitative traits was investigated in a Moroccan population of Drosophila melanogaster, with an isofemale line design. Results were compared with data previously obtained from French populations. Although the environmental and thermal conditions are very different in France and ...

  14. Quantitative morphometrical analysis of a North African population ...

    Indian Academy of Sciences (India)

    Genetic variability of quantitative traits was investigated in a Moroccan population of Drosophila melanogaster, with an isofemale line design. Results were compared with data previously obtained from French populations. Although the environmental and thermal conditions are very different in France and Morocco, only two ...

  15. Quantitative trait loci analysis of individual and total isoflavone ...

    Indian Academy of Sciences (India)

    2014-08-19

    Soybean isoflavones play diverse roles in human health, including cancers, osteoporosis, heart disease, menopausal symptoms and pabulums. The objective of this study was to identify the quantitative trait loci (QTL) associated with the isoflavones daidzein (DC), genistein (GeC), glycitein (GlC) ...

  16. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    SAM

    2014-05-14

    Protein and fat percent as contents of milk are high-priority criteria for financial aims and selection programs in dairy cattle ... Key words: fat percent, Iranian Holstein cattle, microsatellites, milking days, protein percent, quantitative trait locus (QTL).

  17. Quantitative analysis of gait pattern in hemiparetic patients | Zverev ...

    African Journals Online (AJOL)

    Objective: To quantitatively characterise the gait pattern of hemiparetic patients using the clinical footprint method. Design: A case control study. Subjects: Sixteen hemiparetic patients (12 males and 4 females) aged 16 to 64 years who attended the neurological clinic at Queen Elizabeth Central Hospital, Blantyre, Malawi.

  18. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    Science.gov (United States)

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)

  19. Quantitative analysis of fish wake dynamics using volumetric PIV data

    Science.gov (United States)

    Mendelson, Leah; Techet, Alexandra

    2013-11-01

    In the study of swimming hydrodynamics, the fluid impulse in the wake is used to quantify the momentum transferred by the fish as it swims. This impulse is typically computed from planar PIV measurements of the wake circulation and geometry by assuming an axisymmetric vortex ring model. However, in many propulsive and maneuvering scenarios, three-dimensional effects are of substantial importance, and wake features are not often an isolated, symmetric vortex ring. Volumetric PIV data provides a complete measure of the vortex geometry and orientation, and circulation can be determined over multiple planar slices through the volume. Using sample datasets obtained from synthetic aperture PIV (SAPIV), we demonstrate how the availability of volumetric PIV data enables more detailed analysis of hydrodynamic impulse and characterize the uncertainty created by planar measurements. Special attention is paid to unsteady maneuvering behaviors that generate asymmetric and linked wake features.
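
    The record does not spell out the vortex-ring model; the standard thin-core form used to estimate impulse from planar circulation measurements is:

```latex
% Hydrodynamic impulse of an axisymmetric vortex ring of radius R and
% circulation \Gamma in a fluid of density \rho (thin-core approximation):
I = \rho \, \Gamma \, \pi R^{2}
```

    Volumetric data relaxes the axisymmetry assumption: circulation and ring geometry can be measured on multiple slices through the volume, which is how the uncertainty of planar measurements is characterized here.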

  20. QUANTITATIVE ANALYSIS OF BANDED STRUCTURES IN DUAL-PHASE STEELS

    Directory of Open Access Journals (Sweden)

    Benoit Krebs

    2011-05-01

    Dual-Phase (DP) steels are composed of martensite islands dispersed in a ductile ferrite matrix, which provides a good balance between strength and ductility. Current processing conditions (continuous casting followed by hot and cold rolling) generate 'banded structures', i.e., irregular, parallel and alternating bands of ferrite and martensite, which are detrimental to mechanical properties and especially to in-use properties. We present an original and simple method to quantify the intensity and wavelength of these bands. This method, based on the analysis of the covariance function of binary images, is first tested on model images. It is compared with the ASTM E-1268 standard and appears to be more robust. It is then applied to real DP steel microstructures and proves to be sufficiently sensitive to discriminate samples resulting from different thermo-mechanical routes.

  1. Quantitative Analysis of Spectral Impacts on Silicon Photodiode Radiometers: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Myers, D. R.

    2011-04-01

    Inexpensive broadband pyranometers with silicon photodiode detectors have a non-uniform spectral response over the spectral range of 300-1100 nm. The response region includes only about 70% to 75% of the total energy in the terrestrial solar spectral distribution from 300 nm to 4000 nm. The solar spectrum constantly changes with solar position and atmospheric conditions. Relative spectral distributions of diffuse hemispherical irradiance sky radiation and total global hemispherical irradiance are drastically different. This analysis convolves a typical photodiode response with SMARTS 2.9.5 spectral model spectra for different sites and atmospheric conditions. Differences in solar component spectra lead to differences on the order of 2% in global hemispherical and 5% or more in diffuse hemispherical irradiances from silicon radiometers. The result is that errors of more than 7% can occur in the computation of direct normal irradiance from global hemispherical irradiance and diffuse hemispherical irradiance using these radiometers.
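
    The convolution described here amounts to weighting a model spectrum by the detector's relative spectral response and integrating; a minimal sketch with invented response and spectrum arrays (SMARTS output would replace them in practice):

```python
import numpy as np
from scipy.integrate import trapezoid

wl = np.linspace(300, 4000, 3701)                # wavelength grid, nm
E = np.exp(-((wl - 800) / 600) ** 2)             # placeholder spectral irradiance
# placeholder silicon photodiode response, nonzero only in ~300-1100 nm
R = np.where((wl >= 300) & (wl <= 1100), np.clip((wl - 300) / 650, 0, 1), 0.0)

total = trapezoid(E, wl)          # broadband irradiance over 300-4000 nm
seen = trapezoid(E * R, wl)       # response-weighted irradiance
print(f"fraction of energy the photodiode responds to: {seen / total:.0%}")
```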

  2. The proton induced X-ray emission (PIXE) for the quantitative analysis of elements in thin samples, in surface layers of thick samples, and in aerosol filters

    International Nuclear Information System (INIS)

    Waetjen, U.

    1983-01-01

    The PIXE analysis method for the determination of elements in thick samples was investigated. The text of the present thesis is arranged under the following headings: physical fundamentals and measuring equipment, quantitative analysis of thin samples, matrix effects in the PIXE analysis of thick samples, matrix correction methods, analysis of 'infinitely thick' model substances, and PIXE analysis of aerosol filters. (GSCH)

  3. Quantitative atom probe analysis of nanostructure containing clusters and precipitates with multiple length scales

    International Nuclear Information System (INIS)

    Marceau, R.K.W.; Stephenson, L.T.; Hutchinson, C.R.; Ringer, S.P.

    2011-01-01

    A model Al-3Cu-(0.05 Sn) (wt%) alloy containing a bimodal distribution of relatively shear-resistant θ' precipitates and shearable GP zones is considered in this study. It has recently been shown that the addition of GP zones to such microstructures can lead to significant increases in strength without a decrease in uniform elongation. In this study, atom probe tomography (APT) has been used to quantitatively characterise the evolution of the GP zones and the solute distribution in the bimodal microstructure as a function of applied plastic strain. Recent nuclear magnetic resonance (NMR) analysis has clearly shown strain-induced dissolution of the GP zones, which is supported by the current APT data with additional spatial information. There is significant repartitioning of Cu from the GP zones into the solid solution during deformation. A new approach to cluster finding in APT data has been used to quantitatively characterise the evolution of the sizes and shapes of the Cu-containing features in the solid solution as a function of applied strain. Research highlights: (1) A new approach to cluster finding in atom probe tomography (APT) data has been used to quantitatively characterise the evolution of the sizes and shapes of Cu-containing features with multiple length scales. (2) A model Al-3Cu-(0.05 Sn) (wt%) alloy containing a bimodal distribution of relatively shear-resistant θ' precipitates and shearable GP zones is considered. (3) APT has been used to quantitatively characterise the evolution of the GP zones and the solute distribution in the bimodal microstructure as a function of applied plastic strain. (4) It is clearly shown that there is strain-induced dissolution of the GP zones, with significant repartitioning of Cu from the GP zones into the solid solution during deformation.

  4. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  5. [Optimization of experimental parameters for quantitative NMR (qNMR) and its application in quantitative analysis of traditional Chinese medicines].

    Science.gov (United States)

    Ma, Xiao-Li; Zou, Ping-Ping; Lei, Wei; Tu, Peng-Fei; Jiang, Yong

    2014-09-01

    Quantitative NMR (qNMR) is a technology based on the principles of NMR. It does not require reference standards of the components being determined, which offers a solution to the problem of reference scarcity in the quantitative analysis of traditional Chinese medicines. Moreover, the technique has the advantages of easy operation, non-destructiveness of the sample, and high accuracy and repeatability in comparison with HPLC, LC-MS and GC-MS. NMR technology has achieved a quantum leap in sensitivity and accuracy with the development of NMR hardware. In addition, the choice of appropriate experimental parameters in the pre-treatment and measurement procedures, as well as in post-acquisition processing, is important for obtaining high-quality and reproducible NMR spectra. This review summarizes the principle of qHNMR and the various experimental parameters affecting its accuracy and precision, such as signal-to-noise ratio, relaxation delay, pulse width, acquisition time, window function, phase correction and baseline correction, together with the corresponding optimization methods. Moreover, the application of qHNMR to the quantitation of single or multiple components of traditional Chinese medicines, the purity determination of reference standards, and the quality analysis of foods is discussed. Finally, open questions and the future application prospects of qNMR in the natural products area are presented.

  6. Quantitative video-based gait pattern analysis for hemiparkinsonian rats.

    Science.gov (United States)

    Lee, Hsiao-Yu; Hsieh, Tsung-Hsun; Liang, Jen-I; Yeh, Ming-Long; Chen, Jia-Jin J

    2012-09-01

    Gait disturbances are common in the rat model of Parkinson's disease (PD) induced by administering 6-hydroxydopamine. However, few studies have simultaneously assessed spatiotemporal gait indices and kinematic information in PD rats during overground locomotion. This study utilized a simple, accurate, and reproducible method for quantifying the spatiotemporal and kinematic changes of gait patterns in hemiparkinsonian rats. A transparent walkway with a tilted mirror was set up to capture underview footprints and lateral ankle-joint images using a high-speed, high-resolution digital camera. The footprint images were semi-automatically processed with a threshold setting to identify the boundaries of the soles and the critical points of each hindlimb, from which the spatiotemporal and kinematic gait indices were derived. Following the PD lesion, asymmetrical gait patterns were found, including a significant decrease in step/stride length and increases in the base of support and ankle joint angle. Increased footprint length, toe spread, and intermediary toe spread were found, indicating a compensatory gait pattern for impaired locomotor function. The temporal indices showed a significant decrease in walking speed with increased durations of the stance/swing phase and double support time, which was more evident in the affected hindlimb. Furthermore, the ankle kinematic data showed that the joint angle decreased at the toe-contact stage. We conclude that the proposed gait analysis method can precisely detect locomotor function changes in PD rats, which is useful for objective assessment of novel treatments in the PD animal model.

  7. Public Library System in Ankara: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bülent Yılmaz

    2014-12-01

    This study quantitatively investigates 42 public libraries in 25 central districts within the boundaries of Ankara Metropolitan Municipality with respect to five factors drawn from national and international standards. The findings show that the public libraries in Ankara are insufficient in terms of the number of buildings, users, staff and collection, and also in terms of the standards. It is therefore suggested that urgent planning is necessary for the public libraries in Ankara.

  8. Geometrical conditions at the quantitative neutronographic texture analysis

    International Nuclear Information System (INIS)

    Tobisch, J.; Kleinstueck, K.

    1975-10-01

    The beam geometry for measuring quantitative pole figures with a neutronographic texture diffractometer is explained for transmission and reflection arrangements of spherical samples and sheets. For given dimensions of the counter aperture, the maximum possible cross sections of the incident beam are calculated as a function of sample dimensions and the Bragg angle theta. Methods for the calculation of absorption factors and volume corrections are given. Under special conditions, advantages result in the transmission case for sample motion in the +α direction. (author)

  9. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    The classical fluorescent microscopy method has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions after adding nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting cell division of gram-negative bacteria was modified so that this bacterial group could be quantified by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in soil polymer destruction is traditionally considered dominant in comparison with that of gram-negative bacteria. However, quantification of gram-negative bacteria in chernozem and peatland indicates that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative population density in chernozem, but this nutrient did produce fast growth dynamics over the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ hybridization.

  10. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  11. Statistical Frailty Modeling for Quantitative Analysis of Exocytotic Events Recorded by Live Cell Imaging: Rapid Release of Insulin-Containing Granules Is Impaired in Human Diabetic β-cells.

    Science.gov (United States)

    Cortese, Giuliana; Gandasi, Nikhil R; Barg, Sebastian; Pedersen, Morten Gram

    2016-01-01

    Hormones and neurotransmitters are released when secretory granules or synaptic vesicles fuse with the cell membrane, a process denoted exocytosis. Modern imaging techniques, in particular total internal reflection fluorescence (TIRF) microscopy, allow the investigator to monitor secretory granules at the plasma membrane before and when they undergo exocytosis. However, rigorous statistical approaches for the temporal analysis of such exocytosis data are still lacking. We propose here that statistical methods from time-to-event (also known as survival) analysis are well suited to the problem. These methods are typically used in clinical settings when individuals are followed over time to the occurrence of an event such as death, remission or conception. We model the rate of exocytosis in response to pulses of stimuli in insulin-secreting pancreatic β-cells from healthy and diabetic human donors using piecewise-constant hazard modeling. To study heterogeneity in the granule population we exploit frailty modeling, which describes unobserved differences in the propensity to exocytosis. In particular, we insert a discrete frailty in our statistical model to account for the higher rate of exocytosis in an immediately releasable pool (IRP) of insulin-containing granules. Estimates of parameters are obtained by maximum-likelihood methods. Since granules within the same cell are correlated, i.e., the data are clustered, a modified likelihood function is used for log-likelihood ratio tests in order to perform valid inference. Our approach allows us, for example, to estimate the size of the IRP in the cells, and we find that the IRP is deficient in diabetic cells. This novel application of time-to-event analysis and frailty modeling should also be useful for the study of other well-defined temporal events at the cellular level.
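
    A toy version of the piecewise-constant hazard fit (omitting the frailty term and the cluster-corrected likelihood) can be written directly from the log-likelihood; the fusion times, censoring flags and interval edges below are invented:

```python
import numpy as np
from scipy.optimize import minimize

edges = np.array([0.0, 2.0, 4.0, 8.0])   # hypothetical interval boundaries (s)

def neg_log_lik(log_rates, t, d):
    """Piecewise-constant hazard: rate[k] holds on [edges[k], edges[k+1])."""
    rates = np.exp(log_rates)                          # keep rates positive
    # time each granule spends exposed in each interval
    exposure = np.clip(t[:, None] - edges[:-1], 0.0, np.diff(edges))
    H = exposure @ rates                               # cumulative hazard H(t_i)
    k = np.clip(np.searchsorted(edges, t, side="right") - 1, 0, rates.size - 1)
    return -(d * np.log(rates[k]) - H).sum()           # events add log-hazard terms

t = np.array([0.5, 1.2, 3.1, 5.0, 7.5, 8.0])  # time to fusion per granule (s)
d = np.array([1, 1, 1, 0, 1, 0])              # 1 = exocytosis observed, 0 = censored
fit = minimize(neg_log_lik, x0=np.zeros(3), args=(t, d))
print("estimated hazard per interval:", np.exp(fit.x))
```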

  12. A quantitative analysis of the reactions involved in stratospheric ozone depletion in the polar vortex core

    Science.gov (United States)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-09-01

    We present a quantitative analysis of the chemical reactions involved in polar ozone depletion in the stratosphere and of the relevant reaction pathways and cycles. While the reactions involved in polar ozone depletion are well known, quantitative estimates of the importance of individual reactions or reaction cycles are rare. In particular, there is no comprehensive and quantitative study of the reaction rates and cycles averaged over the polar vortex under conditions of heterogeneous chemistry so far. We show time series of reaction rates averaged over the core of the polar vortex in winter and spring for all relevant reactions and indicate which reaction pathways and cycles are responsible for the vortex-averaged net change of the key species involved in ozone depletion, i.e., ozone, chlorine species (ClOx, HCl, ClONO2), bromine species, nitrogen species (HNO3, NOx) and hydrogen species (HOx). For clarity, we focus on one Arctic winter (2004-2005) and one Antarctic winter (2006) in a layer in the lower stratosphere around 54 hPa and show results for additional pressure levels and winters in the Supplement. Mixing ratios and reaction rates are obtained from runs of the ATLAS Lagrangian chemistry and transport model (CTM) driven by the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis data. An emphasis is put on the partitioning of the relevant chemical families (nitrogen, hydrogen, chlorine, bromine and odd oxygen) and activation and deactivation of chlorine.

  13. A quantitative analysis of the reactions involved in stratospheric ozone depletion in the polar vortex core

    Directory of Open Access Journals (Sweden)

    I. Wohltmann

    2017-09-01

    We present a quantitative analysis of the chemical reactions involved in polar ozone depletion in the stratosphere and of the relevant reaction pathways and cycles. While the reactions involved in polar ozone depletion are well known, quantitative estimates of the importance of individual reactions or reaction cycles are rare. In particular, there is no comprehensive and quantitative study of the reaction rates and cycles averaged over the polar vortex under conditions of heterogeneous chemistry so far. We show time series of reaction rates averaged over the core of the polar vortex in winter and spring for all relevant reactions and indicate which reaction pathways and cycles are responsible for the vortex-averaged net change of the key species involved in ozone depletion, i.e., ozone, chlorine species (ClOx, HCl, ClONO2), bromine species, nitrogen species (HNO3, NOx) and hydrogen species (HOx). For clarity, we focus on one Arctic winter (2004–2005) and one Antarctic winter (2006) in a layer in the lower stratosphere around 54 hPa and show results for additional pressure levels and winters in the Supplement. Mixing ratios and reaction rates are obtained from runs of the ATLAS Lagrangian chemistry and transport model (CTM) driven by the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis data. An emphasis is put on the partitioning of the relevant chemical families (nitrogen, hydrogen, chlorine, bromine and odd oxygen) and activation and deactivation of chlorine.

  14. Quantitation of DNA methylation by melt curve analysis

    Directory of Open Access Journals (Sweden)

    Jones Michael E

    2009-04-01

    Background: Methylation of DNA is a common mechanism for silencing genes, and aberrant methylation is increasingly being implicated in many diseases such as cancer. There is a need for robust, inexpensive methods to quantitate methylation across a region containing a number of CpGs. We describe and validate a rapid, in-tube method to quantitate DNA methylation using the melt data obtained following amplification of bisulfite-modified DNA in a real-time thermocycler. Methods: We first describe a mathematical method to normalise the raw fluorescence data generated by heating the amplified bisulfite-modified DNA. From this normalised data the temperatures at which melting begins and finishes can be calculated, which reflect the less and more methylated template molecules present, respectively. The T50, the temperature at which half the amplicons are melted and which represents the summative methylation of all the CpGs in the template mixture, can also be calculated. These parameters describe the methylation characteristics of the region amplified in the original sample. Results: For validation we used synthesized oligonucleotides and DNA from fresh cells and formalin-fixed paraffin-embedded tissue, each with known methylation. Using our quantitation we could distinguish between unmethylated, partially methylated and fully methylated oligonucleotides mixed in varying ratios. There was a linear relationship between T50 and the dilution of methylated into unmethylated DNA. We could quantitate the change in methylation over time in cell lines treated with the demethylating drug 5-aza-2'-deoxycytidine, and the differences in methylation associated with complete, clonal or no loss of MGMT expression in formalin-fixed paraffin-embedded tissues. Conclusion: We have validated a rapid, simple in-tube method to quantify methylation which is robust and reproducible, utilizes easily designed primers and does not need proprietary algorithms or software.
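
    The normalisation and T50 readout described in the Methods can be sketched as follows, on a synthetic melt curve (real input would be the raw fluorescence-versus-temperature export of the thermocycler):

```python
import numpy as np

temp = np.linspace(70, 95, 126)                                  # deg C
raw = 1000 - 8 * temp + 600 / (1 + np.exp((temp - 82) / 0.8))    # synthetic curve

# fit linear baselines to the pre-melt (first 10) and post-melt (last 10) points
pre = np.polyval(np.polyfit(temp[:10], raw[:10], 1), temp)
post = np.polyval(np.polyfit(temp[-10:], raw[-10:], 1), temp)
melted = (pre - raw) / (pre - post)     # 0 = all intact, 1 = all melted

t50 = np.interp(0.5, melted, temp)      # temperature at which half have melted
print(f"T50 = {t50:.2f} deg C")
```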

  15. Scalp Surgery: Quantitative Analysis of Follicular Unit Growth

    Science.gov (United States)

    Caruana, Giorgia

    2015-01-01

    Background: Over the years, different kinds of hair transplantation have been compared in an attempt to overcome male pattern alopecia and, at the same time, maximize both the survival and growth rate of grafted hair. In this study, we have assessed the survival and growth rate of follicular units (FU) in an in vitro model, as compared with that of conventional hair micrografts, to experimentally evaluate and elaborate on the differences between these 2 approaches in hair transplantation procedures. Methods: Group A (control; n = 100 follicles) was composed of hair micrografts, whereas FUs were assigned to Group B (experimental; n = 100 follicles, n = 35 FUs). Each group was cultured for a period of 10 days; the total stretch of follicles was measured soon after the harvest and 10 days later. The Kruskal-Wallis one-way analysis of variance on ranks test was used to perform statistical analysis. Results: The growth rate of follicles from Group A (mean 10-day shaft growth rate = 0.30 mm) proved to be statistically different compared with that of Group B (mean 10-day shaft growth rate = 0.23 mm). Conversely, our data did not show any significant difference between the survival rate of hair grafts from these 2 groups. Conclusions: Our data highlighted a reduced FU shaft growth compared with that of hair micrografts, corroborating, to a certain extent, the hypothesis that a significant amount of adipose tissue surrounding the follicle included in the graft may result in an inadequate nourishment supply to follicular cells. PMID:26579345

  16. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit stakeholder preferences, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied to medicines, with fewer BRAs for devices. Despite substantial differences in the characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  17. Quantitative Proteomics for the Comprehensive Analysis of Stress Responses of Lactobacillus paracasei subsp. paracasei F19.

    Science.gov (United States)

    Schott, Ann-Sophie; Behr, Jürgen; Geißler, Andreas J; Kuster, Bernhard; Hahne, Hannes; Vogel, Rudi F

    2017-10-06

    Lactic acid bacteria are broadly employed as starter cultures in the manufacture of foods. Upon technological preparation, they are confronted with drying stress that amalgamates numerous stress conditions resulting in losses of fitness and survival. To better understand and differentiate physiological stress responses, discover general and specific markers for the investigated stress conditions, and predict optimal preconditioning for starter cultures, we performed a comprehensive genomic and quantitative proteomic analysis of a commonly used model system, Lactobacillus paracasei subsp. paracasei TMW 1.1434 (isogenic with F19) under 11 typical stress conditions, including among others oxidative, osmotic, pH, and pressure stress. We identified and quantified >1900 proteins in triplicate analyses, representing 65% of all genes encoded in the genome. The identified genes were thoroughly annotated in terms of subcellular localization prediction and biological functions, suggesting unbiased and comprehensive proteome coverage. In total, 427 proteins were significantly differentially expressed in at least one condition. Most notably, our analysis suggests that optimal preconditioning toward drying was predicted to be alkaline and high-pressure stress preconditioning. Taken together, we believe the presented strategy may serve as a prototypic example for the analysis and utility of employing quantitative-mass-spectrometry-based proteomics to study bacterial physiology.

  18. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico Silvestri

    2015-05-01

    Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial for classifying cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image, with micron-scale resolution, a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the positions of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust basis for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.

  19. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    Science.gov (United States)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  20. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of alternative nuclear fuel cycle systems. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, a proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and can consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses in the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.
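
    The record gives no explicit formula; one plausible reading of the circuit analogy treats motivation as an electromotive force V and each technical or institutional barrier as a series resistance, so the "proliferation current" is I = V / sum(R_i). All names and numbers below are invented for illustration:

```python
# Hypothetical reading of the electrical-circuit analogy described above.
barriers = {                   # invented series resistances of each barrier
    "material quality": 4.0,
    "facility safeguards": 3.5,
    "technical difficulty": 5.0,
    "political cost": 6.0,
}
motivation = 2.0               # invented "nuclear propensity" of the state

# higher barriers or lower motivation -> smaller proliferation "current"
proliferation_index = motivation / sum(barriers.values())
print(f"proliferation risk index: {proliferation_index:.3f}")
```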

  1. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of alternative nuclear fuel cycle systems. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, a proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and can consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses in the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.

  2. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions, because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and those models can be categorized as qualitative or quantitative. As the effects of input factors on situation awareness can be investigated through quantitative models, the quantitative models are more useful than the qualitative ones for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators

  3. Nonlinear quantitative radiation sensitivity prediction model based on NCI-60 cancer cell lines.

    Science.gov (United States)

    Zhang, Chunying; Girard, Luc; Das, Amit; Chen, Sun; Zheng, Guangqiang; Song, Kai

    2014-01-01

    We proposed a nonlinear model to perform a novel quantitative radiation sensitivity prediction. We used the NCI-60 panel, which consists of nine different cancer types, as the platform to train our model. Important radiation therapy (RT) related genes were selected by significance analysis of microarrays (SAM). Orthogonal latent variables (LVs) were then extracted by the partial least squares (PLS) method as the new compressive input variables. Finally, support vector machine (SVM) regression model was trained with these LVs to predict the SF2 (the surviving fraction of cells after a radiation dose of 2 Gy γ-ray) values of the cell lines. Comparison with the published results showed significant improvement of the new method in various ways: (a) reducing the root mean square error (RMSE) of the radiation sensitivity prediction model from 0.20 to 0.011; and (b) improving prediction accuracy from 62% to 91%. To test the predictive performance of the gene signature, three different types of cancer patient datasets were used. Survival analysis across these different types of cancer patients strongly confirmed the clinical potential utility of the signature genes as a general prognosis platform. The gene regulatory network analysis identified six hub genes that are involved in canonical cancer pathways.
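
    The modelling chain (latent-variable compression followed by SVM regression) maps naturally onto a pipeline; here is a sketch with placeholder arrays, assuming the SAM gene-selection step has already been applied:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.random((60, 100))   # placeholder: 60 NCI-60 lines x 100 SAM-selected genes
y = rng.random(60)          # placeholder: measured SF2 per cell line

# PLS extracts a few orthogonal latent variables; SVR predicts SF2 from them
model = make_pipeline(PLSRegression(n_components=5), SVR(kernel="rbf"))
model.fit(X, y)
print("predicted SF2 for first 3 lines:", model.predict(X[:3]))
```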

  4. Nonlinear Quantitative Radiation Sensitivity Prediction Model Based on NCI-60 Cancer Cell Lines

    Directory of Open Access Journals (Sweden)

    Chunying Zhang

    2014-01-01

    We proposed a nonlinear model to perform a novel quantitative radiation sensitivity prediction. We used the NCI-60 panel, which consists of nine different cancer types, as the platform to train our model. Important radiation therapy (RT) related genes were selected by significance analysis of microarrays (SAM). Orthogonal latent variables (LVs) were then extracted by the partial least squares (PLS) method as the new compressive input variables. Finally, a support vector machine (SVM) regression model was trained with these LVs to predict the SF2 (the surviving fraction of cells after a radiation dose of 2 Gy γ-ray) values of the cell lines. Comparison with the published results showed significant improvement of the new method in various ways: (a) reducing the root mean square error (RMSE) of the radiation sensitivity prediction model from 0.20 to 0.011; and (b) improving prediction accuracy from 62% to 91%. To test the predictive performance of the gene signature, three different types of cancer patient datasets were used. Survival analysis across these different types of cancer patients strongly confirmed the clinical potential utility of the signature genes as a general prognosis platform. The gene regulatory network analysis identified six hub genes that are involved in canonical cancer pathways.

  5. Quantitative structure-activity relationship models of chemical transformations from matched pairs analyses.

    Science.gov (United States)

    Beck, Jeremy M; Springer, Clayton

    2014-04-28

    The concepts of activity cliffs and matched molecular pairs (MMP) are recent paradigms for the analysis of data sets to identify structural changes that may be used to modify the potency of lead molecules in drug discovery projects. Analysis of MMPs was recently demonstrated as a feasible technique for quantitative structure-activity relationship (QSAR) modeling of prospective compounds. Within a small data set, however, the lack of matched pairs and the lack of knowledge about specific chemical transformations limit prospective applications. Here we present an alternative technique that determines pairwise descriptors for each matched pair and then uses a QSAR model to estimate the activity change associated with a chemical transformation. The descriptors effectively group similar transformations and incorporate information about the transformation and its local environment. Use of a transformation QSAR model allows one to estimate the activity change for novel transformations and therefore returns predictions for a larger fraction of test set compounds. Application of the proposed methodology to four public data sets results in increased model performance over a benchmark random forest and direct application of chemical transformations using QSAR-by-matched molecular pairs analysis (QSAR-by-MMPA).

  6. Quantitative analysis of error mode, error effect and criticality

    International Nuclear Information System (INIS)

    Li Pengcheng; Zhang Li; Xiao Dongsheng; Chen Guohua

    2009-01-01

    A quantitative method for human error mode, effect and criticality analysis is developed in order to reach the ultimate goal of Probabilistic Safety Assessment. A criticality identification matrix of human error modes and tasks is built to identify the critical human error modes and tasks, and the critical organizational root causes, on the basis of the identified human error probability, error effect probability and criticality index of the error effect. This makes it possible to take targeted measures to reduce and prevent the occurrence of critical human error modes and tasks. Finally, the application of the technique is explained through an application example. (authors)
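
    The record gives no explicit formula; one common way to realise such a criticality index is the product of the error probability, the conditional error-effect probability and a severity weight, ranked to fill the criticality matrix. All task names and numbers below are invented:

```python
tasks = [
    # (task, error mode, P(error), P(effect | error), severity weight)
    ("isolate valve",  "omission",   2e-3, 0.30, 0.9),
    ("read indicator", "misreading", 5e-3, 0.10, 0.5),
    ("enter setpoint", "commission", 1e-3, 0.60, 1.0),
]

# criticality index = P(error) * P(effect | error) * severity
scored = [(p_err * p_eff * sev, task, mode)
          for task, mode, p_err, p_eff, sev in tasks]
for crit, task, mode in sorted(scored, reverse=True):
    print(f"{task:15s} {mode:11s} criticality index = {crit:.2e}")
```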

  7. Quantitative Analysis and Design of a Rudder Roll Damping Controller

    DEFF Research Database (Denmark)

    Hearns, G.; Blanke, M.

    1998-01-01

    A rudder roll damping controller is designed using Quantitative Feedback Theory to be robust to changes in the ship's metacentric height. The analytical constraint due to the non-minimum phase behaviour of the rudder-to-roll response is analysed using the Poisson Integral Formula, and it is shown how the design tradeoffs in closed-loop roll reduction can be approximated using this formula before a controller is designed. The robust roll and course keeping controllers designed are then tested using a nonlinear simulation.

  8. Quantitative analysis of heavy water by NMR spectroscopy

    International Nuclear Information System (INIS)

    Gomez Gil, V.

    1975-01-01

    Nuclear magnetic resonance has been applied to a wide variety of quantitative problems; a typical example is the determination of isotopic composition. In this paper, two analytical methods for the determination of water in deuterium oxide are described. The first employs acetonitrile as an internal standard compound; in the second, a calibration curve of signal integral versus amount of D2O is constructed. Both methods give results comparable to those of mass spectrometry or IR spectroscopy. (Author) 5 refs
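
    For the internal-standard variant, the amount of analyte follows from the ratio of proton-normalised signal integrals; a worked sketch with invented numbers:

```python
# Internal-standard qNMR: moles(analyte)/moles(standard) equals the ratio of
# integrals normalised by protons per molecule. All values are invented.
I_water, n_water = 4.20, 2      # integral and protons of the H2O signal
I_std,   n_std   = 6.30, 3      # integral and protons of the CH3CN standard
m_std, M_std, M_water = 10.0, 41.05, 18.02  # mg of standard, molar masses

m_water = (I_water / n_water) / (I_std / n_std) * (M_water / M_std) * m_std
print(f"water content: {m_water:.2f} mg")
```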

  9. Comparative Proteomic Analysis of Bleomycin-induced Pulmonary Fibrosis Based on Isobaric Tag for Quantitation.

    Science.gov (United States)

    Yang, Tiejun; Jia, Yanlong; Ma, Yongkang; Cao, Liang; Chen, Xiaobing; Qiao, Baoping

    2017-01-01

    Pulmonary fibrosis (PF) is a destructive pulmonary disease and the molecular mechanisms underlying PF are unclear. This study investigated differentially expressed proteins associated with the occurrence and development of PF in rat lung tissue with bleomycin-induced PF. Sixteen Sprague-Dawley rats were randomly divided into 2 groups: the PF model group (n = 8) and the control group (n = 8). After successfully establishing the rat PF model induced by bleomycin, the differentially expressed proteins in the 2 groups were identified through isobaric tag for relative and absolute quantitation coupled with liquid chromatography-mass spectrometry and bioinformatics analysis. A total of 146 differentially expressed proteins were identified; 88 of which displayed increased abundance and 58 were downregulated in the PF rat model group. Most functional proteins were associated with extracellular matrix, inflammation, damage response, vitamin A synthesis and metabolism. Critical proteins related to PF development and progression was identified, such as type V collagen-3, arachidonic acid 12-lipoxygenase, arachidonic acid 15-lipoxygenase and cytochrome P4501A1. Kyoto Encyclopedia of Genes and Genomes pathway analysis showed that these differentially expressed proteins were enriched in extracellular matrix receptor interaction pathway, renin-angiotensin system and metabolic pathway of retinol. The proteins expressed in bleomycin-induced PF rat model provide important data for further functional analysis of proteins involved in PF. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Quantitative analysis of Howell-Jolly bodies in children with sickle cell disease.

    Science.gov (United States)

    Harrod, Virginia L; Howard, Thad A; Zimmerman, Sherri A; Dertinger, Stephen D; Ware, Russell E

    2007-02-01

    Although functional asplenia in sickle cell disease (SCD) begins early in life and has important clinical consequences, quantitative measurement of splenic function is not readily available. A novel high-throughput flow cytometric method for quantitating Howell-Jolly bodies (HJB) has been developed which isolates HJB-containing CD71(+) and CD71(-) erythrocytes. Analysis of these cell populations allows quantitative measurement of splenic filtrative function and possible chromosomal damage. Blood specimens from 147 children with SCD were analyzed using a high-throughput flow cytometric method. Enumeration of the following populations was accomplished: 1) CD71(+) reticulocytes among total erythrocytes, identifying the youngest erythroid cell population; 2) HJB-containing CD71(+) reticulocytes, which isolate young erythrocytes containing micronuclei as an index of cytogenetic damage; and 3) HJB-containing CD71(-) erythrocytes, identifying older erythrocytes containing micronuclei, indirectly measuring splenic function. Children with HbSC (n = 24) had slightly elevated HJB frequencies, while children with HbSS (n = 125) had highly elevated frequencies within CD71(+) cells (0.44% +/- 0.40%, normal 0.12% +/- 0.06%, p < 0.001) and CD71(-) cells (2493 +/- 2303 per million RBC, normal 20 +/- 11, p < 0.001). Using a multiple regression model, the frequency of HbSS CD71(+) reticulocytes containing HJB was significantly influenced by hydroxyurea use (p < 0.0001), age (p = 0.0288), and splenectomy (p = 0.0498). Similarly, mature CD71(-) erythrocytes containing HJB were positively correlated with hydroxyurea (p = 0.0001), age (p < 0.0001), and splenectomy (p = 0.0104). HJB quantitation by flow cytometry is a novel assay for measuring splenic function and may be valuable for investigating the efficacy and safety of therapeutic options for children with SCD.

  11. Diagnostic value of semi-quantitative and quantitative analysis of functional parameters in multiparametric MRI of the prostate.

    Science.gov (United States)

    Hauth, Elke; Halbritter, Daniela; Jaeger, Horst; Hohmuth, Horst; Beer, Meinrad

    2017-10-01

    To determine the diagnostic value of semi-quantitative and quantitative parameters of three functional techniques in multiparametric (mp) MRI of the prostate. Mp-MRI was performed in 110 patients with suspicion of prostate cancer (PCA) before transrectal ultrasound (TRUS)-guided core biopsy. Peak enhancement, initial and post-initial enhancement, initial area under the gadolinium curve (iAUGC), Ktrans (forward rate constant), Kep (efflux rate constant), Ve (extracellular volume), ADC (apparent diffusion coefficient) and the MR spectroscopy ratio were obtained for malignant and benign lesions. For iAUGC, Ktrans, Kep and Ve we evaluated the median, the mean and the difference (Diff) between mean and median. For ADC we evaluated the mean, the median, the Diff between mean and median, and the minimum (min). In addition, we evaluated these parameters as a function of Gleason score in PCA. Receiver operating characteristic analysis was performed and the area under the curve (AUC) determined. ADC min and Kep Diff were the best predictors of malignancy in all lesions (AUC: 0.765). ADC min was the best predictor of malignancy for lesions in the peripheral zone (PZ) (AUC: 0.7506) and Kep Diff was the best predictor of malignancy for lesions in transit