WorldWideScience

Sample records for model quantitative analysis

  1. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation, allowing verification procedures to quantify judgements on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...
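
    The thesis itself is not reproduced here, but the idea of quantifying how far one model is from another can be illustrated with a toy fixed-point computation. Below is a minimal sketch of a discounted, one-directional distance between two small weighted transition systems; the states, weights and discount factor are invented for illustration and are not taken from the thesis.

```python
# Toy "distance between quantitative models": a discounted simulation-style
# distance between two small weighted transition systems, computed by
# iterating its defining functional to a fixed point.
DISCOUNT = 0.9

sys_a = {"s0": [(2.0, "s1")], "s1": [(1.0, "s0")]}   # state -> [(weight, successor)]
sys_b = {"t0": [(2.5, "t1")], "t1": [(1.0, "t0")]}

d = {(s, t): 0.0 for s in sys_a for t in sys_b}
for _ in range(100):
    # d(s, t) = max over a-moves of the best-matching b-move:
    # weight difference now, plus the discounted distance of the successors.
    d = {
        (s, t): max(
            min(abs(wa - wb) + DISCOUNT * d[sa, tb] for wb, tb in sys_b[t])
            for wa, sa in sys_a[s]
        )
        for s in sys_a
        for t in sys_b
    }

print({pair: round(v, 3) for pair, v in d.items()})
```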

  2. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  3. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  4. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis of the retrieved parameters from that approach suggests that further investigations are required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications to the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for values initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by the German Aerospace Center's (DLR's) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of checking quantitatively the accuracy of model-based PolSAR techniques for a...
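
    The abstract names three generic ingredients for the modified inversion: redefined boundary conditions, a transformation of variables, and a different initialization strategy. The sketch below illustrates those ingredients on a made-up forward model; the function, bounds and data are placeholders, not the Freeman–Durden/Chen et al. model.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical forward model: predicts observed quantities from
# physical parameters that must stay inside known bounds.
def forward_model(params):
    beta, fv, fd = params  # e.g. a Bragg-like ratio and two power terms
    return np.array([fv + fd * beta, fv * beta, fd + fv])

# Transformation of variables: optimize an unconstrained u and map it to the
# bounded parameter p via a sigmoid, so the solver can never leave the
# physically valid region (the "redefined boundary conditions").
def to_bounded(u, lo, hi):
    return lo + (hi - lo) / (1.0 + np.exp(-u))

lo = np.array([0.0, 0.0, 0.0])
hi = np.array([1.0, 2.0, 2.0])
observed = np.array([0.9, 0.2, 1.1])  # synthetic measurements

def residuals(u):
    return forward_model(to_bounded(u, lo, hi)) - observed

# Several random initializations, keeping the best fit: a simple stand-in
# for the "different strategy for values initialization" in the paper.
best = min(
    (least_squares(residuals, np.random.randn(3)) for _ in range(20)),
    key=lambda r: r.cost,
)
print("estimated parameters:", to_bounded(best.x, lo, hi))
```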

  5. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  6. Quantitative modeling and data analysis of SELEX experiments

    Science.gov (United States)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure.
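
    The modified protocol keeps the chemical potential fixed across rounds. A minimal simulation of that idea follows, assuming a Fermi-Dirac binding probability and synthetic binding energies; both are standard choices in biophysical models of protein-DNA binding, though the paper's exact parameterization may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binding energies (in kT units) for a random pool of oligomers;
# lower energy means higher affinity for the transcription factor.
energies = rng.normal(loc=0.0, scale=2.0, size=100_000)
weights = np.ones_like(energies)  # relative abundance in the pool

mu = -2.0  # chemical potential, held fixed across rounds per the modified protocol

def select(weights, energies, mu):
    """One SELEX round: each sequence is retained with a Fermi-Dirac
    probability of being protein-bound at chemical potential mu."""
    p_bound = 1.0 / (1.0 + np.exp(energies - mu))
    new = weights * p_bound
    return new / new.sum()  # renormalize: PCR amplifies back to pool size

for round_no in range(1, 6):
    weights = select(weights, energies, mu)
    mean_E = np.sum(weights * energies)
    print(f"round {round_no}: mean binding energy = {mean_E:.2f} kT")
```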

  7. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Background: Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results: Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion: The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  8. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor statistical power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10⁻¹⁰) in the ESP, and 11 were replicated in the CHARGE-S study.
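
    As a rough illustration of the region-level paradigm, the sketch below collectively tests all pairwise interactions between two regions after a PCA-based dimension reduction. The paper uses functional data analysis rather than plain PCA, so this is a simplified stand-in on synthetic genotypes.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy import stats

def region_interaction_test(G1, G2, y, k=3):
    """Test interaction between two genomic regions on a quantitative trait
    by reducing each region's SNP matrix to k principal components and
    F-testing the block of pairwise PC products."""
    P1 = PCA(n_components=k).fit_transform(G1)
    P2 = PCA(n_components=k).fit_transform(G2)
    inter = np.column_stack([P1[:, i] * P2[:, j] for i in range(k) for j in range(k)])

    def rss(X):
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return r @ r, X.shape[1]

    rss0, p0 = rss(np.column_stack([P1, P2]))         # main effects only
    rss1, p1 = rss(np.column_stack([P1, P2, inter]))  # plus all interactions
    df1, df2 = p1 - p0, len(y) - p1
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    return stats.f.sf(F, df1, df2)

# Toy data: 500 subjects, two regions with 20 and 15 variants.
rng = np.random.default_rng(1)
G1 = rng.binomial(2, 0.1, size=(500, 20)).astype(float)
G2 = rng.binomial(2, 0.1, size=(500, 15)).astype(float)
y = G1[:, 0] * G2[:, 0] + rng.normal(size=500)  # one true pairwise interaction
print("interaction p-value:", region_interaction_test(G1, G2, y))
```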

  9. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides an intuitive view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system can be derived. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the occurrence probability obtained for each feasible situation helps the players to quantitatively judge the probability of the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of the system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative survivability analysis, and it has good prospects for practical application.
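
    The Markov analysis step boils down to computing long-run occurrence probabilities from the overall state transition matrix. A minimal sketch with an invented three-situation transition matrix:

```python
import numpy as np

# Hypothetical overall state transition matrix of the information system;
# rows are the current conflict situation, columns the next situation.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
])

# The stationary distribution pi solves pi P = pi with sum(pi) = 1, i.e. the
# long-run occurrence probability of each feasible situation. It is the left
# eigenvector of P for eigenvalue 1, found via the transpose.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()
print("occurrence probability of each situation:", pi)
```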

  10. Software applications toward quantitative metabolic flux analysis and modeling.

    Science.gov (United States)

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

    Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes, pathways and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on a specific software package. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
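
    The FBA computation mentioned at the start of the abstract is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A self-contained toy instance (the network and bounds are made up):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions) for a 3-metabolite,
# 4-reaction chain; FBA solves max c.v subject to S v = 0 and flux bounds.
S = np.array([
    [ 1, -1,  0,  0],   # A: produced by uptake v1, consumed by v2
    [ 0,  1, -1,  0],   # B: produced by v2, consumed by v3
    [ 0,  0,  1, -1],   # C: produced by v3, drained by biomass reaction v4
])
c = np.array([0, 0, 0, 1])            # objective: maximize biomass flux v4
bounds = [(0, 10), (0, 5), (0, 5), (0, None)]

# linprog minimizes, so negate the objective to maximize biomass.
res = linprog(-c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
print("optimal flux distribution:", res.x, "biomass flux:", res.x[3])
```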

  11. Spine curve modeling for quantitative analysis of spinal curvature.

    Science.gov (United States)

    Hay, Ori; Hershkovitz, Israel; Rivlin, Ehud

    2009-01-01

    Spine curvature and posture are important for sustaining a healthy back. Incorrect spine configuration can add strain to muscles and put stress on the spine, leading to low back pain (LBP). We propose a new method for analyzing spine curvature in 3D, using CT imaging. The proposed method is based on two novel concepts: the spine curvature is derived from the spinal canal centerline, and evaluation of the curve is carried out against a model based on healthy individuals. We show results of curvature analysis of a healthy population, pathological (scoliosis) patients, and patients having nonspecific chronic LBP.
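
    Once the spinal canal centerline has been extracted, deriving curvature reduces to differential geometry of a sampled 3D curve. A sketch using the standard formula κ = |r′ × r″| / |r′|³ on a synthetic centerline; the comparison against a healthy-population model described in the paper is omitted here.

```python
import numpy as np

def curvature_3d(points):
    """Pointwise curvature of a 3D curve sampled as an (N, 3) array,
    e.g. a spinal canal centerline: kappa = |r' x r''| / |r'|^3."""
    d1 = np.gradient(points, axis=0)
    d2 = np.gradient(d1, axis=0)
    cross = np.cross(d1, d2)
    return np.linalg.norm(cross, axis=1) / np.linalg.norm(d1, axis=1) ** 3

# Toy centerline: a gentle helix standing in for a curved spine segment.
t = np.linspace(0, np.pi, 200)
centerline = np.column_stack([0.1 * np.cos(t), 0.1 * np.sin(t), t])
kappa = curvature_3d(centerline)
print("max curvature along centerline:", kappa.max())
```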

  12. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  13. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...

  14. Development of probabilistic models for quantitative pathway analysis of plant pests introduction for the EU territory

    NARCIS (Netherlands)

    Douma, J.C.; Robinet, C.; Hemerik, L.; Mourits, M.C.M.; Roques, A.; Werf, van der W.

    2015-01-01

    The aim of this report is to provide EFSA with probabilistic models for quantitative pathway analysis of plant pest introduction for the EU territory through non-edible plant products or plants. We first provide a conceptualization of two types of pathway models. The individual based PM simulates an

  15. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  17. [Study on temperature correctional models of quantitative analysis with near infrared spectroscopy].

    Science.gov (United States)

    Zhang, Jun; Chen, Hua-cai; Chen, Xing-dan

    2005-06-01

    The effect of environment temperature on near-infrared spectroscopic quantitative analysis was studied. The temperature correction model was calibrated with 45 wheat samples at different environment temperatures and with the temperature as an external variable. The constant temperature model was calibrated with 45 wheat samples at the same temperature. The predicted results of the two models for the protein contents of wheat samples at different temperatures were compared. The results showed that the mean standard error of prediction (SEP) of the temperature correction model was 0.333, but the SEP of the constant temperature (22 degrees C) model increased as the temperature difference enlarged, up to 0.602 when using this model at 4 degrees C. It was suggested that the temperature correction model improves the analysis precision.
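
    One common way to build such a temperature correction model is to append the measurement temperature to the spectral predictors before calibration. The sketch below does this with PLS regression on synthetic data; the paper's exact chemometric model is not specified here, so treat this purely as an illustration of "temperature as an external variable".

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical data: NIR spectra of wheat, sample temperatures, and
# reference protein contents (all values are synthetic placeholders).
rng = np.random.default_rng(7)
spectra = rng.normal(size=(45, 700))        # 45 samples x 700 wavelengths
temperature = rng.uniform(4, 30, size=45)   # degrees C at measurement time
protein = rng.uniform(9, 15, size=45)       # reference protein content (%)

# Temperature correction model: append temperature as one extra predictor
# column so the calibration can absorb temperature-induced spectral shifts.
X_corr = np.column_stack([spectra, temperature])
model = PLSRegression(n_components=8).fit(X_corr, protein)

# Predicting a sample measured at another temperature then only requires
# passing its spectrum together with the measured temperature.
x_new = np.append(spectra[0], 4.0).reshape(1, -1)
print("predicted protein content:", model.predict(x_new).ravel())
```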

  18. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting...... particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains....... The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...
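
    At its core, statistical model checking estimates the probability of a property by sampling simulation runs and attaching a confidence interval, which is what MultiVeStA automates for PFLAN models. A minimal sketch of that core loop; the 5% and 20% branch probabilities in the toy model are invented.

```python
import random
from statistics import NormalDist

def smc_estimate(simulate_once, n=10_000, conf=0.95):
    """Estimate the probability that a property holds by simulating the
    probabilistic model n times and counting successes, with a normal
    confidence interval: the core loop of statistical model checking."""
    hits = sum(simulate_once() for _ in range(n))
    p = hits / n
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    half = z * (p * (1 - p) / n) ** 0.5
    return p, (p - half, p + half)

# Toy product-line behaviour: a product "malfunctions" if two independent
# probabilistic feature installations both take their faulty branch.
def simulate_once():
    return random.random() < 0.05 and random.random() < 0.2

p, ci = smc_estimate(simulate_once)
print(f"P(malfunction) ~ {p:.4f}, 95% CI {ci}")
```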

  1. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun;

    2013-01-01

    , comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  2. A Classifier Model based on the Features Quantitative Analysis for Facial Expression Recognition

    Directory of Open Access Journals (Sweden)

    Amir Jamshidnezhad

    2011-01-01

    In recent decades, computer technology has developed considerably in the use of intelligent systems for classification. The development of HCI systems depends highly on accurate understanding of emotions. However, facial expressions are difficult to classify with mathematical models because of their natural variability. In this paper, quantitative analysis is used to find the most effective feature movements between the selected facial feature points. The features are therefore extracted not only on the basis of psychological studies, but also on the basis of quantitative methods, to raise the accuracy of recognition. In this model, fuzzy logic and a genetic algorithm are used to classify facial expressions. The genetic algorithm is a distinctive attribute of the proposed model and is used for tuning the membership functions and increasing the accuracy.

  3. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
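
    A minimal sketch of the approach's two stages follows, assuming Beer-Lambert mixing, synthetic Gaussian band spectra and a small calibration set; in the paper the calibration system plays this role, so that unknowns can then be quantified without reference solutions.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)

# Synthetic "pure component" UV-vis spectra (Gaussian bands) and mixtures
# obeying the Beer-Lambert-Bouguer law: mixture = concentrations @ spectra.
wl = np.linspace(200, 400, 300)
pure = np.vstack([
    np.exp(-((wl - 260) / 15) ** 2),   # e.g. a caffeine-like band
    np.exp(-((wl - 300) / 25) ** 2),   # e.g. a vitamin-like band
])
conc = rng.uniform(0.1, 1.0, size=(20, 2))      # 20 calibration mixtures
mixtures = conc @ pure + rng.normal(0, 1e-3, (20, wl.size))

# ICA resolves the source spectra (up to order and scale) with no references...
ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(mixtures)            # per-sample component scores

# ...and a linear fit of scores to the known calibration concentrations
# turns the qualitative decomposition into a quantitative model.
coef, *_ = np.linalg.lstsq(scores, conc, rcond=None)
unknown = mixtures[:1]                          # pretend this is a new sample
print("estimated concentrations:", ica.transform(unknown) @ coef)
```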

  4. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations....... We present an algorithm for the translation of such models into Markov decision processes (MDP) expressed in the syntax of the PRISM model checker. This enables precise quantitative analysis of business processes for the following properties: transient and steady-state probabilities, the timing......, occurrence and ordering of events, reward-based properties, and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover...

  5. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  6. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

    Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis of combining data of multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies.

  7. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. From an analysis of the controller threat model we derive formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same kind of controller but also different kinds of controllers, and provide a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The test, which shows outcomes similar to those of traditional qualitative analysis, demonstrates that with our approach we are able to obtain specific security values for different controllers and presents more accurate results.

  8. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    Science.gov (United States)

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
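
    The quantitative descriptors named in the abstract (polarization at the cell ends plus higher-order moments of axial intensity profiles) are straightforward to compute once a profile has been extracted. A sketch on a synthetic profile; the 10%-of-length window used for the "cell ends" is an assumed choice, not taken from the paper.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def profile_statistics(intensity):
    """Characterize a gap-junction intensity profile sampled along a
    myocyte's principal (long) axis: end polarization plus higher-order
    statistical moments, following the general idea of the paper."""
    n = intensity.size
    ends = intensity[: n // 10].sum() + intensity[-(n // 10):].sum()
    polarization = ends / intensity.sum()   # fraction of signal at cell ends
    return {
        "polarization": polarization,
        "skewness": skew(intensity),
        "kurtosis": kurtosis(intensity),    # excess kurtosis
    }

# Toy profile: label concentrated near both cell ends, as in normal myocytes.
x = np.linspace(0, 1, 200)
profile = np.exp(-((x - 0.03) / 0.05) ** 2) + np.exp(-((x - 0.97) / 0.05) ** 2)
print(profile_statistics(profile))
```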

  9. Growth mixture modeling as an exploratory analysis tool in longitudinal quantitative trait loci analysis.

    Science.gov (United States)

    Chang, Su-Wei; Choi, Seung Hoan; Li, Ke; Fleur, Rose Saint; Huang, Chengrui; Shen, Tong; Ahn, Kwangmi; Gordon, Derek; Kim, Wonkuk; Wu, Rongling; Mendell, Nancy R; Finch, Stephen J

    2009-12-15

    We examined the properties of growth mixture modeling in finding longitudinal quantitative trait loci in a genome-wide association study. Two software packages are commonly used in these analyses: Mplus and the SAS TRAJ procedure. We analyzed the 200 replicates of the simulated data with these programs using three tests: the likelihood-ratio test statistic, a direct test of genetic model coefficients, and the chi-square test classifying subjects based on the trajectory model's posterior Bayesian probability. The Mplus program was not effective in this application due to its computational demands. The distributions of these tests applied to genes not related to the trait were sensitive to departures from Hardy-Weinberg equilibrium. The likelihood-ratio test statistic was not usable in this application because its distribution was far from the expected asymptotic distributions when applied to markers with no genetic relation to the quantitative trait. The other two tests were satisfactory. Power was still substantial when we used markers near the gene rather than the gene itself. That is, growth mixture modeling may be useful in genome-wide association studies. For markers near the actual gene, there was somewhat greater power for the direct test of the coefficients and lesser power for the posterior Bayesian probability chi-square test.

  10. Model exploration and analysis for quantitative safety refinement in probabilistic B

    CERN Document Server

    Ndukwu, Ukachukwu; 10.4204/EPTCS.55.7

    2011-01-01

    The role played by counterexamples in standard system analysis is well known; but less common is a notion of counterexample in probabilistic systems refinement. In this paper we extend previous work using counterexamples to inductive invariant properties of probabilistic systems, demonstrating how they can be used to extend the technique of bounded model checking-style analysis for the refinement of quantitative safety specifications in the probabilistic B language. In particular, we show how the method can be adapted to cope with refinements incorporating probabilistic loops. Finally, we demonstrate the technique on pB models summarising a one-step refinement of a randomised algorithm for finding the minimum cut of undirected graphs, and that for the dependability analysis of a controller design.

  11. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the parameters of mutation of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the methods, the partial least squares with full spectrum, the partial least squares combined with a genetic algorithm, the uninformative variable elimination method, the backpropagation neural network with full spectrum, the backpropagation neural network combined with a genetic algorithm, and the proposed method are all used for building the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly as a practical spectral analysis tool.
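
    A heavily simplified sketch of the co-evolution idea: a chromosome encodes a binary wavelength mask plus a hidden-layer size, and mutation-only evolution scores each candidate by the cross-validated error of a backpropagation network. The NAEP self-adaptive mutation parameters (the chromosome's third part) are omitted, all data are synthetic, and the small budget keeps the sketch fast rather than faithful.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(120, 60))                             # spectra: 120 x 60 wavelengths
y = X[:, 10] + 0.5 * X[:, 40] + rng.normal(0, 0.1, 120)    # trait depends on 2 bands

def fitness(mask, hidden):
    """Negative CV error of a backpropagation network restricted to the
    selected wavelengths, with the encoded hidden-layer size."""
    if mask.sum() == 0:
        return -np.inf
    net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=500, random_state=0)
    return cross_val_score(net, X[:, mask], y, cv=3,
                           scoring="neg_mean_squared_error").mean()

# Hybrid chromosome: wavelength-selection bits + a topology gene, evolved
# together by mutation-only evolutionary programming (simplified).
pop = [(rng.random(60) < 0.3, int(rng.integers(2, 20))) for _ in range(10)]
for generation in range(10):
    parents = sorted(pop, key=lambda c: fitness(*c), reverse=True)[:5]
    children = []
    for mask, hidden in parents:
        m = mask.copy()
        flip = rng.random(60) < 0.05                       # mutate wavelength bits
        m[flip] = ~m[flip]
        h = int(np.clip(hidden + rng.integers(-2, 3), 2, 30))  # mutate topology
        children.append((m, h))
    pop = parents + children

best_mask, best_hidden = max(pop, key=lambda c: fitness(*c))
print("selected wavelengths:", np.flatnonzero(best_mask), "hidden units:", best_hidden)
```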

  12. Flow assignment model for quantitative analysis of diverting bulk freight from road to railway.

    Science.gov (United States)

    Liu, Chang; Lin, Boliang; Wang, Jiaxi; Xiao, Jie; Liu, Siqi; Wu, Jianping; Li, Jian

    2017-01-01

    Since railway transport possesses the advantage of high volume and low carbon emissions, diverting some freight from road to railway will help reduce the negative environmental impacts associated with transport. This paper develops a flow assignment model for quantitative analysis of diverting truck freight to railway. First, a general network which considers road transportation, railway transportation, handling and transferring is established according to all the steps in the whole transportation process. Then general cost functions are formulated which embody the factors that shippers pay attention to when choosing mode and path. The general functions contain the congestion cost on road and the capacity constraints of railways and freight stations. Based on the general network and general cost function, a user equilibrium flow assignment model is developed to simulate the flow distribution on the general network under the condition that all shippers choose transportation mode and path independently. Since the model is nonlinear and challenging, we adopt a method that uses tangent lines to form an envelope curve in order to linearize it. Finally, a numerical example is presented to test the model and show the method of making quantitative analysis of bulk freight modal shift between road and railway.
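
    The user-equilibrium idea, that shippers independently pick the cheaper option until no one can improve, can be illustrated with the method of successive averages on a single road/rail origin-destination pair. The cost functions and numbers below are invented and far simpler than the paper's general network.

```python
import numpy as np

# Two parallel options for a bulk freight OD pair: road (congestible) and
# railway (cheaper per ton, with a soft capacity penalty). Costs are
# generalized costs; all numbers are illustrative only.
total_demand = 1000.0  # tons/day

def road_cost(q):
    return 20.0 * (1.0 + 0.15 * (q / 600.0) ** 4)   # BPR-style congestion cost

def rail_cost(q):
    transfer = 6.0                                   # handling/transfer at stations
    penalty = np.maximum(0.0, q - 800.0) * 0.05      # exceeding railway capacity
    return 15.0 + transfer + penalty

# Method of successive averages: repeatedly assign all demand to the cheaper
# mode and average; this converges to a user-equilibrium (Wardrop) split.
q_road = total_demand / 2
for k in range(1, 200):
    target = total_demand if road_cost(q_road) < rail_cost(total_demand - q_road) else 0.0
    q_road += (target - q_road) / k                  # step size 1/k
print(f"equilibrium: road {q_road:.0f} t, rail {total_demand - q_road:.0f} t")
print(f"costs: road {road_cost(q_road):.2f}, rail {rail_cost(total_demand - q_road):.2f}")
```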

  13. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case.

  14. Quantitative analysis of markers of podocyte injury in the rat puromycin aminonucleoside nephropathy model.

    Science.gov (United States)

    Kakimoto, Tetsuhiro; Okada, Kinya; Fujitaka, Keisuke; Nishio, Masashi; Kato, Tsuyoshi; Fukunari, Atsushi; Utsumi, Hiroyuki

    2015-02-01

    Podocytes are an essential component of the renal glomerular filtration barrier, their injury playing an early and important role in progressive renal dysfunction. This makes quantification of podocyte marker immunoreactivity important for early detection of glomerular histopathological changes. Here we have specifically applied a state-of-the-art automated computational method of glomerulus recognition, which we have recently developed, to study quantitatively podocyte markers in a model with selective podocyte injury, namely the rat puromycin aminonucleoside (PAN) nephropathy model. We also retrospectively investigated mRNA expression levels of these markers in glomeruli which were isolated from the same formalin-fixed, paraffin-embedded kidney samples by laser microdissection. Among the examined podocyte markers, the immunopositive area and mRNA expression level of both podoplanin and synaptopodin were decreased in PAN glomeruli. The immunopositive area of podocin showed a slight decrease in PAN glomeruli, while its mRNA level showed no change. We have also identified a novel podocyte injury marker β-enolase, which was increased exclusively by podocytes in PAN glomeruli, similarly to another widely used marker, desmin. Thus, we have shown the specific application of a state-of-the-art computational method and retrospective mRNA expression analysis to quantitatively study the changes of various podocyte markers. The proposed methods will open new avenues for quantitative elucidation of renal glomerular histopathology. Copyright © 2014 Elsevier GmbH. All rights reserved.

  15. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  16. Quantitative inverse modelling of a cylindrical object in the laboratory using ERT: An error analysis

    Science.gov (United States)

    Korteland, Suze-Anne; Heimovaara, Timo

    2015-03-01

    Electrical resistivity tomography (ERT) is a geophysical technique that can be used to obtain three-dimensional images of the bulk electrical conductivity of the subsurface. Because the electrical conductivity is strongly related to properties of the subsurface and the flow of water it has become a valuable tool for visualization in many hydrogeological and environmental applications. In recent years, ERT is increasingly being used for quantitative characterization, which requires more detailed prior information than a conventional geophysical inversion for qualitative purposes. In addition, the careful interpretation of measurement and modelling errors is critical if ERT measurements are to be used in a quantitative way. This paper explores the quantitative determination of the electrical conductivity distribution of a cylindrical object placed in a water bath in a laboratory-scale tank. Because of the sharp conductivity contrast between the object and the water, a standard geophysical inversion using a smoothness constraint could not reproduce this target accurately. Better results were obtained by using the ERT measurements to constrain a model describing the geometry of the system. The posterior probability distributions of the parameters describing the geometry were estimated with the Markov chain Monte Carlo method DREAM(ZS). Using the ERT measurements this way, accurate estimates of the parameters could be obtained. The information quality of the measurements was assessed by a detailed analysis of the errors. Even for the uncomplicated laboratory setup used in this paper, errors in the modelling of the shape and position of the electrodes and the shape of the domain could be identified. The results indicate that the ERT measurements have a high information content which can be accessed by the inclusion of prior information and the consideration of measurement and modelling errors.
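
    The estimation step, sampling the posterior of a few geometry parameters under a forward model, can be sketched with a plain Metropolis random walk. DREAM(ZS), used in the paper, is a far more efficient multi-chain adaptive sampler, and the cheap surrogate forward model below merely stands in for a real ERT solver.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical reduced problem: infer a cylinder's centre (x, y) and radius
# from noisy "measurements" via a cheap surrogate forward model.
def forward(theta):
    x, y, r = theta
    return np.array([x + r, y + r, x - y, r])       # stand-in for the ERT solver

truth = np.array([0.30, 0.55, 0.08])
data = forward(truth) + rng.normal(0, 0.005, 4)     # synthetic observations
sigma = 0.005                                        # assumed measurement error

def log_post(theta):
    if not (0 < theta[2] < 0.5):                    # prior: physically valid radius
        return -np.inf
    r = data - forward(theta)
    return -0.5 * np.sum((r / sigma) ** 2)

# Plain Metropolis random walk; the accept/reject rule is the same one at
# the heart of DREAM(ZS), minus the adaptive multi-chain proposals.
theta = np.array([0.25, 0.5, 0.1])
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.01, 3)
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples[5_000:])                    # discard burn-in
print("posterior mean:", post.mean(axis=0), "posterior std:", post.std(axis=0))
```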

  17. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  18. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and the behavior of the Mobile Node (MN) are used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can be used to model not only the LTRACK algorithm but other algorithms too. Many discussions and calculations support our mathematical model and show that it is adequate in many cases. The model is valid on various network levels, scalable vertically in the ISO-OSI layers, and also scales well with the number of network elements.

  19. Modeling and Quantitative Analysis of GNSS/INS Deep Integration Tracking Loops in High Dynamics

    Directory of Open Access Journals (Sweden)

    Yalong Ban

    2017-09-01

    To meet the requirements of global navigation satellite system (GNSS) precision applications in high dynamics, this paper describes a study on the carrier phase tracking technology of the GNSS/inertial navigation system (INS) deep integration system. The error propagation models of INS-aided carrier tracking loops are modeled in detail for high dynamics. Additionally, quantitative analysis of carrier phase tracking errors caused by INS error sources is carried out under uniform high-dynamic linear acceleration of 100 g. Results show that the major INS error sources affecting the carrier phase tracking accuracy in high dynamics include initial attitude errors, accelerometer scale factors, gyro noise and gyro g-sensitivity errors. The initial attitude errors are usually combined with the receiver acceleration to impact the tracking loop performance, which can easily cause the failure of carrier phase tracking. The main INS error factors vary with the vehicle motion direction and the relative position of the receiver and the satellites. The analysis results also indicate that low-cost micro-electro-mechanical system (MEMS) inertial measurement units (IMUs) have the ability to maintain GNSS carrier phase tracking in high dynamics.

  1. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    of the various interacting components, a model to better understand the complexity in a network warfare environment would be beneficial. Non-quantitative modelling is a useful method to better characterize the field due to the rich ideas that can be generated...

  2. A quantitative analysis to objectively appraise drought indicators and model drought impacts

    Science.gov (United States)

    Bachmair, S.; Svensson, C.; Hannaford, J.; Barker, L. J.; Stahl, K.

    2016-07-01

    coverage. The predictions also provided insights into the EDII, in particular highlighting drought events where missing impact reports may reflect a lack of recording rather than true absence of impacts. Overall, the presented quantitative framework proved to be a useful tool for evaluating drought indicators, and to model impact occurrence. In summary, this study demonstrates the information gain for drought monitoring and early warning through impact data collection and analysis. It highlights the important role that quantitative analysis with impact data can have in providing "ground truth" for drought indicators, alongside more traditional stakeholder-led approaches.
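
    One simple form such an indicator-to-impact model can take is a logistic regression of impact occurrence on an indicator value. The sketch below uses synthetic data; the paper links several indicators to EDII impact reports, which is considerably richer than this.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Synthetic monthly records: a standardized drought indicator (SPI-like,
# negative = drier) and whether any impact was reported that month.
spi = rng.normal(size=400)
p_impact = 1 / (1 + np.exp(-(-1.5 - 2.0 * spi)))    # drier months -> more impacts
impact_reported = rng.random(400) < p_impact

# Link indicator values to impact occurrence; how well the fitted curve
# predicts impacts is one way to objectively appraise an indicator.
model = LogisticRegression().fit(spi.reshape(-1, 1), impact_reported)
for value in (-2.5, -1.0, 0.0, 1.0):
    prob = model.predict_proba([[value]])[0, 1]
    print(f"indicator = {value:+.1f}: P(impact reported) = {prob:.2f}")
```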

  3. Quantitative trait locus analysis of symbiotic nitrogen fixation activity in the model legume Lotus japonicus.

    Science.gov (United States)

    Tominaga, Akiyoshi; Gondo, Takahiro; Akashi, Ryo; Zheng, Shao-Hui; Arima, Susumu; Suzuki, Akihiro

    2012-05-01

    Many legumes form nitrogen-fixing root nodules. An elevation of nitrogen fixation in such legumes would have significant implications for plant growth and biomass production in agriculture. To identify the genetic basis for the regulation of nitrogen fixation, quantitative trait locus (QTL) analysis was conducted with recombinant inbred lines derived from the cross Miyakojima MG-20 × Gifu B-129 in the model legume Lotus japonicus. This population was inoculated with Mesorhizobium loti MAFF303099 and grown for 14 days in pots containing vermiculite. Phenotypic data were collected for acetylene reduction activity (ARA) per plant (ARA/P), ARA per nodule weight (ARA/NW), ARA per nodule number (ARA/NN), NN per plant, NW per plant, stem length (SL), SL without inoculation (SLbac-), shoot dry weight without inoculation (SWbac-), root length without inoculation (RLbac-), and root dry weight (RWbac-), and finally 34 QTLs were identified. ARA/P, ARA/NN, NW, and SL showed strong correlations and QTL co-localization, suggesting that several plant characteristics important for symbiotic nitrogen fixation are controlled by the same locus. QTLs for ARA/P, ARA/NN, NW, and SL, co-localized around marker TM0832 on chromosome 4, were also co-localized with previously reported QTLs for seed mass. This is the first report of QTL analysis for symbiotic nitrogen fixation activity traits.
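
    As a minimal illustration of the QTL mapping machinery behind such a study, the sketch below runs a single-marker regression scan on a synthetic recombinant-inbred population and reports a LOD-style score. Real analyses use interval mapping with a genetic map, which this omits entirely.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Toy RIL population: 100 lines, 50 markers coded 0/2 (two inbred genotypes),
# with the phenotype (e.g. ARA/P) controlled by marker 17 plus noise.
genotypes = rng.choice([0, 2], size=(100, 50))
phenotype = 0.8 * genotypes[:, 17] + rng.normal(0, 1, 100)

# Simplest QTL scan: regress the trait on each marker; for single-marker
# regression, LOD = -(n/2) * log10(1 - R^2).
n = len(phenotype)
lod = np.empty(50)
for m in range(50):
    slope, intercept, r, p, se = stats.linregress(genotypes[:, m], phenotype)
    lod[m] = -n / 2 * np.log10(1 - r**2) if abs(r) < 1 else np.inf

print("peak marker:", lod.argmax(), "LOD =", round(lod.max(), 1))
```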

  4. Gas chromatographic quantitative analysis of methanol in wine: operative conditions, optimization and calibration model choice.

    Science.gov (United States)

    Caruso, Rosario; Gambino, Grazia Laura; Scordino, Monica; Sabatino, Leonardo; Traulo, Pasqualino; Gagliano, Giacomo

    2011-12-01

    The influence of the wine distillation process on methanol content has been determined by quantitative analysis using gas chromatographic flame ionization (GC-FID) detection. A comparative study between direct injection of diluted wine and injection of distilled wine was performed. The distillation process does not affect methanol quantification in wines in proportions higher than 10%. While quantification performed on distilled samples gives more reliable results, a screening method for wine injection after a 1:5 water dilution could be employed. The proposed technique was found to be a compromise between the time consuming distillation process and direct wine injection. In the studied calibration range, the stability of the volatile compounds in the reference solution is concentration-dependent. The stability is higher in the less concentrated reference solution. To shorten the operation time, a stronger temperature ramp and carrier flow rate was employed. With these conditions, helium consumption and column thermal stress were increased. However, detection limits, calibration limits, and analytical method performances are not affected substantially by changing from normal to forced GC conditions. Statistical data evaluation were made using both ordinary (OLS) and bivariate least squares (BLS) calibration models. Further confirmation was obtained that limit of detection (LOD) values, calculated according to the 3σ approach, are lower than the respective Hubaux-Vos (H-V) calculation method. H-V LOD depends upon background noise, calibration parameters and the number of reference standard solutions employed in producing the calibration curve. These remarks are confirmed by both calibration models used.
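
    The calibration-model comparison in the abstract can be made concrete with a small OLS fit and a 3σ detection limit. The standards, peak areas and blank noise below are invented; the BLS variant (which also propagates x-axis errors) and the Hubaux-Vos computation are only described in comments, not implemented.

```python
import numpy as np
from scipy import stats

# Hypothetical GC-FID calibration: methanol standards (mg/L) vs peak area,
# analysed by ordinary least squares. BLS would additionally weight the fit
# by the uncertainties on both axes, which OLS ignores.
conc = np.array([50, 100, 200, 400, 800], dtype=float)
area = np.array([102, 198, 405, 790, 1620], dtype=float)

fit = stats.linregress(conc, area)
print(f"slope = {fit.slope:.3f}, intercept = {fit.intercept:.2f}, r = {fit.rvalue:.4f}")

# 3-sigma LOD: smallest concentration whose signal exceeds the blank noise
# by 3 standard deviations, using the calibration slope as sensitivity.
blank_sd = 4.0                         # sd of repeated blank injections (assumed)
lod_3sigma = 3 * blank_sd / fit.slope
print(f"LOD (3-sigma) = {lod_3sigma:.1f} mg/L")
# The Hubaux-Vos LOD would instead be derived from the calibration's
# prediction band, so it also depends on the residuals and design points.
```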

  7. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues...... that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy...

  8. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative analysis of submarine pipeline routing risk is presented, advancing the study of routing risk from qualitative to quantitative analysis. The characteristics of the potential risk of the submarine pipeline system were considered, and grey-mode identification theory was used. The study comprised three parts: establishing the index system for routing risk quantitative analysis, establishing the grey-mode identification model for routing risk quantitative analysis, and establishing the standard for interpreting mode identification results. A computed example shows that this model can directly and concisely reflect the hazard degree of a route, providing a basis for future routing selection.

  9. Quantitative trait locus analysis of multiple agronomic traits in the model legume Lotus japonicus.

    Science.gov (United States)

    Gondo, Takahiro; Sato, Shusei; Okumura, Kenji; Tabata, Satoshi; Akashi, Ryo; Isobe, Sachiko

    2007-07-01

    The first quantitative trait locus (QTL) analysis of multiple agronomic traits in the model legume Lotus japonicus was performed with a population of recombinant inbred lines derived from Miyakojima MG-20 x Gifu B-129. Thirteen agronomic traits were evaluated in 2004 and 2005: traits of vegetative parts (plant height, stem thickness, leaf length, leaf width, plant regrowth, plant shape, and stem color), flowering traits (flowering time and degree), and pod and seed traits (pod length, pod width, seeds per pod, and seed mass). A total of 40 QTLs were detected that explained 5%-69% of total variation. The QTL that explained the most variation was that for stem color, which was detected in the same region of chromosome 2 in both years. Some QTLs were colocated, especially those for pod and seed traits. Seed mass QTLs were located at 5 locations that mapped to the corresponding genomic positions of equivalent QTLs in soybean, pea, chickpea, and mung bean. This study provides fundamental information for breeding of agronomically important legume crops.

  10. Statistical Modeling Approach to Quantitative Analysis of Interobserver Variability in Breast Contouring

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jinzhong, E-mail: jyang4@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A.; Reed, Valerie K.; Strom, Eric A.; Perkins, George H.; Tereffe, Welela; Buchholz, Thomas A. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhang, Lifei; Balter, Peter; Court, Laurence E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dong, Lei [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Scripps Proton Therapy Center, San Diego, California (United States)

    2014-05-01

    Purpose: To develop a new approach for interobserver variability analysis. Methods and Materials: Eight radiation oncologists specializing in breast cancer radiation therapy delineated a patient's left breast “from scratch” and from a template that was generated using deformable image registration. Three of the radiation oncologists had previously received training in the Radiation Therapy Oncology Group (RTOG) consensus contouring atlas for breast cancer. The simultaneous truth and performance level estimation algorithm was applied to the 8 contours delineated “from scratch” to produce a group consensus contour. Individual Jaccard scores were fitted to a beta distribution model. We also applied this analysis to 2 additional patients, who were contoured by 9 breast radiation oncologists from 8 institutions. Results: The beta distribution model had a mean of 86.2%, standard deviation (SD) of ±5.9%, a skewness of −0.7, and excess kurtosis of 0.55, exemplifying broad interobserver variability. The 3 RTOG-trained physicians had higher agreement scores than average, indicating that their contours were close to the group consensus contour. One physician had high sensitivity but lower specificity than the others, which implies that this physician tended to contour a structure larger than those of the others. Two other physicians had low sensitivity but specificity similar to the others, which implies that they tended to contour a structure smaller than the others. With this information, they could adjust their contouring practice to be more consistent with others if desired. When contouring from the template, the beta distribution model had a mean of 92.3%, SD ± 3.4%, skewness of −0.79, and excess kurtosis of 0.83, which indicated a much better consistency among individual contours. Similar results were obtained for the analysis of the 2 additional patients. Conclusions: The proposed statistical approach was able to measure interobserver variability quantitatively.
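
    The beta-distribution step in this record is straightforward to reproduce in outline. The sketch below (the agreement scores are invented, and scipy's generic fitter stands in for whatever estimation the authors used) fits a two-parameter beta model to Jaccard scores and reports the summary moments quoted in such analyses.

```python
# Hedged sketch: fit a beta distribution to interobserver Jaccard agreement
# scores and report mean, SD, and skewness. Scores below are hypothetical.
import numpy as np
from scipy import stats

jaccard = np.array([0.91, 0.88, 0.82, 0.86, 0.79, 0.90, 0.85, 0.84])

# Two-parameter fit: location fixed at 0 and scale at 1, since Jaccard is in [0, 1].
a, b, loc, scale = stats.beta.fit(jaccard, floc=0, fscale=1)
mean = stats.beta.mean(a, b)
sd = stats.beta.std(a, b)
skew = float(stats.beta.stats(a, b, moments="s"))
print(f"alpha = {a:.2f}, beta = {b:.2f}, mean = {mean:.1%}, SD = {sd:.1%}, skewness = {skew:.2f}")
```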

  11. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    Science.gov (United States)

    2014-06-01

    …model of the system (Friedenthal, Moore and Steiner 2008, 17). The premise is that maintaining a logical and consistent model can be accomplished… Standard for the Exchange of Product data (STEP) subgroup of ISO, and defines a standard data format for certain types of SE information (Johnson 2006)… search.credoreference.com/content/entry/encyccs/formal_languages/0. Friedenthal, Sanford, Alan Moore, and Rick Steiner. 2008. A Practical Guide to SysML.

  12. Quantitative analysis of anaerobic oxidation of methane (AOM) in marine sediments: A modeling perspective

    Science.gov (United States)

    Regnier, P.; Dale, A. W.; Arndt, S.; LaRowe, D. E.; Mogollón, J.; Van Cappellen, P.

    2011-05-01

    Recent developments in the quantitative modeling of methane dynamics and anaerobic oxidation of methane (AOM) in marine sediments are critically reviewed. The first part of the review begins with a comparison of alternative kinetic models for AOM. The roles of bioenergetic limitations, intermediate compounds and biomass growth are highlighted. Next, the key transport mechanisms in multi-phase sedimentary environments affecting AOM and methane fluxes are briefly treated, while attention is also given to additional controls on methane and sulfate turnover, including organic matter mineralization, sulfur cycling and methane phase transitions. In the second part of the review, the structure, forcing functions and parameterization of published models of AOM in sediments are analyzed. The six-orders-of-magnitude range in rate constants reported for the widely used bimolecular rate law for AOM emphasizes the limited transferability of this simple kinetic model and, hence, the need for more comprehensive descriptions of the AOM reaction system. The derivation and implementation of more complete reaction models, however, are limited by the availability of observational data. In this context, we attempt to rank the relative benefits of potential experimental measurements that should help to better constrain AOM models. The last part of the review presents a compilation of reported depth-integrated AOM rates (ΣAOM). These rates reveal the extreme variability of ΣAOM in marine sediments. The model results are further used to derive quantitative relationships between ΣAOM and the magnitude of externally impressed fluid flow, as well as between ΣAOM and the depth of the sulfate-methane transition zone (SMTZ). This review contributes to an improved understanding of the global significance of the AOM process, and helps identify outstanding questions and future directions in the modeling of methane cycling and AOM in marine sediments.
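
    To make the bimolecular rate law discussed above concrete, the sketch below computes the local AOM rate R = k[CH4][SO4] on idealized porewater profiles and integrates it over depth to obtain ΣAOM. The profiles, the rate constant, and the units are placeholders, not values from the review.

```python
# Hedged sketch: bimolecular AOM rate law R = k*[CH4]*[SO4] on idealized
# porewater profiles; all numbers are illustrative placeholders.
import numpy as np

k = 0.1                                           # rate constant, (mM yr)^-1, hypothetical
z = np.linspace(0.0, 100.0, 201)                  # depth below seafloor, cm
ch4 = 2.0 / (1.0 + np.exp(-(z - 50.0) / 5.0))     # methane rising from below, mM
so4 = 28.0 / (1.0 + np.exp((z - 50.0) / 5.0))     # sulfate diffusing down from seawater, mM

r_aom = k * ch4 * so4                             # local AOM rate, mM/yr
sigma_aom = np.trapz(r_aom, z)                    # depth-integrated rate (ΣAOM)
z_smtz = z[np.argmax(r_aom)]                      # the rate peaks at the SMTZ
print(f"ΣAOM = {sigma_aom:.1f} mM·cm/yr, SMTZ depth ≈ {z_smtz:.0f} cm")
```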

  13. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of it, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the analyst.

  14. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  15. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University and sponsored by the U.S. Department of Energy under Grant Number DE-FG05-95ER30250. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  16. Meta-analysis of quantitative pleiotropic traits for next-generation sequencing with multivariate functional linear models.

    Science.gov (United States)

    Chiu, Chi-Yang; Jung, Jeesun; Chen, Wei; Weeks, Daniel E; Ren, Haobo; Boehnke, Michael; Amos, Christopher I; Liu, Aiyi; Mills, James L; Ting Lee, Mei-Ling; Xiong, Momiao; Fan, Ruzong

    2017-02-01

    To analyze next-generation sequencing data, multivariate functional linear models are developed for a meta-analysis of multiple studies to connect genetic variant data to multiple quantitative traits while adjusting for covariates. The goal is to take advantage of both meta-analysis and pleiotropic analysis in order to improve power and to carry out a unified association analysis of multiple studies and multiple traits of complex disorders. Three types of approximate F-distributions, based on the Pillai-Bartlett trace, the Hotelling-Lawley trace, and Wilks's Lambda, are introduced to test for association between multiple quantitative traits and multiple genetic variants. Simulation analysis is performed to evaluate false-positive rates and power of the proposed tests. The proposed methods are applied to analyze lipid traits in eight European cohorts. It is shown that it is more advantageous to perform multivariate analysis than univariate analysis in general, and that it is more advantageous to perform a meta-analysis of multiple studies than to analyze the individual studies separately. The proposed models require individual observations. The value of the current paper can be seen for at least two reasons: (a) the proposed methods can be applied to studies that have individual genotype data; (b) the proposed methods can be used as a criterion for future work that uses summary statistics to build test statistics to meta-analyze the data.
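
    For readers unfamiliar with the three multivariate test statistics named above, the sketch below (simulated genotypes and traits; statsmodels' MANOVA is used as a stand-in, not the paper's functional linear models) shows how Pillai's trace, the Hotelling-Lawley trace, and Wilks's Lambda, with their approximate F statistics, are obtained for a pleiotropic variant.

```python
# Hedged sketch: the three multivariate test statistics on simulated data.
# statsmodels' MANOVA is a stand-in here, not the paper's method.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 500
geno = rng.binomial(2, 0.3, n)                   # additive genotype coding 0/1/2
age = rng.normal(50, 10, n)                      # covariate
# Two correlated traits sharing a (pleiotropic) genetic effect
y1 = 0.3 * geno + 0.02 * age + rng.normal(0, 1, n)
y2 = 0.2 * geno + 0.01 * age + rng.normal(0, 1, n)
df = pd.DataFrame({"y1": y1, "y2": y2, "geno": geno, "age": age})

# mv_test() reports Wilks' lambda, Pillai's trace, and the Hotelling-Lawley
# trace together with their approximate F statistics and p-values.
print(MANOVA.from_formula("y1 + y2 ~ geno + age", data=df).mv_test())
```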

  17. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    Science.gov (United States)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that, with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab, whereas no such association was found for the patients treated with chemotherapy alone.

  18. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, owing to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which can find a balance between fitting the historical data and fitting the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  19. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    Science.gov (United States)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to minimize the least-squares difference between measured and calculated values over time, which may encounter problems such as overfitting of model parameters and a lack of reproducibility, especially when handling noisy or erroneous data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model that deals with noisy data directly rather than trying to smooth the noise in the image. Also, owing to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which can find a balance between fitting the historical data and fitting the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
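
    As background for the iterative-fitting (IF) baseline these two records refer to, the sketch below fits a one-tissue compartment model, Ct(t) = K1·exp(-k2·t) convolved with a plasma input Cp(t), by nonlinear least squares. The input function, noise level, and parameter values are simulated assumptions, not data from the paper.

```python
# Hedged sketch: conventional iterative fitting of a one-tissue compartment
# model to a simulated time-activity curve (TAC). All data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 121)             # minutes
dt = t[1] - t[0]
cp = 10.0 * t * np.exp(-t / 2.0)        # idealized plasma input function

def one_tissue(t, K1, k2):
    irf = K1 * np.exp(-k2 * t)                     # tissue impulse response
    return np.convolve(cp, irf)[: len(t)] * dt     # Ct = Cp convolved with IRF

truth = one_tissue(t, 0.4, 0.1)
tac = truth + np.random.default_rng(1).normal(0, 0.5, len(t))  # noisy TAC

(K1, k2), _ = curve_fit(one_tissue, t, tac, p0=[0.1, 0.05], bounds=(0, np.inf))
print(f"fitted K1 = {K1:.3f} /min, k2 = {k2:.3f} /min (true: 0.400, 0.100)")
```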

  20. Evaluating the validity of spectral calibration models for quantitative analysis following signal preprocessing.

    Science.gov (United States)

    Chen, Da; Grant, Edward

    2012-11-01

    When paired with high-powered chemometric analysis, spectrometric methods offer great promise for the high-throughput analysis of complex systems. Effective classification or quantification often relies on signal preprocessing to reduce spectral interference and optimize the apparent performance of a calibration model. However, less frequently addressed by systematic research is the effect of preprocessing on the statistical accuracy of a calibration result. The present work demonstrates the effectiveness of two criteria for validating the performance of signal preprocessing in multivariate models in the important dimensions of bias and precision. To assess the extent of bias, we explore the applicability of the elliptic joint confidence region (EJCR) test, and we devise a new means to evaluate precision by a bias-corrected root mean square error of prediction. We show how these criteria can effectively gauge the success of signal pretreatments in suppressing spectral interference while providing a straightforward means to determine the optimal level of model complexity. This methodology offers a graphical diagnostic by which to visualize the consequences of pretreatment on complex multivariate models, enabling optimization with greater confidence. To demonstrate the application of the EJCR criterion in this context, we evaluate the validity of representative calibration models using standard pretreatment strategies on three spectral data sets. The results indicate that the proposed methodology facilitates the reliable optimization of a well-validated calibration model, thus improving the capability of spectrophotometric analysis.
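
    The two validation criteria can be sketched as follows (simulated reference/predicted pairs; the F-test form of the joint confidence region is a standard construction and may differ in detail from the paper's): (1) check whether (intercept, slope) = (0, 1) lies inside the elliptic joint confidence region of the predicted-versus-reference regression, and (2) compute a bias-corrected RMSEP.

```python
# Hedged sketch: EJCR-style bias test and bias-corrected RMSEP on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
ref = np.linspace(1, 10, 30)                        # reference concentrations
pred = 0.2 + 0.98 * ref + rng.normal(0, 0.15, 30)   # calibration model predictions

# OLS of predicted on reference
X = np.column_stack([np.ones_like(ref), ref])
beta = np.linalg.lstsq(X, pred, rcond=None)[0]
n, p = len(ref), 2
s2 = np.sum((pred - X @ beta) ** 2) / (n - p)

# H0: (intercept, slope) = (0, 1) lies inside the EJCR exactly when this
# F statistic is below the critical value.
d = beta - np.array([0.0, 1.0])
F = d @ (X.T @ X) @ d / (p * s2)
print("bias detected" if F > stats.f.ppf(0.95, p, n - p) else "no bias detected")

# Bias and bias-corrected RMSEP (the SD of prediction errors about their mean)
bias = np.mean(pred - ref)
sep = np.sqrt(np.sum((pred - ref - bias) ** 2) / (n - 1))
print(f"bias = {bias:.3f}, bias-corrected RMSEP = {sep:.3f}")
```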

  1. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    … the knowledge generated from these studies cannot be easily generalized or transferred to other basins. Here, we present an approach to integrate quantitative and qualitative methods to study water issues and capture the contextual knowledge of water management, by combining the NSSs framework and an area of artificial intelligence called qualitative reasoning. Using the Apalachicola-Chattahoochee-Flint (ACF) River Basin dispute as an example, we demonstrate how quantitative modeling and qualitative reasoning can be integrated to examine the impact of over-abstraction of water from the river on the ecosystem and the role of governance in shaping the evolution of the ACF water dispute.

  2. Derivation of a quantitative minimal model from a detailed elementary-step mechanism supported by mathematical coupling analysis

    Science.gov (United States)

    Shaik, O. S.; Kammerer, J.; Gorecki, J.; Lebiedz, D.

    2005-12-01

    Accurate experimental data increasingly allow the development of detailed elementary-step mechanisms for complex chemical and biochemical reaction systems. Model reduction techniques are widely applied to obtain representations in lower-dimensional phase space which are more suitable for mathematical analysis, efficient numerical simulation, and model-based control tasks. Here, we exploit a recently implemented numerical algorithm for error-controlled computation of the minimum dimension required for a still accurate reduced mechanism based on automatic time scale decomposition and relaxation of fast modes. We determine species contributions to the active (slow) dynamical modes of the reaction system and exploit this information in combination with quasi-steady-state and partial-equilibrium approximations for explicit model reduction of a novel detailed chemical mechanism for the Ru-catalyzed light-sensitive Belousov-Zhabotinsky reaction. The existence of a minimum dimension of seven is demonstrated to be mandatory for the reduced model to show good quantitative consistency with the full model in numerical simulations. We derive such a maximally reduced seven-variable model from the detailed elementary-step mechanism and demonstrate that it reproduces quantitatively accurately the dynamical features of the full model within a given accuracy tolerance.
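
    The idea behind automatic time-scale decomposition can be shown on a toy stiff system (this example is illustrative and is not the Ru-catalyzed Belousov-Zhabotinsky mechanism of the paper): eigenvalues of the Jacobian separate fast from slow modes, and the size of the spectral gap indicates how many fast modes can be relaxed by quasi-steady-state or partial-equilibrium approximations.

```python
# Hedged sketch: mode time scales from Jacobian eigenvalues for a stiff
# two-variable toy system (van der Pol-like), not the BZ mechanism.
import numpy as np

eps = 0.01  # small parameter that makes one mode fast

def jacobian(x):
    # system: x0' = (x0 - x0**3/3 - x1)/eps (fast),  x1' = x0 (slow)
    return np.array([[(1.0 - x[0] ** 2) / eps, -1.0 / eps],
                     [1.0, 0.0]])

x = np.array([2.0, 0.5])                     # state at which to analyze time scales
lam = np.linalg.eigvals(jacobian(x))
timescales = np.sort(1.0 / np.abs(lam.real))
print("mode time scales:", timescales)
# A gap of several orders of magnitude between the entries indicates how many
# fast modes can be relaxed, i.e., the dimension of a reduced slow model.
```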

  3. Quantitative image analysis of immunohistochemical stains using a CMYK color model

    Directory of Open Access Journals (Sweden)

    Iakovlev Vladimir

    2007-02-01

    Full Text Available Abstract Background Computer image analysis techniques have decreased the effects of observer bias and increased the sensitivity and throughput of immunohistochemistry (IHC) as a tissue-based procedure for the evaluation of diseases. Methods We adapted a Cyan/Magenta/Yellow/Key (CMYK) model for automated computer image analysis to quantify IHC stains in hematoxylin-counterstained histological sections. Results The spectral characteristics of the chromogens AEC, DAB and NovaRed as well as the counterstain hematoxylin were first determined using CMYK, Red/Green/Blue (RGB), normalized RGB and Hue/Saturation/Lightness (HSL) color models. The contrast of chromogen intensities on a 0–255 scale (24-bit image file), as well as compared to the hematoxylin counterstain, was greatest using the Yellow channel of a CMYK color model, suggesting an improved sensitivity for IHC evaluation compared to other color models. An increase in activated STAT3 levels due to growth factor stimulation, quantified using the Yellow channel image analysis, was associated with an increase detected by Western blotting. Two clinical image data sets were used to compare the Yellow channel automated method with observer-dependent methods. First, quantification of the DAB-labeled carbonic anhydrase IX hypoxia marker in 414 sections obtained from 138 biopsies of cervical carcinoma showed a strong association between Yellow channel and positive color selection results. Second, a linear relationship was also demonstrated between Yellow intensity and visual scoring for NovaRed-labeled epidermal growth factor receptor in 256 non-small cell lung cancer biopsies. Conclusion The Yellow channel image analysis method based on a CMYK color model is independent of observer biases for threshold and positive color selection, applicable to different chromogens, tolerant of hematoxylin, sensitive to small changes in IHC intensity and is applicable to simple automation procedures. These characteristics
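
    The Yellow-channel computation at the heart of this record follows the generic RGB-to-CMYK conversion. The sketch below (generic formula and invented pixel values; the paper's exact pipeline may differ) extracts the Y channel, which responds strongly to brown DAB staining and weakly to blue hematoxylin.

```python
# Hedged sketch: extract the CMYK Yellow channel from an RGB image using the
# generic conversion; pixel values below are invented for illustration.
import numpy as np

def yellow_channel(rgb):
    """rgb: float array in [0, 1] with shape (..., 3). Returns Y of CMYK in [0, 1]."""
    b = rgb[..., 2]
    k = 1.0 - np.max(rgb, axis=-1)              # Key (black) component
    denom = np.where(k < 1.0, 1.0 - k, 1.0)     # avoid division by zero on pure black
    return (1.0 - b - k) / denom

# A brownish DAB-stained pixel scores high; a bluish hematoxylin pixel scores low.
pixels = np.array([[0.55, 0.35, 0.15],   # brown (DAB-like)
                   [0.35, 0.40, 0.65]])  # blue (hematoxylin-like)
print(yellow_channel(pixels))            # approximately [0.73, 0.0]
```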

  4. Analysis of protein complexes through model-based biclustering of label-free quantitative AP-MS data.

    Science.gov (United States)

    Choi, Hyungwon; Kim, Sinae; Gingras, Anne-Claude; Nesvizhskii, Alexey I

    2010-06-22

    Affinity purification followed by mass spectrometry (AP-MS) has become a common approach for identifying protein-protein interactions (PPIs) and complexes. However, data analysis and visualization often rely on generic approaches that do not take advantage of the quantitative nature of AP-MS. We present a novel computational method, nested clustering, for biclustering of label-free quantitative AP-MS data. Our approach forms bait clusters based on the similarity of quantitative interaction profiles and identifies submatrices of prey proteins showing consistent quantitative association within bait clusters. In doing so, nested clustering effectively addresses the problem of overrepresentation of interactions involving bait proteins as compared with proteins identified only as preys. The method does not require specification of the number of bait clusters, which is an advantage over existing model-based clustering methods. We illustrate the performance of the algorithm using two published intermediate-scale human PPI data sets, which are representative of the AP-MS data generated from mammalian cells. We also discuss general challenges of analyzing and interpreting clustering results in the context of AP-MS data.

  5. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  6. Bayesian methods for quantitative trait loci mapping based on model selection: approximate analysis using the Bayesian information criterion.

    Science.gov (United States)

    Ball, R D

    2001-11-01

    We describe an approximate method for the analysis of quantitative trait loci (QTL) based on model selection from multiple regression models with trait values regressed on marker genotypes, using a modification of the easily calculated Bayesian information criterion to estimate the posterior probability of models with various subsets of markers as variables. The BIC-delta criterion, with the parameter delta increasing the penalty for additional variables in a model, is further modified to incorporate prior information, and missing values are handled by multiple imputation. Marginal probabilities for model sizes are calculated, and the posterior probability of nonzero model size is interpreted as the posterior probability of existence of a QTL linked to one or more markers. The method is demonstrated on analysis of associations between wood density and markers on two linkage groups in Pinus radiata. Selection bias, which is the bias that results from using the same data to both select the variables in a model and estimate the coefficients, is shown to be a problem for commonly used non-Bayesian methods for QTL mapping, which do not average over alternative possible models that are consistent with the data.
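
    A minimal sketch of this model-selection idea follows (simulated genotypes; the uniform model prior, the value of delta, and the marker coding are assumptions rather than the paper's settings): enumerate marker subsets, score each with a BIC-delta criterion, and turn the scores into posterior model probabilities.

```python
# Hedged sketch: BIC-delta model selection over marker subsets, with the
# posterior probability of "at least one QTL" from exp(-BIC/2) weights.
import itertools
import numpy as np

rng = np.random.default_rng(3)
n, m = 200, 4
markers = rng.binomial(1, 0.5, (n, m))              # marker genotypes (simulated)
trait = 0.8 * markers[:, 1] + rng.normal(0, 1, n)   # QTL linked to marker 1

delta = 2.0                                         # extra penalty per variable (BIC-delta)
bics = {}
for k in range(m + 1):
    for subset in itertools.combinations(range(m), k):
        X = np.column_stack([np.ones(n)] + [markers[:, j] for j in subset])
        rss = np.sum((trait - X @ np.linalg.lstsq(X, trait, rcond=None)[0]) ** 2)
        bics[subset] = n * np.log(rss / n) + delta * len(subset) * np.log(n)

# Posterior model probabilities are proportional to exp(-BIC/2) under a uniform prior.
b_min = min(bics.values())
w = {s: np.exp(-(b - b_min) / 2.0) for s, b in bics.items()}
p_qtl = sum(wi for s, wi in w.items() if s) / sum(w.values())
print(f"P(QTL linked to one or more markers) = {p_qtl:.3f}")
```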

  7. Quantitative evaluation of small breast masses using a compartment model analysis on dynamic MR imaging

    Energy Technology Data Exchange (ETDEWEB)

    Ikeda, Osamu; Morishita, Shoji; Kido, Taeko; Kitajima, Mika; Okamura, Kenji; Fukuda, Seiji [Kumamoto Rosai Hospital, Yatsushiro (Japan); Yamashita, Yasuyuki; Takahashi, Mutsumasa

    1998-07-01

    To differentiate between malignant and benign breast masses using a compartmental analysis, 55 patients with breast masses (fibroadenoma, n=22; invasive ductal carcinoma, n=29; noninvasive ductal carcinoma, n=8) underwent Gd-DTPA enhanced dynamic MR imaging. Dynamic MR images were obtained using a two-dimensional fat-saturated fast multiplanar spoiled gradient echo technique over 10 minutes following bolus injection of Gd-DTPA. The triexponential concentration curve of Gd-DTPA was fitted to a theoretical model based on compartmental analysis. Using this method, the transfer constant k (the permeability-surface area product per unit volume) and f{sub 3}/f{sub 1}=f were measured, where f{sub 1} represents tumor vessel volume and f{sub 3} represents extracellular volume. The k value was significantly greater (p<0.01) for malignant tumors, and the k value seen in cases of noninvasive ductal carcinoma was less than that for invasive ductal carcinoma. The f value was significantly smaller (p<0.01) for malignant tumors, whereas the f value for noninvasive ductal carcinoma was not significantly different from that for invasive ductal carcinoma. We believe that this type of compartmental analysis may be of value for the evaluation of breast masses. (author)

  8. Quantitative analysis and modelling of hepatic iron stores using stereology and spatial statistics.

    Science.gov (United States)

    Ghugre, N R; Gonzalez-Gomez, I; Shimada, H; Coates, T D; Wood, J C

    2010-06-01

    Hepatic iron overload is a common clinical problem resulting from hyperabsorption syndromes and from chronic transfusion therapy. Not only does iron loading vary between reticuloendothelial stores and hepatocytes, but iron is heterogeneously distributed within hepatocytes as well. Since the accessibility of iron particles to chelation may depend, in part, on their distribution, we sought to characterize the shape and scale of iron deposition in humans with transfusional iron overload. Toward this end, we performed a histological analysis of iron stores in liver biopsy specimens of 20 patients (1.3-57.8 mg iron/g dry tissue weight) with the aid of electron and light microscopy. We estimated distributions related to variability in siderosomal size, proximity of iron centres and inter-cellular iron loading. These distributions could be well modelled by Gamma distribution functions over most of the pathologic range of iron concentrations. Thus, for a given liver iron burden, a virtual iron-overloaded liver could be created that served as a model for the true histologic appearance. Such a model may be helpful for understanding the mechanics of iron loading or in predicting response to iron removal therapy.

  9. Quantitative analysis of the brain-targeted delivery of drugs and model compounds using nano-delivery systems.

    Science.gov (United States)

    Kozlovskaya, Luba; Stepensky, David

    2013-10-10

    The blood-brain barrier (BBB) prevents drugs from permeating into the brain and thus limits the management of brain diseases. Specialized drug delivery systems (DDSs) are therefore required to overcome this barrier and to achieve efficient delivery of therapeutic agents to the brain. For this purpose, drug-encapsulating nanoparticles or vesicles, drug conjugates and other types of DDSs are being developed by many research groups worldwide. However, the efficiency of brain drug/DDS delivery and targeting is usually presented in an indirect and vague form, and it is hard to estimate it quantitatively from the reported data. We searched for scientific papers published in 1970-2012 that reported delivery of drugs or model compounds to the brain following systemic administration of DDSs via parenteral routes and that contained quantitative data on brain drug/DDS delivery and targeting efficiency. We identified 123 publications that matched the search criteria and analyzed their experimental settings, formulation types, analytical methods, and the claimed efficiencies of drug/DDS brain targeting (brain/plasma or brain/tissue concentration ratios) and brain accumulation (% of the administered dose that accumulated in the brain). Based on the outcomes of this analysis, we describe the major research trends, discuss the efficiencies of the different drug/DDS brain targeting approaches, and provide recommendations for quantitative assessment of brain-targeting DDSs in appropriately designed studies.

  10. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  11. An exercise to teach quantitative analysis and modeling using Excel-based analysis of the carbon cycle in the anthropocene

    Science.gov (United States)

    Stoll, Heather

    2013-04-01

    A computer modeling exercise was created to allow students to investigate the consequences of fossil fuel burning and land use change on the amount of carbon dioxide in the atmosphere. Students work with a simple numerical model of the carbon cycle, rendered in Excel, and conduct a set of sensitivity tests with different amounts and rates of C addition; they then graph and discuss their results. In the recommended approach, the model is provided to students without the biosphere module, and in class the formulas to integrate this module are typed into Excel simultaneously by instructor and students, helping students understand how the larger model is set up. In terms of content, students learn to recognize the redistribution of fossil fuel carbon between the ocean and atmosphere, and to distinguish the consequences of rapid vs slow rates of addition of fossil fuel CO2 and the reasons for this difference. Students become familiar with the use of formulas in Excel and with working with a large (300 rows, 20 columns) worksheet, and they gain competence in the graphical representation of multiple scenarios. Students learn to appreciate the power and limitations of numerical models of complex cycles, the concept of inverse and forward models, and sensitivity tests. Finally, students learn that a hypothesis may be "reasonable" but still not quantitatively sufficient - in this case, that the "Industrial Revolution" was not the source of increasing atmospheric CO2 from 1750-1900. The described activity is available to educators on the Teach the Earth portal of the Science Education Research Center (SERC): http://serc.carleton.edu/quantskills/activities/68751.html.
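
    In the same spirit as the Excel model described above, the sketch below integrates a two-box (atmosphere-ocean) carbon model with simple Euler steps. Reservoir sizes, exchange times, and the emissions ramp are simplified assumptions, not the exercise's actual worksheet values.

```python
# Hedged sketch: two-box atmosphere-ocean carbon model with a toy fossil-fuel
# emissions ramp; all parameter values are simplified assumptions.
import numpy as np

atm, ocean = 600.0, 38000.0        # preindustrial carbon stocks, GtC
k_ao = 1.0 / 10.0                  # atmosphere -> ocean exchange rate, 1/yr
k_oa = k_ao * 600.0 / 38000.0      # ocean -> atmosphere rate, balanced at equilibrium

for yr in range(1750, 2001):
    emissions = 0.0 if yr < 1850 else 0.1 * 1.03 ** (yr - 1850)  # GtC/yr, toy ramp
    net_uptake = k_ao * atm - k_oa * ocean   # net flux into the ocean (0 at equilibrium)
    atm += emissions - net_uptake
    ocean += net_uptake

print(f"atmospheric CO2 in 2000 is roughly {atm / 2.13:.0f} ppm (2.13 GtC per ppm)")
```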

  12. Quantitative analysis of signal transduction in motile and phototactic cells by computerized light stimulation and model based tracking

    Science.gov (United States)

    Streif, Stefan; Staudinger, Wilfried Franz; Oesterhelt, Dieter; Marwan, Wolfgang

    2009-02-01

    To investigate the responses of Halobacterium salinarum to stimulation with light (phototaxis and photokinesis), we designed an experimental setup consisting of optical devices for automatic video image acquisition and computer-controlled light stimulation, and developed algorithms to analyze physiological responses of the cells. Cells are categorized as motile and nonmotile by a classification scheme based on the square displacement of cell positions. Computerized tracking based on a dynamic model of the stochastic cell movement and a Kalman filter-based algorithm allows smoothed estimates of the cell tracks and the detection of physiological responses to complex stimulus patterns. The setup and algorithms were calibrated, which allows quantitative measurements and systematic analysis of cellular sensing and response. Overall, the setup is flexible, extensible, and consists mainly of commercially available products. This facilitates modifications of the setup and algorithms for physiological studies of the motility of cells or microorganisms.

  13. Quantitative analysis of signal transduction in motile and phototactic cells by computerized light stimulation and model based tracking.

    Science.gov (United States)

    Streif, Stefan; Staudinger, Wilfried Franz; Oesterhelt, Dieter; Marwan, Wolfgang

    2009-02-01

    To investigate the responses of Halobacterium salinarum to stimulation with light (phototaxis and photokinesis), we designed an experimental setup consisting of optical devices for automatic video image acquisition and computer-controlled light stimulation, and developed algorithms to analyze physiological responses of the cells. Cells are categorized as motile and nonmotile by a classification scheme based on the square displacement of cell positions. Computerized tracking based on a dynamic model of the stochastic cell movement and a Kalman filter-based algorithm allows smoothed estimates of the cell tracks and the detection of physiological responses to complex stimulus patterns. The setup and algorithms were calibrated, which allows quantitative measurements and systematic analysis of cellular sensing and response. Overall, the setup is flexible, extensible, and consists mainly of commercially available products. This facilitates modifications of the setup and algorithms for physiological studies of the motility of cells or microorganisms.
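
    The model-based tracking in these two records rests on a Kalman filter over a stochastic motion model. The sketch below (a generic constant-velocity filter with invented noise parameters, not the authors' implementation) shows the predict/update cycle that yields smoothed estimates of a cell track from noisy detections.

```python
# Hedged sketch: constant-velocity Kalman filter smoothing noisy 2-D positions.
# All motion and noise parameters are invented for illustration.
import numpy as np

dt = 0.1                                                       # frame interval, s
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
              [0, 0, 1, 0], [0, 0, 0, 1.0]])                   # state transition
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0.0]])                   # observe position only
Q = 0.01 * np.eye(4)                                           # process noise (motility)
R = 0.5 * np.eye(2)                                            # measurement noise

x, P = np.zeros(4), np.eye(4)                                  # state [px, py, vx, vy]
rng = np.random.default_rng(4)
for k in range(100):
    z = np.array([k * dt * 2.0, 0.0]) + rng.normal(0, 0.7, 2)  # noisy detection
    x, P = F @ x, F @ P @ F.T + Q                              # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)               # Kalman gain
    x = x + K @ (z - H @ x)                                    # update
    P = (np.eye(4) - K @ H) @ P
print("final smoothed position:", np.round(x[:2], 2))
```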

  14. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Science.gov (United States)

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.

  15. A suite of models to support the quantitative assessment of spread in pest risk analysis.

    Directory of Open Access Journals (Sweden)

    Christelle Robinet

    Full Text Available Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
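
    One of the simpler model classes described in these records can be sketched as logistic growth inside an invaded area whose radius expands at a constant rate. The parameter values below are placeholders, not those estimated for Diabrotica virgifera virgifera.

```python
# Hedged sketch: logistic growth plus constant radial range expansion, a
# minimal density-based spread model; parameter values are placeholders.
import numpy as np

r = 0.8            # intrinsic growth rate, 1/yr
K = 1.0            # carrying capacity (relative density)
c = 40.0           # radial spread rate of the invasion front, km/yr
density, radius = 0.01, 5.0

for year in range(1, 11):
    density += r * density * (1.0 - density / K)   # growth within the invaded area
    radius += c                                    # front advances at constant speed
    print(f"year {year:2d}: density {density:.2f}, invaded area {np.pi * radius**2:9.0f} km2")
```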

  16. Quantitative biokinetic analysis of radioactively labelled, inhaled Titanium dioxide Nanoparticles in a rat model

    Energy Technology Data Exchange (ETDEWEB)

    Kreyling, Wolfgang G.; Wenk, Alexander; Semmler-Behnke, Manuela [Helmholtz Zentrum Muenchen, Deutsches Forschungszentrum fuer Gesundheit und Umwelt GmbH (Germany). Inst. fuer Lungenbiologie und Erkrankungen, Netzwerk Nanopartikel und Gesundheit

    2010-09-15

    The aim of this project was the determination of the biokinetics of TiO{sub 2} nanoparticles (NP) in the whole body of healthy adult rats after NP administration to the respiratory tract - either via inhalation or instillation. We developed our own methodology to freshly synthesize and aerosolize TiO{sub 2}-NP in our lab for use in inhalation studies. These NP underwent a detailed physical and chemical characterization, providing pure polycrystalline anatase TiO{sub 2}-NP of about 20 nm (geometric standard deviation 1.6) and a specific surface area of 270 m{sup 2}/g. In addition, we developed techniques for sufficiently stable radioactive {sup 48}V labelling of the TiO{sub 2} NP. The kinetics of solubility of {sup 48}V was thoroughly determined. The quantitative biokinetics methodology allows for a quantitative balance of retained and excreted NP relative to the administered NP dose and provides a much more precise determination of NP fractions and concentrations in organs and tissues of interest as compared to spotting biokinetics studies. Small fractions of TiO{sub 2}-NP translocate across the air-blood barrier and accumulate in secondary target organs, soft tissue and skeleton. The amount of translocated TiO{sub 2}-NP is approximately 2% of the TiO{sub 2}-NP deposited in the lungs. A prominent fraction of these translocated TiO{sub 2}-NP was found in the remainder. Smaller amounts of TiO{sub 2}-NP accumulate in secondary organs following particular kinetics. TiO{sub 2}-NP translocation was largely accomplished within the first 2-4 hours after inhalation, followed by retention in all organs and tissues studied without any detectable clearance of these biopersistent TiO{sub 2}-NP within 28 days. Therefore, our data suggest that crossing of the air-blood barrier of the lungs and subsequent accumulation in secondary organs and tissues depends on the NP material and its physico-chemical properties. Furthermore, we extrapolate that during repeated or chronic

  17. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.

  18. Two-Stage Analysis on Models for Quantitative Differentiation of Early-Pathological Bladder States

    Directory of Open Access Journals (Sweden)

    Nina Kalyagina

    2014-01-01

    Full Text Available A mathematical simulation method was developed for visualization of the diffusely reflected light on the surface of 3-layered models of the urinary bladder wall. Five states of the urinary bladder epithelium, from normal to precancerous, were simulated. With the use of solutions of classical electrodynamics equations, scattering coefficients μs and asymmetry parameters g of the bladder epithelium were found in order to perform Monte Carlo calculations. The results, compared with experimental studies, revealed the influence of changes in absorption and scattering properties on the distribution of the diffuse-reflectance signal on the surfaces of the modelled media.
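
    One standard ingredient of such Monte Carlo calculations is sampling the scattering angle from the Henyey-Greenstein phase function for a given asymmetry parameter g. The sketch below is generic tissue-optics code, not the authors' implementation, and the g value is a placeholder.

```python
# Hedged sketch: sample scattering-angle cosines from the Henyey-Greenstein
# phase function, as commonly used in Monte Carlo photon transport in tissue.
import numpy as np

def sample_cos_theta(g, rng, n):
    """Draw n cos(theta) samples for asymmetry parameter g via the inverse CDF."""
    u = rng.random(n)
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0                          # isotropic limit
    frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - frac * frac) / (2.0 * g)

rng = np.random.default_rng(5)
cos_t = sample_cos_theta(0.9, rng, 100_000)           # forward-peaked scattering
print(f"mean cos(theta) = {cos_t.mean():.3f} (should be close to g = 0.9)")
```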

  19. Quantitative strain analysis in analogue modelling experiments: insights from X-ray computed tomography and tomographic image correlation

    Science.gov (United States)

    Adam, J.; Klinkmueller, M.; Schreurs, G.; Wieneke, B.

    2009-04-01

    The combination of scaled analogue modelling experiments, advanced research in analogue material mechanics (Lohrmann et al. 2003, Panien et al. 2006), X-ray computed tomography and new high-resolution deformation monitoring techniques (2D/3D Digital Image Correlation) is a powerful new tool not only for examining the evolution and interaction of faulting in analogue models, but also for evaluating relevant controlling factors such as mechanics, sedimentation, erosion and climate. This is of particular interest for applied problems in the energy sector (e.g., structurally complex reservoirs, LG & CO2 underground storage) because the results are essential for geological and seismic interpretation as well as for more realistically constrained fault/fracture simulations and reservoir characterisation. X-ray computed tomography (CT) analysis has been successfully applied to analogue models since the late 1980s. This technique permits visualisation of the interior of an analogue model without destroying it. Technological improvements have resulted in more powerful X-ray CT scanners that allow periodic acquisition of volumetric data sets, thus making it possible to follow the 3-D evolution of the model structures with time (e.g. Schreurs et al., 2002, 2003). Optical strain monitoring (Digital Image Correlation, DIC) in analogue experiments (Adam et al., 2005) represents an important advance in quantitative physical modelling and in helping to understand non-linear rock deformation processes. Optical non-intrusive 2D/3D strain and surface flow analysis by DIC is a new methodology in physical modelling that enables the complete quantification of localised and distributed model deformation. The increase in spatial/temporal strain data resolution of several orders of magnitude makes physical modelling - used for decades to visualize the kinematics of geological deformation - a unique research tool to determine what fundamental physical processes control tectonic

  20. Quantitative analysis of anaerobic oxidation of methane (AOM) in marine sediments: a modeling perspective

    NARCIS (Netherlands)

    Regnier, P.; Dale, A.W.; Arndt, S.; LaRowe, D.E.; Mogollon, J.M.; Van Cappellen, P.

    2011-01-01

    Recent developments in the quantitative modeling of methane dynamics and anaerobic oxidation of methane (AOM) in marine sediments are critically reviewed. The first part of the review begins with a comparison of alternative kinetic models for AOM. The roles of bioenergetic limitations, intermediate c

  1. Diagnosis of prostate cancer by quantitative analysis of 3DMRSI data: A new model

    Institute of Scientific and Technical Information of China (English)

    QUAN Hong; WANG Xiaoying; BAO Shanglian; WANG Huiliang; LI Feiyu; HUANG Rong

    2005-01-01

    Three-dimensional magnetic resonance spectroscopic imaging (3DMRSI) is helpful to distinguish prostate cancer (PC) from benign prostate hyperplasia (BPH) and to show the distribution of tumor infiltration. Combined with the (Cho + Cre)/Cit ratio, the z-score model can effectively discriminate prostate cancer from stromal benign prostate hyperplasia (sBPH) and detect small malignant lesions (SML).

  2. Quantitative Analysis Linking Inner Hair Cell Voltage Changes and Postsynaptic Conductance Change: A Modelling Study

    Directory of Open Access Journals (Sweden)

    Andreas N. Prokopiou

    2015-01-01

    Full Text Available This paper presents a computational model which estimates the postsynaptic conductance change of the mammalian Type I afferent peripheral process when airborne acoustic waves impact on the tympanic membrane. A model of the human auditory periphery is used to estimate the inner hair cell potential change in response to airborne sound. A generic and tunable topology of the mammalian synaptic ribbon is generated, and the voltage dependence of its substructures is used to calculate discrete and probabilistic neurotransmitter vesicle release. Results suggest an almost linear relationship between increasing sound level (in dB SPL) and the postsynaptic conductance for frequencies considered too high for neurons to phase lock with (i.e., a few kHz). Furthermore, coordinated vesicle release is shown for up to 300–400 Hz, and a mechanism of phase shifting the subharmonic content of a stimulating signal is suggested. Model outputs suggest that strong onset response and highly synchronised multivesicular release rely on compound fusion of ribbon-tethered vesicles.

  3. Heat strain imposed by personal protective ensembles: quantitative analysis using a thermoregulation model

    Science.gov (United States)

    Xu, Xiaojiang; Gonzalez, Julio A.; Santee, William R.; Blanchard, Laurie A.; Hoyt, Reed W.

    2016-07-01

    The objective of this paper is to study the effects of personal protective equipment (PPE) and specific PPE layers, characterized by their thermal/evaporative resistances and mass, on heat strain during physical activity. A stepwise thermal manikin testing and modeling approach was used to analyze a PPE ensemble with four layers: uniform, ballistic protection, chemical protective clothing, and mask and gloves. The PPE was tested on a thermal manikin, starting with the uniform and adding an additional layer in each step. Wearing PPE increases the metabolic rate Ṁ, so Ṁ was adjusted according to the mass of each of the four configurations. A human thermoregulatory model was used to predict endurance time for each configuration at a fixed Ṁ and at its mass-adjusted Ṁ. Reductions in endurance time due to resistances and due to mass were determined separately from the predicted results. Fractional contributions of the PPE's thermal/evaporative resistances by layer show that the ballistic protection and the chemical protective clothing layers contribute about 20% each. Wearing the ballistic protection over the uniform reduced endurance time from 146 to 75 min, with 31 min of the decrement due to the additional resistances of the ballistic protection and 40 min due to the increased Ṁ associated with the additional mass. Effects of mass on heat strain are thus of a similar magnitude to the effects of increased resistances. Reducing resistances and mass can both significantly alleviate heat strain.

  4. The Quantitative Analysis of User Behavior Online - Data, Models and Algorithms

    Science.gov (United States)

    Raghavan, Prabhakar

    By blending principles from mechanism design, algorithms, machine learning and massive distributed computing, the search industry has become good at optimizing monetization on sound scientific principles. This represents a successful and growing partnership between computer science and microeconomics. When it comes to understanding how online users respond to the content and experiences presented to them, we have more of a lacuna in the collaboration between computer science and certain social sciences. We will use a concrete technical example from image search results presentation, developing in the process some algorithmic and machine learning problems of interest in their own right. We then use this example to motivate the kinds of studies that need to grow between computer science and the social sciences; a critical element of this is the need to blend large-scale data analysis with smaller-scale eye-tracking and "individualized" lab studies.

  5. Quantitative analysis of irreversibilities causes voltage drop in fuel cell (simulation and modeling)

    Energy Technology Data Exchange (ETDEWEB)

    Ghadamian, Hossein [Azad Univ., Dept. of Energy Engineering, Tehran (Iran); Saboohi, Yadolah [Sharif Energy Research Inst. (SERI), Tehran (Iran)

    2004-11-30

    The power level of a fuel cell depends on its operating condition, which is the product of voltage and current density. The highest voltage level is the reversible open circuit voltage (ROCV), which represents the ideal theoretical case [J. Larminie, A. Dicks, Fuel Cell Systems Explained, Wiley, 2000]. Compared to that is the ideal operating voltage, usually characterized as the open circuit voltage (OCV). Evaluating the deviation of the operating voltage from the ideal case provides information on the extent to which the voltage and energy efficiency of a fuel cell can be improved. Quantifying the deviation of operation from OCV is therefore the main point discussed in the present paper. The analysis of voltage drop proceeds by a step-by-step review of the losses due to activation, internal currents (fuel crossover), Ohmic resistance, and mass transport (concentration). The accumulated total voltage drop is estimated as the sum of the aforementioned losses and is then subtracted from the OCV to obtain the operating voltage level. This numerical analysis has been applied to identify the extent of each voltage drop. The possible variables for reducing voltage drops were reviewed, and it was concluded that the activation loss has a considerable impact on the total voltage drop and explains the largest part of the total losses. It was also found that voltage drops decrease with: 1. increasing temperature; 2. increasing pressure; 3. increasing hydrogen or oxygen concentration; 4. increasing effective electrode surface; 5. improving electrode and electrolyte conductivity; 6. reducing electrolyte thickness as far as practical limits allow; 7. improving connections. (Author)
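
    The loss bookkeeping described above can be illustrated with the textbook-style polarization relation V = E_OCV - dV_act - dV_ohm - dV_conc (cf. Larminie and Dicks). A minimal sketch; all parameter values below are illustrative assumptions, not fitted to any particular cell:

        import math

        # Illustrative loss-term parameters; i in mA/cm^2, voltages in V.
        E_OCV = 1.2       # ideal open-circuit voltage
        A     = 0.06      # activation (Tafel) slope
        i0    = 0.04      # exchange current density
        i_n   = 3.0       # internal current / fuel crossover
        r     = 3.0e-4    # area-specific resistance, kOhm*cm^2
        m, n  = 3.0e-5, 8.0e-3  # mass-transport (concentration) parameters

        def operating_voltage(i):
            """OCV minus activation, ohmic and concentration losses."""
            act  = A * math.log((i + i_n) / i0)  # activation incl. crossover
            ohm  = i * r
            conc = m * math.exp(n * i)
            return E_OCV - act - ohm - conc

        for i in (10, 100, 500, 900):
            print(f"i = {i:4d} mA/cm^2 -> V = {operating_voltage(i):.3f} V")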

  6. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases - Quantitative analysis of strategies to achieve the 2020 goals for neglected tropical diseases: Where are we now?

    NARCIS (Netherlands)

    T.D. Hollingsworth (T. Déirdre); E.R. Adams (Emily R.); R.M. Anderson (Roy M.); K. Atkins (Katherine); S. Bartsch (Sarah); M-G. Basáñez (María-Gloria); M. Behrend (Matthew); D.J. Blok (David); L.A.C. Chapman (Lloyd A. C.); L.E. Coffeng (Luc); O. Courtenay (Orin); R.E. Crump (Ron E.); S.J. de Vlas (Sake); A.P. Dobson (Andrew); L. Dyson (Louise); H. Farkas (Hajnal); A.P. Galvani (Alison P.); M. Gambhir (Manoj); D. Gurarie (David); M.A. Irvine (Michael A.); S. Jervis (Sarah); M.J. Keeling (Matt J.); L. Kelly-Hope (Louise); C. King (Charles); B.Y. Lee (Bruce Y.); E.A. le Rutte (Epke A.); T.M. Lietman (Thomas M.); M. Ndeffo-Mbah (Martial); G.F. Medley (Graham F.); E. Michael (Edwin); A. Pandey (Abhishek); J.K. Peterson (Jennifer K.); A. Pinsent (Amy); T.C. Porco (Travis C.); J.H. Richardus (Jan Hendrik); L. Reimer (Lisa); K.S. Rock (Kat S.); B.K. Singh (Brajendra K.); W.A. Stolk (Wilma); S. Swaminathan (Subramanian); S.J. Torr (Steve J.); J. Townsend (Jeffrey); J. Truscott (James); M. Walker (Martin); A. Zoueva (Alexandra)

    2015-01-01

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview.

  8. Quantitative analysis of glycated proteins.

    Science.gov (United States)

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
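
    The doublet spacing follows directly from the labeling chemistry: six 13C atoms per glucose add six times the 13C-12C mass difference per glycation site, and the observed m/z spacing divides by the charge state. A minimal sketch (the helper name is ours):

        # Expected m/z spacing of a GIL doublet: 6 x (13C - 12C) per
        # [13C6]-glucose, times the number of glycation sites, divided by z.
        C13_C12 = 1.0033548378  # Da, mass difference between 13C and 12C

        def doublet_spacing(n_sites, charge):
            return 6 * C13_C12 * n_sites / charge

        # ~6, ~3 and ~2 m/z units for one site at charge 1+, 2+ and 3+,
        # matching the +6, +3 or +2 Da shifts described in the abstract.
        for z in (1, 2, 3):
            print(f"1 site, {z}+: {doublet_spacing(1, z):.3f}")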

  9. The TX-model - a quantitative heat loss analysis of district heating pipes by means of IR surface temperature measurements

    Energy Technology Data Exchange (ETDEWEB)

    Zinki, Heimo [ZW Energiteknik, Nykoeping (Sweden)

    1996-11-01

    The aim of this study was to investigate the possibility of analysing the temperature profile at the ground surface above buried district heating pipes in such a way as to enable the quantitative determination of heat loss from the pair of pipes. In practical applications, it is supposed that this temperature profile is generated by means of advanced IR thermography. For this purpose, the principle of the TX-model has been developed, based on the fact that the heat losses from pipes buried in the ground have a temperature signature on the ground surface. Qualitative analysis of this temperature signature is well known and in practical use for detecting leaks from pipes; such techniques primarily make use of relative changes of the temperature pattern along the pipe. In the quantitative heat loss analysis, however, it is presumed that the temperature profile across the pipes is related to the pipe heat loss per unit length. The basic idea is that the integral of the temperature profile perpendicular to the pipe, called TX, is a function of the heat loss, but is also affected by other parameters such as burial depth, heat diffusivity, wind, precipitation and so on. In order to analyse the parameters influencing the TX-factor, a simulation model for the energy balance at the ground surface has been developed. This model includes the heat flow from the pipe to the surface and the heat exchange at the surface with the environment due to convection, latent heat change, and solar and long-wave radiation. The simulation gives the surprising result that the TX-factor is by and large unaffected during the course of a day, even when the sun is shining, as long as other climate conditions are relatively stable (low wind, no rain, no shadows). The results from the simulations were verified at different sites in Denmark, Finland, Sweden and the USA through a co-operative research program organised and partially financed by the IEA District Heating Programme, Task III.
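
    The core of the TX-model reduces to a one-dimensional integral of the excess surface temperature across a transect perpendicular to the pipe. A minimal numerical sketch, with a synthetic Gaussian profile standing in for the IR scan and a purely illustrative calibration comment:

        import numpy as np

        # Synthetic excess-temperature transect; in practice this would come
        # from an IR thermography scan across the pipe.
        x = np.linspace(-3.0, 3.0, 121)            # m, transect coordinate
        dT = 1.5 * np.exp(-(x / 0.8) ** 2)         # K (illustrative profile)

        # Trapezoidal integration of the profile gives the TX factor (K*m).
        TX = float(np.sum(0.5 * (dT[1:] + dT[:-1]) * np.diff(x)))
        print(f"TX = {TX:.3f} K*m")
        # A calibration factor k from the energy-balance simulation would
        # then turn TX into a heat loss per unit pipe length, q = k * TX.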

  10. Review time in peer review: quantitative analysis and modelling of editorial workflows.

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Agata; Fronczak, Piotr; Nedic, Olgica; Ausloos, Marcel

    In this paper, we undertake a data-driven theoretical investigation of editorial workflows. We analyse a dataset containing information about 58 papers submitted to the Biochemistry and Biotechnology section of the Journal of the Serbian Chemical Society. We separate the peer review process into stages that each paper has to go through and introduce the notion of completion rate - the probability that an invitation sent to a potential reviewer will result in a finished review. Using empirical transition probabilities and probability distributions of the duration of each stage we create a directed weighted network, the analysis of which allows us to obtain the theoretical probability distributions of review time for different classes of reviewers. These theoretical distributions underlie our numerical simulations of different editorial strategies. Through these simulations, we test the impact of some modifications of the editorial policy on the efficiency of the whole review process. We discover that the distribution of review time is similar for all classes of reviewers, and that the completion rate of reviewers known personally by the editor is very high, which means that they are much more likely to answer the invitation and finish the review than other reviewers. Thus, the completion rate is the key factor that determines the efficiency of each editorial policy. Our results may be of great importance for editors and act as a guide in determining the optimal number of reviewers.
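
    The editorial workflow lends itself to a simple Monte Carlo sketch: invitations succeed with a completion-rate probability, and each stage adds a random delay. The rate and duration distributions below are illustrative placeholders, not the paper's empirical estimates:

        import random

        def review_time(p_complete=0.7, rng=random.Random(42)):
            """Days from first invitation until a finished review arrives."""
            t = 0.0
            while True:
                t += rng.expovariate(1 / 3.0)   # wait for invitee's response
                if rng.random() < p_complete:   # invitation yields a review
                    return t + rng.expovariate(1 / 14.0)  # reviewing itself
                # otherwise another reviewer must be invited

        times = [review_time() for _ in range(10_000)]
        print(f"mean review time: {sum(times) / len(times):.1f} days")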

  11. Interdiffusion of the aluminum-magnesium system. Quantitative analysis and numerical model

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion, thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminum coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg system are calculated with the Sauer-Freise method for the first time. To resolve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
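
    As a much-reduced sketch of the numerical approach, Fick's second law can be advanced with an explicit finite-difference scheme on a diffusion couple. The thesis uses concentration-dependent coefficients and a refined, temporally varied discretization; the constant coefficient and grid below are illustrative:

        import numpy as np

        nx, dx = 200, 1e-6            # grid points, cell size (m)
        D, dt = 1e-14, 0.01           # diffusion coefficient (m^2/s), step (s)
        assert D * dt / dx**2 <= 0.5  # stability of the explicit scheme

        c = np.zeros(nx)
        c[: nx // 2] = 1.0            # Al fraction: pure Al left, pure Mg right
        for _ in range(100_000):      # ~1000 s of simulated diffusion
            c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        # c now approximates the interdiffused concentration profile.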

  12. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. Copyright © 2011 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
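
    Fractal dimension by box counting is straightforward to sketch: count occupied boxes at several box sizes and fit the log-log slope. A minimal version (the size ladder is an arbitrary choice):

        import numpy as np

        def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
            """Fractal dimension of a binary image via box counting."""
            counts = []
            for s in sizes:
                h, w = img.shape[0] // s * s, img.shape[1] // s * s
                blocks = img[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(int((blocks.max(axis=(1, 3)) > 0).sum()))
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope  # D is the negated slope of log N(s) vs log s

        # Sanity check: a filled square should give D close to 2.
        print(box_counting_dimension(np.ones((128, 128), dtype=np.uint8)))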

  13. Defect evolution in cosmology and condensed matter: quantitative analysis with the velocity-dependent one-scale model

    CERN Document Server

    Martins, C J A P

    2016-01-01

    This book sheds new light on topological defects in widely differing systems, using the Velocity-Dependent One-Scale Model to better understand their evolution. Topological defects - cosmic strings, monopoles, domain walls or others - necessarily form at cosmological (and condensed matter) phase transitions. If they are stable and long-lived they will be fossil relics of higher-energy physics. Understanding their behaviour and consequences is a key part of any serious attempt to understand the universe, and this requires modelling their evolution. The velocity-dependent one-scale model is the only fully quantitative model of defect network evolution, and the canonical model in the field. This book provides a review of the model, explaining its physical content and describing its broad range of applicability.
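
    For orientation, the VOS equations for the network correlation length L and RMS velocity v take the form dL/dt = (1 + v^2)*H*L + (c/2)*v and dv/dt = (1 - v^2)*(k/L - 2*H*v). A minimal Euler integration in the radiation era; the constants c and k below are illustrative rather than the calibrated values discussed in the book:

        # Euler integration of the VOS equations in the radiation era,
        # where the Hubble rate is H = 1/(2t). Constants are illustrative.
        c, k = 0.23, 0.7
        t, dt = 1.0, 1e-3
        L, v = 0.1, 0.1
        for _ in range(200_000):
            H = 1 / (2 * t)
            L += dt * ((1 + v**2) * H * L + 0.5 * c * v)
            v += dt * ((1 - v**2) * (k / L - 2 * H * v))
            t += dt
        # Linear scaling: L/t and v approach constant attractor values.
        print(f"L/t = {L / t:.3f}, v = {v:.3f}")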

  14. Quantitative spectroscopy of extreme helium stars: Model atmospheres and a non-LTE abundance analysis of BD+10°2179

    Science.gov (United States)

    Kupfer, T.; Przybilla, N.; Heber, U.; Jeffery, C. S.; Behara, N. T.; Butler, K.

    2017-10-01

    Extreme helium stars (EHe stars) are hydrogen-deficient supergiants of spectral type A and B. They are believed to result from mergers in double degenerate systems. In this paper, we present a detailed quantitative non-LTE spectral analysis for BD+10°2179, a prototype of this rare class of stars, using UV-Visual Echelle Spectrograph and Fiber-fed Extended Range Optical Spectrograph spectra covering the range from ∼3100 to 10 000 Å. Atmosphere model computations were improved in two ways. First, since the UV metal line blanketing has a strong impact on the temperature-density stratification, we used the atlas12 code; we also tested atlas12 against the benchmark code sterne3, and found only small differences in the temperature and density stratifications, and good agreement with the spectral energy distributions. Secondly, 12 chemical species were treated in non-LTE. Pronounced non-LTE effects occur in individual spectral lines but, for the majority, the effects are moderate to small. The spectroscopic parameters give Teff = 17 300 ± 300 K and log g = 2.80 ± 0.10, and an evolutionary mass of 0.55 ± 0.05 M⊙. The star is thus slightly hotter, more compact and less massive than found in previous studies. The kinematic properties imply a thick-disc membership, which is consistent with the metallicity [Fe/H] ≈ -1 and α-enhancement. The refined light-element abundances are consistent with the white dwarf merger scenario. We further discuss the observed helium spectrum in an appendix, detecting dipole-allowed transitions from about 150 multiplets plus the most comprehensive set of known/predicted isolated forbidden components to date. Moreover, a so far unreported series of pronounced forbidden He I components is detected in the optical-UV.

  15. Quantitative magnetic resonance analysis and a morphometric predictive model reveal lean body mass changes in migrating Nearctic-Neotropical passerines.

    Science.gov (United States)

    Seewagen, Chad L; Guglielmo, Christopher G

    2011-04-01

    Most studies of lean mass dynamics in free-living passerine birds have focused on Old World species at geographical barriers where they are challenged to make the longest non-stop flight of their migration. We examined lean mass variation in New World passerines in an area where the distribution of stopover habitat does not require flights to exceed more than a few hours and most migrants stop flying well before fat stores near exhaustion. We used either quantitative magnetic resonance (QMR) analysis or a morphometric model to measure or estimate, respectively, the fat and lean body mass of migrants during stopovers in New York, USA. With these data, we examined (1) variance in total body mass explained by lean body mass, (2) hourly rates of fat and lean body mass change in single-capture birds, and (3) net changes in fat and lean mass in recaptured birds. Lean mass contributed to 50% of the variation in total body mass among white-throated sparrows Zonotrichia albicollis and hermit thrushes Catharus guttatus. Lean mass of refueling gray catbirds Dumetella carolinensis and white-throated sparrows, respectively, increased 1.123 and 0.320 g h^-1. Lean mass of ovenbirds Seiurus aurocapillus accounted for an estimated 33-40% of hourly gains in total body mass. On average 35% of the total mass gained among recaptured birds was lean mass. Substantial changes in passerine lean mass are not limited to times when birds are forced to make long, non-stop flights across barriers. Protein usage during migration is common across broad taxonomic groups, migration systems, and migration strategies.

  16. 2D Hydrodynamic Based Logic Modeling Tool for River Restoration Decision Analysis: A Quantitative Approach to Project Prioritization

    Science.gov (United States)

    Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.

    2014-12-01

    In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor at which to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al. 2008).

  17. Hidden Markov Model for quantitative prediction of snowfall and analysis of hazardous snowfall events over Indian Himalaya

    Science.gov (United States)

    Joshi, J. C.; Tankeshwar, K.; Srivastava, Sunita

    2017-04-01

    A Hidden Markov Model (HMM) has been developed for the prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992-2012). There are six observations and six states in the model. The most probable observation and state sequences have been computed using the Forward and Viterbi algorithms, respectively. The Baum-Welch algorithm has been used for optimizing the model parameters. The model has been validated for two winters (2012-2013 and 2013-2014) by computing the root mean square error (RMSE) and accuracy measures such as percent correct (PC), critical success index (CSI) and Heidke skill score (HSS). The RMSE of the model has also been calculated using the leave-one-out cross-validation method. Snowfall predicted by the model during hazardous snowfall events in different parts of the Himalaya matches well with the observed values. The HSS of the model for all stations implies that the optimized model has better forecasting skill than a random forecast for both days. The RMSE of the optimized model has also been found to be smaller than the persistence forecast and the standard deviation for both days.
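
    The forward recursion at the heart of such an HMM is compact enough to sketch directly; the transition, emission and initial distributions below are random placeholders rather than the Baum-Welch-optimized parameters of the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 6                                   # six states, six observations
        A  = rng.dirichlet(np.ones(n), size=n)  # transition matrix (placeholder)
        B  = rng.dirichlet(np.ones(n), size=n)  # emission matrix (placeholder)
        pi = np.full(n, 1 / n)                  # initial state distribution

        def forward_likelihood(obs):
            """P(observation sequence | model) by the forward recursion."""
            alpha = pi * B[:, obs[0]]
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
            return float(alpha.sum())

        print(forward_likelihood([0, 3, 2, 5, 1]))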

  19. Quantitative analysis of surface deformation and ductile flow in complex analogue geodynamic models based on PIV method.

    Science.gov (United States)

    Krýza, Ondřej; Lexa, Ondrej; Závada, Prokop; Schulmann, Karel; Gapais, Denis; Cosgrove, John

    2017-04-01

    Recently, the PIV (particle image velocimetry) method, an optical technique, has been widely used in many technical branches where the visualization and quantification of material flow is important. Typical examples are studies of liquid flow through complex channel systems, gas spreading, or combustion problems. In our current research we used this method for the investigation of two types of complex analogue geodynamic and tectonic experiments. The first class of experiments aims to model large-scale oroclinal buckling as an analogue of the late Paleozoic to early Mesozoic evolution of the Central Asian Orogenic Belt (CAOB) resulting from the northward drift of the North China craton towards the Siberian craton. Here we studied the relationship between lower crustal and lithospheric mantle flows and upper crustal deformation, respectively. The second class of experiments is focused on a more general study of lower crustal flow in indentation systems, which represent a major component of some large hot orogens (e.g. the Bohemian Massif). Most of the simulations in both cases show a strong dependency of the shape of the brittle structures situated in the upper crust on the folding style of the middle and lower ductile layers, which is influenced by the rheological, geometrical and thermal conditions of different parts across the shortened domain. The purpose of applying PIV is to quantify material redistribution in critical domains of the model. The derivation of flow directions and the calculation of strain-rate and total displacement fields in analogue experiments is generally difficult and time-expensive, and is often performed only on the basis of visual evaluation. The PIV method operates on a set of images in which small tracer particles are seeded within the modeled domain and are assumed to faithfully follow the material flow. From the estimated pixel coordinates, the material displacement field, velocity field, strain rate, vorticity, tortuosity, etc. are calculated. In our experiments we used the divergence of the velocity field.
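
    The derived quantities mentioned above follow from finite differences of the velocity field. A minimal sketch on a synthetic field (the field itself is an arbitrary stand-in for PIV output):

        import numpy as np

        y, x = np.mgrid[0:64, 0:64].astype(float)
        u = np.sin(x / 10.0)            # synthetic velocity components
        v = np.cos(y / 10.0)

        du_dy, du_dx = np.gradient(u)   # gradients along rows (y) and cols (x)
        dv_dy, dv_dx = np.gradient(v)
        divergence = du_dx + dv_dy      # sources/sinks of material flow
        vorticity  = dv_dx - du_dy      # local rotation of the flow
        print(divergence.mean(), vorticity.mean())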

  20. Building a Database for a Quantitative Model

    Science.gov (United States)

    Kahn, C. Joseph; Kleinhammer, Roger

    2014-01-01

    A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does not help link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data are used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
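
    A minimal sketch of the metadata-field idea, with a dictionary standing in for the spreadsheet; the key format, field names and numbers are illustrative assumptions:

        # Each Basic Event carries a unique key that joins the model to the
        # database record holding its source and data manipulations.
        failure_data = {
            "VLV-001-FTO": {"source": "reliability handbook (illustrative)",
                            "rate": 1.2e-6,        # failures per hour
                            "stress_factor": 2.0},  # duty-cycle adjustment
        }

        def basic_event_rate(key):
            """Adjusted failure rate for a Basic Event, looked up by key."""
            rec = failure_data[key]
            return rec["rate"] * rec["stress_factor"]

        print(basic_event_rate("VLV-001-FTO"))  # traceable back to its source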

  2. Is the linear modeling technique good enough for optimal form design? A comparison of quantitative analysis models.

    Science.gov (United States)

    Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun

    2012-01-01

    How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. A consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms in the design process. The consumer-oriented design approach uses quantification theory type I, grey prediction (the linear modeling technique), and neural networks (the nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and product form elements of personal digital assistants (PDAs). The result of performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.

  3. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  4. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  5. Qualitative and simultaneous quantitative analysis of cimetidine polymorphs by ultraviolet-visible and shortwave near-infrared diffuse reflectance spectroscopy and multivariate calibration models.

    Science.gov (United States)

    Feng, Yuyan; Li, Xiangling; Xu, Kailin; Zou, Huayu; Li, Hui; Liang, Bing

    2015-02-01

    The object of the present study was to investigate the feasibility of applying ultraviolet-visible and shortwave near-infrared diffuse reflectance spectroscopy (UV-vis-SWNIR DRS) coupled with chemometrics to the qualitative and simultaneous quantitative analysis of drug polymorphs, using cimetidine as a model drug. Three polymorphic forms (A, B and D) and a mixed crystal (M1) of cimetidine, obtained by preparation under different crystallization conditions, were characterized by microscopy, X-ray powder diffraction (XRPD) and infrared spectroscopy (IR). Discriminant models of the four forms (A, B, D and M1) were established by discriminant partial least squares (PLS-DA) using differently pretreated spectra. The R and RMSEP of samples in the prediction set for the discriminant model with original spectra were 0.9959 and 0.1004. Among the quantitative models of binary mixtures (A and D) established by partial least squares (PLS) and least squares-support vector machine (LS-SVM) with differently pretreated spectra, the LS-SVM models based on original and MSC spectra had the better prediction performance, with an R of 1.0000 and an RMSEP of 0.0134 for form A, and an R of 1.0000 and an RMSEP of 0.0024 for form D. For ternary mixtures, the established PLS quantitative models based on normalized spectra had relatively better prediction performance for forms A, B and D, with R values of 0.9901, 0.9820 and 0.9794 and RMSEP values of 0.0471, 0.0529 and 0.0594, respectively. This research indicated that UV-vis-SWNIR DRS can be used as a simple, rapid, nondestructive qualitative and quantitative method for the analysis of drug polymorphs. Copyright © 2014 Elsevier B.V. All rights reserved.
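
    A compact sketch of the quantitative step, building a PLS calibration on synthetic two-component "spectra"; the Gaussian bands and noise level are illustrative stand-ins for the measured DRS data:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        wl = np.linspace(0, 1, 300)                 # pseudo-wavelength axis
        pure_A = np.exp(-((wl - 0.3) / 0.05) ** 2)  # pure-form profiles
        pure_D = np.exp(-((wl - 0.7) / 0.08) ** 2)

        frac_A = rng.uniform(0, 1, 60)              # fraction of form A
        X = np.outer(frac_A, pure_A) + np.outer(1 - frac_A, pure_D)
        X += rng.normal(0, 0.01, X.shape)           # measurement noise

        model = PLSRegression(n_components=3).fit(X[:40], frac_A[:40])
        pred = model.predict(X[40:]).ravel()
        print(f"RMSEP: {np.sqrt(np.mean((pred - frac_A[40:]) ** 2)):.4f}")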

  6. Inhibition of bacterial conjugation by phage M13 and its protein g3p: quantitative analysis and model.

    Directory of Open Access Journals (Sweden)

    Abraham Lin

    Full Text Available Conjugation is the main mode of horizontal gene transfer that spreads antibiotic resistance among bacteria. Strategies for inhibiting conjugation may be useful for preserving the effectiveness of antibiotics and preventing the emergence of bacterial strains with multiple resistances. Filamentous bacteriophages were first observed to inhibit conjugation several decades ago. Here we investigate the mechanism of inhibition and find that the primary effect on conjugation is occlusion of the conjugative pilus by phage particles. This interaction is mediated primarily by phage coat protein g3p, and exogenous addition of the soluble fragment of g3p inhibited conjugation at low nanomolar concentrations. Our data are quantitatively consistent with a simple model in which association between the pili and phage particles or g3p prevents transmission of an F plasmid encoding tetracycline resistance. We also observe a decrease in the donor ability of infected cells, which is quantitatively consistent with a reduction in pili elaboration. Since many antibiotic-resistance factors confer susceptibility to phage infection through expression of conjugative pili (the receptor for filamentous phage, these results suggest that phage may be a source of soluble proteins that slow the spread of antibiotic resistance genes.

  7. QUANTITATIVE ANALYSIS AND FRACTAL MODELING ON THE MOSAIC STRUCTURE OF LANDSCAPE IN THE CENTRAL AREA OF SHANGHAI METROPOLIS

    Institute of Scientific and Technical Information of China (English)

    XU Jian-hua; AI Nan-shan; CHEN Yong; MEI An-xin; LIAO Hong-juan

    2003-01-01

    The mosaic structure of the landscape of the central area of Shanghai Metropolis is studied in this paper by quantitative methods of landscape ecology based on Remote Sensing (RS) and Geographic Information Systems (GIS). Firstly, landscapes are classified into eight categories: residential quarter, industrial quarter, road, other urban landscape, farmland, village and small town, on-building area, and river and other water bodies (such as lakes). Secondly, a GIS is designed and set up based on the remote sensing data and field investigation, and a digital map of the landscape mosaic is made. Then the indexes of diversity, dominance, fragmentation and isolation, and the fractal dimension of each type of landscape in different periods are calculated using the spatial analysis methods of GIS. With reference to the calculated results, a series of related issues is discussed.
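
    Two of the indexes are easy to sketch from the area share p_i of each landscape type: Shannon diversity H = -sum(p_i ln p_i) and dominance D = Hmax - H. The area percentages below are illustrative, not the paper's measured values:

        import numpy as np

        areas = np.array([35.0, 20.0, 12.0, 10.0, 8.0, 6.0, 5.0, 4.0])  # % cover
        p = areas / areas.sum()          # share of each of the 8 types

        H = -(p * np.log(p)).sum()       # Shannon diversity
        H_max = np.log(len(p))           # maximum possible diversity
        print(f"H = {H:.3f}, dominance = {H_max - H:.3f}")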

  8. Quantitative analysis of SILCs (stress-induced leakage currents) based on the inelastic trap-assisted tunneling model

    Science.gov (United States)

    Kamohara, Shiro; Okuyama, Yutaka; Manabe, Yukiko; Okuyama, Kosuke; Kubota, Katsuhiko; Park, Donggun; Hu, Chenming

    1999-09-01

    We have successfully developed a new quantitative analytical ITAT-based SILC model which can explain both field dependencies, i.e. the direct-tunneling (DT)-field dependence of A-mode SILCs and the Fowler-Nordheim (FN)-field dependence of B-mode SILCs. While the DT-field dependence of the A-mode comes from single-trap-assisted tunneling, the FN-field dependence of the B-mode originates from tunneling via multi-trap leakage paths. We have also developed an analytical model for the anomalous SILC of the flash memory cell and investigated the retention-lifetime properties of failure bits. The anomalous SILC shows the DT-field dependence because of tunneling via an incomplete multi-trap path. A remarkable feature of the retention characteristics predicted by our models is a nearly logarithmic time dependence; the Fowler-Nordheim tunneling model leads to an overestimation of lifetime in the low-Vth region. To take into account the position of each trap and clarify the detailed characteristics of SILC, we have proposed a new Monte Carlo-like approach for hopping conduction and successfully explained the anomalous SILC using only physically based parameters.

  9. The mathematics of cancer: integrating quantitative models.

    Science.gov (United States)

    Altrock, Philipp M; Liu, Lin L; Michor, Franziska

    2015-12-01

    Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.

  10. Comparative analysis of single-species and polybacterial wound biofilms using a quantitative, in vivo, rabbit ear model.

    Directory of Open Access Journals (Sweden)

    Akhil K Seth

    Full Text Available INTRODUCTION: The recent literature suggests that chronic wound biofilms often consist of multiple bacterial species. However, without appropriate in vivo, polybacterial biofilm models, our understanding of these complex infections remains limited. We evaluate and compare the effect of single- and mixed-species biofilm infections on host wound healing dynamics using a quantitative, in vivo, rabbit ear model. METHODS: Six-mm dermal punch wounds in New Zealand rabbit ears were inoculated with Staphylococcus aureus strain UAMS-1, Pseudomonas aeruginosa strain PAO1, or both, totaling 10^6 colony-forming units/wound. Bacterial proliferation and maintenance in vivo were carried out using procedures from our previously published model. Wounds were harvested for histological measurement of wound healing, viable bacterial counts using selective media, or inflammatory cytokine (IL-1β, TNF-α) expression via quantitative reverse-transcription PCR. Biofilm structure was studied using scanning electron microscopy (SEM). For comparison, the biofilm-deficient mutant UAMS-929 replaced strain UAMS-1 in some mixed-species infections. RESULTS: Bacterial counts verified the presence of both strains UAMS-1 and PAO1 in polybacterial wounds. Over time, strain PAO1 became predominant (p<0.001). SEM showed colocalization of both species within an extracellular matrix at multiple time-points. Compared to each monospecies infection, polybacterial biofilms impaired all wound healing parameters (p<0.01) and increased expression of IL-1β and TNF-α (p<0.05). In contrast, mixed-species infections using the biofilm-deficient mutant UAMS-929 instead of wild-type strain UAMS-1 showed less wound impairment (p<0.01) with decreased host cytokine expression (p<0.01), despite a bacterial burden and distribution comparable to that of mixed-wild-type wounds. CONCLUSIONS: This study reveals that mixed-species biofilms have a greater impact on wound healing dynamics than their monospecies counterparts.

  11. Analysis of positions and substituents on genotoxicity of fluoroquinolones with quantitative structure-activity relationship and 3D Pharmacophore model.

    Science.gov (United States)

    Fengxian, Chen; Reti, Hai

    2017-02-01

    The genotoxicity values of 21 quinolones were studied to establish a quantitative structure-activity relationship (QSAR) model and a 3D pharmacophore model, used separately to screen the essential positions and substituents that contribute to the genotoxicity of fluoroquinolones (FQs). A full factorial experimental design was performed to analyze the specific main effects and second-order interaction effects of different positions and substituents on genotoxicity, yielding a rational modification scheme that was validated on a typical FQ with genotoxicity and efficacy data. Four positions (1, 5, 7, 8) were finally screened to form the full factorial experimental design, which contained 72 congeners in total. The analysis illustrates that the dominant effect of the 5- and 7-positions on the genotoxicity of FQs is a main effect, while the effect of the 1- and 8-positions is a second-order interaction effect; two adjacent positions always have a stronger second-order interaction effect and lower genotoxicity. The obtained modification scheme was validated on typical FQ congeners; the modified compound has lower genotoxicity, higher synthesis feasibility and higher efficacy. Copyright © 2016 Elsevier Inc. All rights reserved.
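
    The full factorial design itself is a one-liner to enumerate. The substituent sets below are hypothetical placeholders, chosen only so that the combination count matches the 72 congeners reported:

        from itertools import product

        substituents = {                    # hypothetical substituent sets
            "R1": ["ethyl", "cyclopropyl"],
            "R5": ["H", "NH2", "CH3"],
            "R7": ["piperazinyl", "pyrrolidinyl", "methylpiperazinyl"],
            "R8": ["H", "F", "Cl", "OCH3"],
        }
        designs = list(product(*substituents.values()))
        print(len(designs))                 # 2 * 3 * 3 * 4 = 72 combinations
        print(designs[0])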

  12. MR brain image analysis in dementia: From quantitative imaging biomarkers to ageing brain models and imaging genetics.

    Science.gov (United States)

    Niessen, Wiro J

    2016-10-01

    MR brain image analysis has been a consistently active research area in medical image analysis over the past two decades. In this article, it is discussed how the field developed from the construction of tools for automatic quantification of brain morphology, function, connectivity and pathology, to creating models of the ageing brain in normal ageing and disease, and tools for integrated analysis of imaging and genetic data. The current and future role of the field in improved understanding of the development of neurodegenerative disease is discussed, along with its potential for aiding in early and differential diagnosis and prognosis of different types of dementia. For the latter, the use of reference imaging data and reference models derived from large clinical and population imaging studies, and the application of machine learning techniques to these reference data, are expected to play a key role. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Statistical analysis of results from the quantitative mapping of fracture minerals in Laxemar. Site descriptive modelling - complementary studies

    Energy Technology Data Exchange (ETDEWEB)

    Loefgren, Martin (Niressa AB, Norsborg (Sweden)); Sidborn, Magnus (Kemakta Konsult AB, Stockholm (Sweden))

    2010-12-15

    Within the Laxemar site investigation campaign, quantitative mapping of different fracture minerals has been performed. This has been done by studying fracture surfaces of drill core sections from many different boreholes at the Laxemar site /Eklund and Mattsson 2008/. The drill core mapping was focused on the rock in the vicinity of flow anomalies detected by the Posiva Flow Log (PFL). The quantitative mapping was performed only on open fractures. The fracture minerals that were mapped are calcite, chlorite, clay minerals (as a group), hematite, and pyrite. In the present report, data from the quantitative mineral mapping campaign are refined, sorted into different data subsets, and analysed by parametric and non-parametric statistical methods. The data subsets are associated with 17 different rock volumes, representing different elevations, rock domains, fracture domains, and groups of deformation zones. In total 1,852 fractures were mapped at the site, and the most frequent mineral was calcite. Its amount could be quantitatively estimated in 51% of the mapped fractures. Of the other minerals, chlorite was quantitatively estimated in 46%, pyrite in 19%, clay minerals in 16%, and hematite in 0.05% of the mapped fractures. For fractures where the averaged fracture mineral thickness, d_mean [mm], and visible coverage, C_vis [%], could be quantitatively estimated, the following arithmetic means were found: calcite = 0.25 mm and 22%, chlorite = 0.29 mm and 41%, pyrite = 1.3 μm and 0.2%, and clay minerals = 0.15 mm and 35%. These quantities are based on visual inspection of fracture surfaces and do not include the contribution from non-consolidated fracture fillings. It is shown that there is significant spatial variability of d_mean and C_vis within the examined rock volumes. Furthermore, the non-parametric analyses indicate that there are differences in d_mean and C_vis between the different rock volumes.

  14. Quantitative analysis of saccadic search strategy

    NARCIS (Netherlands)

    Over, E.A.B.

    2007-01-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movements.

  16. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, the morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and the similarities between the galaxy mergers are then deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights.
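
    The weighting idea can be sketched compactly: Fisher discriminant scores scale each descriptor's contribution to the distance used for nearest-neighbour matching. The data and scores below are synthetic placeholders for the extracted image descriptors:

        import numpy as np

        rng = np.random.default_rng(3)
        X = rng.normal(size=(50, 20))      # 50 galaxy models x 20 descriptors
        fisher = rng.uniform(0, 1, 20)     # per-descriptor Fisher scores

        def weighted_distance(a, b, w=fisher):
            return float(np.sqrt(np.sum(w * (a - b) ** 2)))

        query = X[0]
        dists = [weighted_distance(query, row) for row in X[1:]]
        print("most similar model:", int(np.argmin(dists)) + 1)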

  17. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed, based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the combinations right-right and left-left, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion arrived at here is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
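
    The left-left/right-right construction is simple to reproduce for a grayscale image array; a smaller score means the composite is closer to the original, a rough analogue of the C value. The metric choice here (mean absolute difference) is our assumption, not the article's:

        import numpy as np

        def symmetry_scores(img):
            """Mean absolute difference of an image from its left-left and
            right-right composites; assumes an even width."""
            h, w = img.shape
            left, right = img[:, : w // 2], img[:, w - w // 2:]
            left_left   = np.hstack([left, left[:, ::-1]])
            right_right = np.hstack([right[:, ::-1], right])
            return (float(np.abs(img - left_left).mean()),
                    float(np.abs(img - right_right).mean()))

        # A perfectly mirror-symmetric test image scores (0.0, 0.0).
        row = np.concatenate([np.arange(32.0), np.arange(32.0)[::-1]])
        print(symmetry_scores(np.tile(row, (64, 1))))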

  18. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  19. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use in order to make the memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and especially analyzed Boehm's C code, which is a real-time mark-sweep GC running under Linux and the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. Reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory management techniques, even for low-level languages. It is more a trade-off for a given system than an all-or-nothing proposition.

  20. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  1. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  3. A practical approach for near infrared spectral quantitative analysis of complex samples using partial least squares modeling

    Institute of Scientific and Technical Information of China (English)

    LIU ZhiChao; MA Xiang; WEN YaDong; WANG Yi; CAI WenSheng; SHAO XueGuang

    2009-01-01

    The number of latent variables (LVs), or the factor number, is a key parameter in PLS modeling for obtaining a correct prediction. Although much work has been done on this issue, it is still a difficult task to determine a suitable LV number in practical use. A method named independent factor diagnostics (IFD) is proposed for investigating the contribution of each LV to the predicted results, on the basis of a discussion of the determination of the LV number in PLS modeling for near infrared (NIR) spectra of complex samples. The NIR spectra of three data sets of complex samples, including a public data set and two tobacco lamina ones, are investigated. It is shown that several high order LVs constitute the main contributions to the predicted results, albeit the contribution of the low order LVs should not be neglected in the PLS models. Therefore, in practical use of PLS for NIR spectral analysis of complex samples, it may be better to use a slightly large LV number.
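
    The LV-number question can be made concrete with a small cross-validation loop, sketched here on synthetic data with scikit-learn (the IFD method itself is not reproduced):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(120, 50))                    # stand-in for NIR spectra
        y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=120)

        # Cross-validated error as a function of the number of latent
        # variables, the usual evidence for choosing the LV number.
        for n_lv in range(1, 11):
            pls = PLSRegression(n_components=n_lv)
            rmse = -cross_val_score(pls, X, y, cv=5,
                                    scoring="neg_root_mean_squared_error").mean()
            print(f"{n_lv:2d} LVs: RMSECV = {rmse:.3f}")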

  4. Countries' climate mitigation commitments under the 'South-North Dialogue' Proposal. A quantitative analysis using the FAIR 2.1 world model

    Energy Technology Data Exchange (ETDEWEB)

    Den Elzen, M.G.J.

    2005-07-01

    The 'South-North Dialogue Proposal', developed by researchers from both developing and industrialised countries, outlines an approach for an 'equitable' differentiation of future climate mitigation commitments among developed and developing countries. This approach is based on the criteria of responsibility, capability and potential to mitigate. The report provides a quantitative analysis of the implications of the proposal in terms of countries' commitments and costs. The analysis focuses on a 'political willingness' scenario and on four scenarios leading to the stabilisation of CO2-equivalent concentrations at 400, 450, 500 and 550 ppm, respectively. Use is made of the new FAIR 2.1 world model, i.e. the FAIR 2.1 model at the level of countries, using as input national-level data for population, GDP and emissions from emission scenarios. The analysis shows that for the stringent stabilisation targets many developing countries will have to take on quantitative mitigation obligations by 2030, even if the Annex I countries adopt ambitious mitigation commitments far beyond the Kyoto obligations. The political willingness scenario will probably not suffice to limit the warming of the earth's atmosphere to below 2°C.

  5. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.
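
    The assumed data-generating model is easy to state in code. The sketch below simulates it with hypothetical effect sizes; it illustrates the probit/latent-variable structure only, not the authors' likelihood-ratio machinery:

        import numpy as np

        rng = np.random.default_rng(2)
        n = 5000
        g = rng.binomial(2, 0.3, size=n)          # SNP genotype: 0/1/2 minor alleles

        beta_q, beta_l, rho = 0.2, 0.15, 0.4      # hypothetical pleiotropic effects
        # Residuals of the quantitative trait and the latent variable are
        # bivariate normal with correlation rho.
        e = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)

        q = beta_q * g + e[:, 0]                  # observed quantitative trait
        latent = beta_l * g + e[:, 1]             # unobserved liability
        d = (latent > 1.0).astype(int)            # qualitative trait via a threshold

        print(f"disease prevalence: {d.mean():.3f}")
        print(f"corr(Q, D): {np.corrcoef(q, d)[0, 1]:.3f}")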

  6. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid leached data; (2) sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike's Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios 80 µm. Regar
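
    The model-selection step, comparing candidate dose metrics by AIC under logistic regression, can be sketched as follows (synthetic data and hypothetical metrics, not the study's dataset):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 50                                        # one row per exposure test
        surface_area = rng.lognormal(0.0, 1.0, n)     # hypothetical ΣSA dose metric
        count_metric = rng.lognormal(0.0, 1.0, n)     # hypothetical ΣEP dose metric
        true_logit = -2.0 + 1.5 * np.log(surface_area)
        tumor = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

        def aic_for(dose):
            # Logistic dose-response fit; a lower AIC favors that dose metric.
            X = sm.add_constant(np.log(dose))
            return sm.Logit(tumor, X).fit(disp=0).aic

        print(f"AIC, surface-area metric: {aic_for(surface_area):.1f}")
        print(f"AIC, count metric:        {aic_for(count_metric):.1f}")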

  7. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    on the existence of a quotient construction, allowing a property φ of a parallel system A | B to be transformed into a sufficient and necessary quotient-property φ/A to be satisfied by the component B. Given a model checking problem involving a network P1 | ... | Pn and a property φ, the method gradually move (by...

  8. Quantitative analysis of disc degeneration using axial T2 mapping in a percutaneous annular puncture model in rabbits

    Energy Technology Data Exchange (ETDEWEB)

    Chai, Jee Won; Kim, Su Jin [Dept. of Radiology, SMG-SNU Boramae Medical Center, Seoul (Korea, Republic of); Kang, Heung Sik; Lee, Joon Woo [Dept. of Radiology, Seoul National University Bundang Hospital, Seongnam (Korea, Republic of); Hong, Sung Hwan [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-02-15

    To evaluate T2 relaxation time change using axial T2 mapping in a rabbit degenerated disc model and determine the most correlated variable with histologic score among T2 relaxation time, disc height index, and Pfirrmann grade. Degenerated disc model was made in 4 lumbar discs of 11 rabbits (n = 44) by percutaneous annular puncture with various severities of an injury. Lumbar spine lateral radiograph, MR T2 sagittal scan and MR axial T2 mapping were obtained at baseline and 2 weeks and 4 weeks after the injury in 7 rabbits and at baseline and 2 weeks, 4 weeks, and 6 weeks after the injury in 4 rabbits. Generalized estimating equations were used for a longitudinal analysis of changes in T2 relaxation time in degenerated disc model. T2 relaxation time, disc height index and Pfirrmann grade were correlated with the histologic scoring of disc degeneration using Spearman's rho test. There was a significant difference in T2 relaxation time between uninjured and injured discs after annular puncture. Progressive decrease in T2 relaxation time was observed in injured discs throughout the study period. Lower T2 relaxation time was observed in the more severely injured discs. T2 relaxation time showed the strongest inverse correlation with the histologic score among the variables investigated (r = -0.811, p < 0.001). T2 relaxation time measured with axial T2 mapping in degenerated discs is a potential method to assess disc degeneration.

  9. Quantitative histogram analysis of images

    Science.gov (United States)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
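
    The statistics listed above are straightforward to reproduce outside LabVIEW; a compact Python equivalent, using a synthetic RGB input and the common luma coefficients purely as an example:

        import numpy as np
        from scipy.stats import skew, kurtosis

        def grey_stats(rgb, coeffs=(0.299, 0.587, 0.114)):
            # Intensity-linear greyscale conversion with selectable
            # coefficients, then the histogram statistics named above.
            grey = rgb.astype(float) @ np.asarray(coeffs)
            flat = grey.ravel()
            return {
                "mean": flat.mean(), "std": flat.std(), "var": flat.var(),
                "min": flat.min(), "max": flat.max(),
                "mode": np.bincount(flat.astype(int)).argmax(),
                "skewness": skew(flat), "kurtosis": kurtosis(flat),
                "median": np.median(flat),
            }

        rng = np.random.default_rng(4)
        rgb = rng.integers(0, 256, size=(64, 64, 3))
        for name, value in grey_stats(rgb).items():
            print(f"{name:9s} {value:10.3f}")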

  10. Nonlinear dynamics and quantitative EEG analysis.

    Science.gov (United States)

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.
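
    One standard tool from this paradigm, the Grassberger-Procaccia correlation sum over a delay embedding, can be sketched briefly; it is applied here to a synthetic deterministic signal, not to real EEG:

        import numpy as np

        def delay_embed(x, dim, tau):
            # Delay-coordinate embedding of a scalar time series.
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

        def correlation_sum(pts, r):
            # Fraction of distinct point pairs closer than r: C(r).
            d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
            n = len(pts)
            return (np.sum(d < r) - n) / (n * (n - 1))

        # A deterministic nonlinear system (logistic map) as an EEG stand-in.
        x = np.empty(2000)
        x[0] = 0.4
        for i in range(1, len(x)):
            x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

        pts = delay_embed(x, dim=3, tau=1)[:500]
        for r in (0.05, 0.1, 0.2):
            print(f"C({r}) = {correlation_sum(pts, r):.4f}")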

  11. Possibilities of 3-D modelling and quantitative morphometric analysis of decimeter-sized Echinoids using photogrammetric approach

    Science.gov (United States)

    Polonkai, Bálint; Görög, Ágnes; Raveloson, Andrea; Bodor, Emese; Székely, Balázs

    2017-04-01

    Echinoids (sea urchins) are useful fossils in palaeoenvironmental reconstruction, e.g. for palaeobiogeographical, palaeoclimatological or sedimentological research. In the Hungarian Badenian stage (Langhian, Middle Miocene) the species Parascutella gibbercula (DE SERRES 1829) is a common taxon and indicates a shallow marine environment. The specimens of this extinct species show high morphological variability within relatively small geographical areas, even within a single stratum. These differences can carry relevant palaeontological and/or palaeoenvironmental information. To interpret the value of the morphological parameters, it is necessary to quantify them. Among the possible quantification methods, 3D photogrammetric reconstruction was found to be suitable; recent years have seen its increasing palaeontological application to both invertebrates and vertebrates. In order to generate proper 3D models of the specimens with the required details, a great number of digital images have to be shot. In case of proper data acquisition and precise model generation it is possible to outperform the traditional 2D morphometric studies of echinoids, which are often inaccurate when the spatial characters as well as the ambulacral system and the conically shaped apex (top of the test) are measured. An average P. gibbercula specimen is about 10 cm in diameter. Therefore, desktop image acquisition is possible if appropriate lighting conditions are provided. For better results we have designed an elaborate target background pattern that enhances the chances of finding homologous points in the imagery. Agisoft Photoscan software has been used for the model generation. The generated models typically show high-resolution details and reproduce original colours. However, various problems may occur: improper focusing and/or poor lighting conditions may leave the aboral and oral sides poorly reconstructed, and shallow surface undulations may not be modelled appropriately. Another

  12. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be for inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict both in the past, and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempt at modeling conflict as a result of system level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  13. Quantitative RNA-Seq analysis in non-model species: assessing transcriptome assemblies as a scaffold and the utility of evolutionary divergent genomic reference species

    Directory of Open Access Journals (Sweden)

    Hornett Emily A

    2012-08-01

    Full Text Available Abstract Background How well does RNA-Seq data perform for quantitative whole gene expression analysis in the absence of a genome? This is one unanswered question facing the rapidly growing number of researchers studying non-model species. Using Homo sapiens data and resources, we compared the direct mapping of sequencing reads to predicted genes from the genome with mapping to de novo transcriptomes assembled from RNA-Seq data. Gene coverage and expression analysis was further investigated in the non-model context by using increasingly divergent genomic reference species to group assembled contigs by unique genes. Results Eight transcriptome sets, composed of varying amounts of Illumina and 454 data, were assembled and assessed. Hybrid 454/Illumina assemblies had the highest transcriptome and individual gene coverage. Quantitative whole gene expression levels were highly similar between using a de novo hybrid assembly and the predicted genes as a scaffold, although mapping to the de novo transcriptome assembly provided data on fewer genes. Using non-target species as reference scaffolds does result in some loss of sequence and expression data, and bias and error increase with evolutionary distance. However, within a 100 million year window these effect sizes are relatively small. Conclusions Predicted gene sets from sequenced genomes of related species can provide a powerful method for grouping RNA-Seq reads and annotating contigs. Gene expression results can be produced that are similar to results obtained using gene models derived from a high quality genome, though biased towards conserved genes. Our results demonstrate the power and limitations of conducting RNA-Seq in non-model species.

  14. [Spectral quantitative analysis by nonlinear partial least squares based on neural network internal model for flue gas of thermal power plant].

    Science.gov (United States)

    Cao, Hui; Li, Yao-Jiang; Zhou, Yan; Wang, Yan-Xia

    2014-11-01

    To deal with the nonlinear characteristics of spectral data for thermal power plant flue gas, a nonlinear partial least squares (PLS) analysis method with an internal model based on a neural network is adopted in the paper. The latent variables of the independent variables and the dependent variables are first extracted by PLS regression, and then used as the inputs and outputs of a neural network, respectively, to build the nonlinear internal model through training. For spectral data of flue gases of the thermal power plant, PLS, the nonlinear PLS with the internal model of a back propagation neural network (BP-NPLS), the nonlinear PLS with the internal model of a radial basis function neural network (RBF-NPLS) and the nonlinear PLS with the internal model of an adaptive fuzzy inference system (ANFIS-NPLS) are compared. The root mean square error of prediction (RMSEP) for sulfur dioxide of BP-NPLS, RBF-NPLS and ANFIS-NPLS is reduced by 16.96%, 16.60% and 19.55% relative to that of PLS, respectively. The RMSEP for nitric oxide is reduced by 8.60%, 8.47% and 10.09%, respectively, and the RMSEP for nitrogen dioxide by 2.11%, 3.91% and 3.97%, respectively. Experimental results show that nonlinear PLS is more suitable than PLS for the quantitative analysis of flue gas. Moreover, by using a neural network, which can closely approximate nonlinear characteristics, the nonlinear partial least squares methods with internal models described in this paper have good predictive capability and robustness, and can to a certain extent overcome the limitations of nonlinear PLS methods with other internal models, such as polynomial and spline functions. ANFIS-NPLS shows the best performance, the internal model of an adaptive fuzzy inference system having the ability to learn more and reduce the residuals effectively. Hence, ANFIS-NPLS is an
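
    The two-stage structure, a linear PLS outer model with a nonlinear neural-network inner model, can be sketched with scikit-learn. This uses synthetic data, stands in for the BP-NPLS variant only, and reports in-sample error purely as an illustration:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 40))            # stand-in for flue-gas spectra
        y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)

        # Outer step: extract latent variables by PLS regression.
        pls = PLSRegression(n_components=5).fit(X, y)
        T = pls.transform(X)

        # Inner step: a neural network maps the latent variables to the
        # response, in the spirit of the BP-NPLS variant above.
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                           random_state=0).fit(T, y)

        rmse_linear = np.sqrt(np.mean((pls.predict(X).ravel() - y) ** 2))
        rmse_nonlin = np.sqrt(np.mean((net.predict(T) - y) ** 2))
        print(f"linear PLS RMSE: {rmse_linear:.3f}")
        print(f"NN internal model RMSE: {rmse_nonlin:.3f}")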

  15. Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods

    Directory of Open Access Journals (Sweden)

    B. Rama Sanjeeva Sresta,

    2016-09-01

    Full Text Available This paper focuses on the analysis of sales forecasting using quantitative and qualitative methods. Such a forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether the goals have been met with respect to measures, targets and initiatives.

  16. Quantitative analysis of protein turnover in plants.

    Science.gov (United States)

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of ¹³C, ¹⁵N, ²H and ¹⁸O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
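
    In the simplest steady-state case, the mathematical modelling step reduces to fitting a first-order label-incorporation curve; a minimal sketch with hypothetical time-course data:

        import numpy as np
        from scipy.optimize import curve_fit

        def label_fraction(t, k_deg):
            # Fraction of a protein pool carrying the heavy label at time t,
            # assuming steady state and first-order turnover.
            return 1.0 - np.exp(-k_deg * t)

        t = np.array([0.0, 6.0, 12.0, 24.0, 48.0, 96.0])       # h after label switch
        f_obs = np.array([0.0, 0.11, 0.21, 0.37, 0.61, 0.85])  # hypothetical MS data

        (k_fit,), _ = curve_fit(label_fraction, t, f_obs, p0=[0.01])
        print(f"k_deg = {k_fit:.4f} / h, half-life = {np.log(2) / k_fit:.1f} h")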

  17. Quantitative analysis of cardiac tissue including fibroblasts using three-dimensional confocal microscopy and image reconstruction: towards a basis for electrophysiological modeling

    NARCIS (Netherlands)

    Schwab, Bettina C.; Seemann, Gunnar; Lasher, Richard A.; Torres, Natalia S.; Wülfers, Eike M.; Arp, Maren; Carruth, Eric D.; Bridge, John H.B.; Sachse, Frank B.

    2013-01-01

    Electrophysiological modeling of cardiac tissue is commonly based on functional and structural properties measured in experiments. Our knowledge of these properties is incomplete, in particular their remodeling in disease. Here, we introduce a methodology for quantitative tissue characterization bas

  18. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
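
    A representative example of this core material is the linear hillslope diffusion model; a minimal explicit finite-difference sketch follows (parameter values are arbitrary, not taken from the text):

        import numpy as np

        # dz/dt = D * d2z/dx2, solved explicitly on a 1-D profile.
        D = 0.01             # diffusivity, m^2/yr (hypothetical)
        dx, dt = 1.0, 10.0   # grid spacing (m) and time step (yr)
        assert D * dt / dx**2 <= 0.5, "explicit-scheme stability condition"

        x = np.arange(0.0, 101.0, dx)
        z = np.where(np.abs(x - 50.0) < 10.0, 5.0, 0.0)   # 5 m fault scarp

        for _ in range(5000):                              # 50,000 years
            z[1:-1] += D * dt / dx**2 * (z[2:] - 2.0 * z[1:-1] + z[:-2])
            # ends held fixed (Dirichlet boundaries)

        print(f"maximum scarp height after diffusion: {z.max():.2f} m")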

  19. Exhumation of the North Alpine Foreland Basin- Quantitative insights from structural analysis, thermochronology and a new thermal history model

    Science.gov (United States)

    Luijendijk, Elco; von Hagke, Christoph; Hindle, David

    2016-04-01

    Due to a wealth of geological and thermochronology data the northern foreland basin of the European Alps is an ideal natural laboratory for understanding the dynamics of foreland basins and their interaction with surface and geodynamic processes. We present an unprecedented compilation of thermochronological data from the basin and quantify cooling and exhumation rates in the basin by combining published and new vitrinite reflectance, apatite fission track and U-Th/He data with a new inverse burial and thermal history model. No correlation is obvious between inferred cooling and exhumation rates and elevation, relief or tectonics. We compare derived temperature histories to exhumation estimates based on the retro-deformation of Molasse basin and the Jura mountains, and to exhumation caused by drainage reorganization and incision. Drainage reorganization can explain at most 25% of the observed cooling rates in the basin. Tectonic transport of the basin's sediments over the inclined basement of the alpine foreland as the Jura mountains shortened can explain part of the cooling signal in the western part of the basin. However, overall a substantial amount of cooling and exhumation remains unexplained by known tectonic and surface processes. Our results document basin wide exhumation that may be related to slab roll-back or other lithospheric processes. Uncertainty analysis shows that thermochronometers can be explained by cooling and exhumation starting as early as the Miocene or as late as the Pleistocene. New (U-Th)/He data from key areas close to the Alpine front may provide better constraints on the timing of exhumation.

  20. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36 Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatographic techniques (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading should be enough to use it properly.

  1. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x...

  3. Quantitative analysis of arm movement smoothness

    Science.gov (United States)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement units, fluidity and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
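
    The quantities named above can be computed directly from a sampled position trace; a sketch on a synthetic minimum-jerk reach (the peak threshold and jerk normalisation below are common conventions, not necessarily the authors' exact definitions):

        import numpy as np

        def smoothness_metrics(pos, dt):
            vel = np.gradient(pos, dt, axis=0)
            speed = np.linalg.norm(vel, axis=1)
            jerk = np.gradient(np.gradient(vel, dt, axis=0), dt, axis=0)

            # Movement units: interior speed peaks above 10% of maximum speed.
            thr = 0.1 * speed.max()
            units = np.sum((speed[1:-1] > speed[:-2]) &
                           (speed[1:-1] > speed[2:]) & (speed[1:-1] > thr))

            # Dimensionless integrated squared jerk (larger = less smooth).
            duration = dt * (len(pos) - 1)
            path = np.sum(speed) * dt
            nj = np.sqrt(np.sum(jerk**2) * dt * duration**5 / path**2)
            return int(units), nj

        t = np.linspace(0.0, 2.0, 201)
        s = t / t[-1]
        reach_x = 10 * s**3 - 15 * s**4 + 6 * s**5        # minimum-jerk profile
        pos = np.column_stack([reach_x, np.zeros_like(t), np.zeros_like(t)])
        units, nj = smoothness_metrics(pos, dt=t[1] - t[0])
        print(f"movement units: {units}, normalized jerk: {nj:.1f}")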

  4. Seniors' Online Communities: A Quantitative Content Analysis

    Science.gov (United States)

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  5. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  6. A New Quantitative Analysis Model of Knowledge Flows

    Institute of Scientific and Technical Information of China (English)

    孙浩

    2011-01-01

    Based on the basic definition of knowledge, this paper puts forward a series of assumptions, a theory of open and private knowledge flows, and a quantitative analysis model of knowledge flows. The paper argues that development desire and knowledge profit are the main original driving forces of open and private knowledge flows. Compared with the existing literature, this paper presents a new perspective on knowledge flows, which is also its main contribution.

  7. Augmented multivariate image analysis applied to quantitative structure-activity relationship modeling of the phytotoxicities of benzoxazinone herbicides and related compounds on problematic weeds.

    Science.gov (United States)

    Freitas, Mirlaine R; Matias, Stella V B G; Macedo, Renato L G; Freitas, Matheus P; Venturin, Nelson

    2013-09-11

    Two of the major weeds affecting cereal crops worldwide are Avena fatua L. (wild oat) and Lolium rigidum Gaud. (rigid ryegrass). Thus, the development of new herbicides against these weeds is required; in line with this, benzoxazinones, their degradation products, and analogues have been shown to be important allelochemicals and natural herbicides. Despite earlier structure-activity studies demonstrating that the hydrophobicity (log P) of aminophenoxazines correlates with phytotoxicity, our findings for a series of benzoxazinone derivatives do not show any relationship between phytotoxicity and log P, nor with two other usual molecular descriptors. On the other hand, a quantitative structure-activity relationship (QSAR) analysis based on molecular graphs, representing structural shape and atomic sizes and using colors to encode other atomic properties, performed very accurately for the prediction of the phytotoxicities of these compounds against wild oat and rigid ryegrass. Therefore, these QSAR models can be used to estimate the phytotoxicity of new congeners of benzoxazinone herbicides toward A. fatua L. and L. rigidum Gaud.

  8. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  9. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations, COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity, in finance for instance, developed quite in parallel with its development in the manufacturing industry, the same does not hold in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ² = 8.181, p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  10. Quantitative risks analysis of maritime terminal petrochemical

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work consists of the application of a computer program (RISKAN) developed for industrial risk quantification studies, together with a revision of the models used in the program. As part of the evaluation, the program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. A Quantitative Risk Analysis of the terminal was performed, both for the workers and for the nearby population, with a verification of acceptability using the tolerability limits established by the State Licensing Agency (FEPAM-RS). In the risk analysis methodology used internationally, the most common way of presenting social risk results is graphically, with FN curves, while for individual risk it is common to use iso-risk curves traced on a map of the area where the plant is located. At the beginning of the study, a historical analysis of accidents and the technique of Preliminary Risk Analysis were both used to aid the identification of possible accident scenarios related to the activities at the terminal. After identifying the initiating events, their frequencies or probabilities of occurrence were estimated, followed by calculations of the physical effects and fatalities, using, inside the computer program, published models of the Prins Mauritz Laboratory and of the American Institute of Chemical Engineers. The average social risk obtained for the external population was 8.7×10⁻⁷ fatality·year⁻¹ and for the internal population (people working inside the terminal), 3.2×10⁻⁴ fatality·year⁻¹. The accident scenario that contributed most to the social risk was death due to exposure to the thermal radiation caused by a pool fire, with 84.3% of the total estimated for the external population and 82.9% for the people inside the terminal. The

  11. Quantitative model validation techniques: new insights

    CERN Document Server

    Ling, You

    2012-01-01

    This paper develops new insights into quantitative methods for the validation of computational model prediction. Four types of methods are investigated, namely classical and Bayesian hypothesis testing, a reliability-based method, and an area metric-based method. Traditional Bayesian hypothesis testing is extended based on interval hypotheses on distribution parameters and equality hypotheses on probability distributions, in order to validate models with deterministic/stochastic output for given inputs. Two types of validation experiments are considered - fully characterized (all the model/experimental inputs are measured and reported as point values) and partially characterized (some of the model/experimental inputs are not measured or are reported as intervals). Bayesian hypothesis testing can minimize the risk in model selection by properly choosing the model acceptance threshold, and its results can be used in model averaging to avoid Type I/II errors. It is shown that Bayesian interval hypothesis testing...

  12. Quantitative image analysis of celiac disease.

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-07

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixular data of endoscopic images that is acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  14. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources ... 350 km long extends from the eastern border of Sierra Leone all the way to ... Ghana. ... consider whether data will likely fit the assumptions of a selected model. ... These tests are not alternatives to parametric tests, but rather are a means of.

  15. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  17. The Stability of Currency Systems in East Asia --Quantitative Analysis Using a Multi-Country Macro-Econometric Model--

    OpenAIRE

    Koichiro Kamada

    2009-01-01

    The purpose of this paper is to examine the stability of East Asian financial and currency systems, using the multi-country macro-econometric model constructed by Kamada and Takagawa (2005) to depict economic interdependence in the Asian-Pacific region. The highly-developed system of the international production network in the East Asian region was not only the driving force behind the "East Asian miracle," but also, as seen in the "Asian currency crisis," worked as a platform whereby local e...

  18. Quantitative genetic analysis of brain size variation in sticklebacks: support for the mosaic model of brain evolution.

    Science.gov (United States)

    Noreikiene, Kristina; Herczeg, Gábor; Gonda, Abigél; Balázs, Gergely; Husby, Arild; Merilä, Juha

    2015-07-07

    The mosaic model of brain evolution postulates that different brain regions are relatively free to evolve independently from each other. Such independent evolution is possible only if genetic correlations among the different brain regions are less than unity. We estimated heritabilities, evolvabilities and genetic correlations of relative size of the brain, and its different regions in the three-spined stickleback (Gasterosteus aculeatus). We found that heritabilities were low (average h² = 0.24), suggesting a large plastic component to brain architecture. However, evolvabilities of different brain parts were moderate, suggesting the presence of additive genetic variance to sustain a response to selection in the long term. Genetic correlations among different brain regions were low (average rG = 0.40) and significantly less than unity. These results, along with those from analyses of phenotypic and genetic integration, indicate a high degree of independence between different brain regions, suggesting that responses to selection are unlikely to be severely constrained by genetic and phenotypic correlations. Hence, the results give strong support for the mosaic model of brain evolution. However, the genetic correlation between brain and body size was high (rG = 0.89), suggesting a constraint for independent evolution of brain and body size in sticklebacks.

  19. Quantitative Results and Comparative Analysis of Producer Models

    Institute of Scientific and Technical Information of China (English)

    李洪心

    2005-01-01

    The application of CGE (Computable General Equilibrium) models based on general equilibrium theory is an important new development in modern international economic theory; such a model consists of three groups of equations describing supply, demand, and supply-demand equilibrium. Using the GAMS (General Algebraic Modeling System) simulation software, this paper quantitatively solves and analyses producer behaviour models of different forms and compares the commonalities and differences between the CES and Cobb-Douglas production functions, so as to provide a scientific, quantitative basis for adopting different forms of supply equations for different economic problems.
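
    The comparison at the heart of the paper can be reproduced numerically: the Cobb-Douglas function is the rho -> 0 limit of the CES function. A short sketch, with illustrative parameter values only:

        import numpy as np

        def ces(k, l, alpha=0.4, rho=-0.5, a=1.0):
            # CES: Y = A * (alpha*K^rho + (1-alpha)*L^rho)^(1/rho)
            return a * (alpha * k**rho + (1.0 - alpha) * l**rho) ** (1.0 / rho)

        def cobb_douglas(k, l, alpha=0.4, a=1.0):
            # Cobb-Douglas: Y = A * K^alpha * L^(1-alpha)
            return a * k**alpha * l ** (1.0 - alpha)

        k, l = 2.0, 3.0
        for rho in (-1.0, -0.1, -0.01, 0.01, 0.1):
            print(f"rho = {rho:6.2f}: CES output = {ces(k, l, rho=rho):.4f}")
        print(f"Cobb-Douglas limit:  {cobb_douglas(k, l):.4f}")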

  20. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  1. Spherical blurred shape model for 3-D object and pose recognition: quantitative analysis and HCI applications in smart environments.

    Science.gov (United States)

    Lopes, Oscar; Reyes, Miguel; Escalera, Sergio; Gonzàlez, Jordi

    2014-12-01

    The use of depth maps is of increasing interest after the advent of cheap multisensor devices based on structured light, such as Kinect. In this context, there is a strong need of powerful 3-D shape descriptors able to generate rich object representations. Although several 3-D descriptors have been already proposed in the literature, the research of discriminative and computationally efficient descriptors is still an open issue. In this paper, we propose a novel point cloud descriptor called spherical blurred shape model (SBSM) that successfully encodes the structure density and local variabilities of an object based on shape voxel distances and a neighborhood propagation strategy. The proposed SBSM is proven to be rotation and scale invariant, robust to noise and occlusions, highly discriminative for multiple categories of complex objects like the human hand, and computationally efficient since the SBSM complexity is linear to the number of object voxels. Experimental evaluation in public depth multiclass object data, 3-D facial expressions data, and a novel hand poses data sets show significant performance improvements in relation to state-of-the-art approaches. Moreover, the effectiveness of the proposal is also proved for object spotting in 3-D scenes and for real-time automatic hand pose recognition in human computer interaction scenarios.

  2. Refining the quantitative pathway of the Pathways to Mathematics model.

    Science.gov (United States)

    Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda

    2015-03-01

    In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    Science.gov (United States)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
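
    The underlying idea, that faster transverse flow decorrelates the speckle pattern between successive A-scans more quickly, can be sketched with synthetic data; the drift model below is a toy stand-in for real OCT speckle:

        import numpy as np

        rng = np.random.default_rng(6)

        def decorrelation(a_scans):
            # Mean Pearson decorrelation (1 - r) between successive A-scans.
            r = [np.corrcoef(a_scans[i], a_scans[i + 1])[0, 1]
                 for i in range(len(a_scans) - 1)]
            return 1.0 - np.mean(r)

        def synthetic_speckle(n_scans, drift):
            # Speckle with ~16-pixel grain that drifts 'drift' pixels per scan.
            base = np.convolve(rng.normal(size=4111), np.ones(16) / 16,
                               mode="valid")
            return np.array([np.roll(base, int(i * drift))[:512]
                             for i in range(n_scans)])

        for drift in (0, 2, 8):
            print(f"drift {drift} px/scan: "
                  f"decorrelation = {decorrelation(synthetic_speckle(32, drift)):.3f}")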

  4. Genomewide rapid association using mixed model and regression: A fast and simple method for genomewide pedigree-based quantitative trait loci association analysis

    NARCIS (Netherlands)

    Y.S. Aulchenko (Yurii); D.-J. de Koning; C. Haley (Chris)

    2007-01-01

    textabstractFor pedigree-based quantitative trait loci (QTL) association analysis, a range of methods utilizing within-family variation such as transmission- disequilibrium test (TDT)-based methods have been developed. In scenarios where stratification is not a concern, methods exploiting between-fa

  5. iTRAQ-based quantitative analysis of hippocampal postsynaptic density-associated proteins in a rat chronic mild stress model of depression.

    Science.gov (United States)

    Han, X; Shao, W; Liu, Z; Fan, S; Yu, J; Chen, J; Qiao, R; Zhou, J; Xie, P

    2015-07-09

    Major depressive disorder (MDD) is a prevalent psychiatric mood illness and a major cause of disability and suicide worldwide. However, the underlying pathophysiology of MDD remains poorly understood due to its heterogenic nature. Extensive pre-clinical research suggests that many molecular alterations associated with MDD preferentially localize to the postsynaptic density (PSD). Here, we used a rodent chronic mild stress (CMS) model to generate susceptible and unsusceptible subpopulations. Proteomic analysis using an isobaric tag for relative and absolute quantitation (iTRAQ) and tandem mass spectrometry was performed to identify differentially expressed proteins in enriched PSD preparations from the hippocampi of different groups. More than 1500 proteins were identified and quantified, and 74 membrane proteins were differentially expressed. Of these membrane proteins, 51 (69%) were identified by SynaptomeDB search as having a predicted PSD localization. The unbiased profiles identified several PSD candidate proteins that may be related to CMS vulnerability or insusceptibility, and these two CMS phenotypes displayed differences in the abundance of several types of proteins. A detailed protein functional analysis pointed to a role for PSD-associated proteins involved in signaling and regulatory functions. Within the PSD, the N-methyl-D-aspartate (NMDA) receptor subunit NR2A and its downstream targets contribute to CMS susceptibility. Further analysis of disease relevance indicated that the PSD contains a complex set of proteins of known relevance to mental illnesses including depression. In sum, these findings provide novel insights into the contribution of PSD-associated proteins to stress susceptibility and further advance our understanding of the role of hippocampal synaptic plasticity in MDD.

  6. Segregation Analysis on Genetic System of Quantitative Traits in Plants

    Institute of Scientific and Technical Information of China (English)

    Gai Junyi

    2006-01-01

    Based on the traditional polygene inheritance model of quantitative traits, the author suggests the major gene and polygene mixed inheritance model. The model is considered a general one, of which the pure major gene and pure polygene inheritance models are specific cases. Based on the proposed theory, the author established the segregation analysis procedure to study the genetic system of quantitative traits of plants. At present, this procedure can be used to evaluate the genetic effect of individual major genes (up to two to three major genes), the collective genetic effect of polygene, and their heritability values. This paper introduces how to establish the procedure, its main achievements, and its applications. An example is given to illustrate the steps, methods, and effectiveness of the procedure.

  7. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material for the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D: Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  8. Conditions for sample preparation and quantitative HPLC/MS-MS analysis of bulky adducts to serum albumin with diolepoxides of polycyclic aromatic hydrocarbons as models.

    Science.gov (United States)

    Westberg, Emelie; Hedebrant, Ulla; Haglund, Johanna; Alsberg, Tomas; Eriksson, Johan; Seidel, Albrecht; Törnqvist, Margareta

    2014-02-01

    Stable adducts to serum albumin (SA) from electrophilic and genotoxic compounds/metabolites can be used as biomarkers for quantification of the corresponding in vivo dose. In the present study, conditions for specific analysis of stable adducts to SA formed from carcinogenic polycyclic aromatic hydrocarbons (PAH) were evaluated in order to achieve a sensitive and reproducible quantitative method. Bulky adducts from diolepoxides (DE) of PAH, primarily DE of benzo[a]pyrene (BPDE) and also DE of dibenzo[a,l]pyrene (DBPDE) and dibenzo[a,h]anthracene (DBADE), were used as model compounds. The alkylated peptides obtained after enzymatic hydrolysis of human SA modified with the different PAHDE were principally PAHDE-His-Pro, PAHDE-His-Pro-Tyr and PAHDE-Lys. Alkaline hydrolysis under optimised conditions gave the BPDE-His as the single analyte of alkylated His, but also indicated degradation of this adduct. It was not possible to obtain the BPDE-His as one analyte from BPDE-alkylated SA through modifications of the enzymatic hydrolysis. The BPDE-His adduct was shown to be stable during the weak acidic conditions used in the isolation of SA. Enrichment by HPLC or SPE, but not butanol extraction, gave good recovery, using Protein LoBind tubes. A simple internal standard (IS) approach using SA modified with other PAHDE as IS was shown to be applicable. A robust analytical procedure based on digestion with pronase, enrichment by HPLC or SPE, and analysis with HPLC/MS-MS electrospray ionisation was achieved. A good reproducibility (coefficient of variation (CV) 11 %) was obtained, and the achieved limit of detection for the studied PAHDE, using standard instrumentation, was approximately 1 fmol adduct/mg SA analysing extract from 5 mg SA.

  9. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.

  10. Quantitative laryngeal electromyography: turns and amplitude analysis.

    Science.gov (United States)

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare these with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). Retrospective case-control study. We performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. The standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. The median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, the median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We first present interference pattern analysis of the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA; additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
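
    For illustration, a minimal sketch (synthetic data, not the study's recordings) of this turns-and-amplitude "cloud" construction: a logarithmic regression of amplitude on turns, with a standard-deviation band sized to contain roughly 90% of the points.

```python
import numpy as np

# Synthetic (turns/s, mean amplitude per turn) pairs, one per 500 ms epoch
rng = np.random.default_rng(0)
turns = rng.uniform(100, 600, size=200)
amp = 80 * turns**0.35 * rng.lognormal(0.0, 0.1, size=200)   # microvolts

# Logarithmic regression: log(amp) = slope*log(turns) + intercept
slope, intercept = np.polyfit(np.log(turns), np.log(amp), 1)
resid = np.log(amp) - (slope * np.log(turns) + intercept)
sd = resid.std(ddof=2)

# "Cloud": band around the regression line holding ~90% of control points
z = 1.645
lo = np.exp(intercept - z * sd) * turns**slope
hi = np.exp(intercept + z * sd) * turns**slope
inside = np.mean((amp >= lo) & (amp <= hi))
print(f"slope={slope:.2f}, intercept={intercept:.2f}, fraction in cloud={inside:.0%}")
```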

  11. Quantitative bioluminescence imaging of mouse tumor models.

    Science.gov (United States)

    Tseng, Jen-Chieh; Kung, Andrew L

    2015-01-05

    Bioluminescence imaging (BLI) has become an essential technique for preclinical evaluation of anticancer therapeutics and provides sensitive and quantitative measurements of tumor burden in experimental cancer models. For light generation, a vector encoding firefly luciferase is introduced into human cancer cells that are grown as tumor xenografts in immunocompromised hosts, and the enzyme substrate luciferin is injected into the host. Alternatively, the reporter gene can be expressed in genetically engineered mouse models to determine the onset and progression of disease. In addition to expression of an ectopic luciferase enzyme, bioluminescence requires oxygen and ATP, thus only viable luciferase-expressing cells or tissues are capable of producing bioluminescence signals. Here, we summarize a BLI protocol that takes advantage of advances in hardware, especially the cooled charge-coupled device camera, to enable detection of bioluminescence in living animals with high sensitivity and a large dynamic range.

  12. Quantitative assessment model for gastric cancer screening

    Institute of Scientific and Technical Information of China (English)

    Kun Chen; Wei-Ping Yu; Liang Song; Yi-Min Zhu

    2005-01-01

    AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out on 66 patients and 198 normal people, and the risk and protective factors of gastric cancer were determined, including heavy manual work, foods such as small yellow-fin tuna, dried small shrimps, squills and crabs, mothers suffering from gastric diseases, spouse alive, and use of refrigerators and hot food, etc. According to principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, we selected factors significant in statistics and calculated a weight coefficient for each one by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model for each subset was established, and we obtained a mathematical expression of attribute degree (AD). RESULTS: Based on the data of 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. With these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, statistical tests showed that the identification outcomes of the two different calculation methods were identical (P>0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible and economic, and can be used to determine individual and population risks of gastric cancer.
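
    The attribute-degree classification can be sketched as a weighted sum over risk-factor indicators compared against the two reported thresholds. The weights, factor coding, and the reading of the thresholds below are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

# Illustrative weights for statistically significant factors (not fitted values)
weights = np.array([0.15, 0.20, 0.10, 0.25, 0.30])
factors = np.array([1, 0, 1, 1, 0])        # one subject's factor indicators

ad = float(weights @ factors) / weights.sum()   # attribute degree for the cancer subset

# One possible reading of the two reported thresholds (0.20 / 0.17)
if ad >= 0.20:
    label = "screen positive"
elif ad < 0.17:
    label = "screen negative"
else:
    label = "indeterminate"
print(f"AD = {ad:.2f} -> {label}")
```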

  13. Global Quantitative Modeling of Chromatin Factor Interactions

    Science.gov (United States)

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles; we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  14. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.

  15. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus...

  16. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis-of-variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model...

  17. Quantitative versus qualitative modeling: a complementary approach in ecosystem study.

    Science.gov (United States)

    Bondavalli, C; Favilla, S; Bodini, A

    2009-02-01

    Natural disturbances and human perturbations act upon ecosystems by changing some dynamical parameters of one or more species. Foreseeing these modifications is necessary before embarking on an intervention: predictions may help to assess management options and define hypotheses for interventions. Models become valuable tools for studying and making predictions only when they capture the types of interactions and their magnitudes. Quantitative models are more precise and specific about a system, but require a large effort in model construction. Because of this, ecological systems very often remain only partially specified, and one possible approach to their description and analysis comes from qualitative modelling. Qualitative models yield predictions as directions of change in species abundance, but in complex systems these predictions are often ambiguous, being the result of opposite actions exerted on the same species by way of multiple pathways of interactions. Again, to avoid such ambiguities one needs to know the intensity of all links in the system. One way to make link magnitude explicit in a form that can be used in qualitative analysis is described in this paper, and it takes advantage of another type of ecosystem representation: ecological flow networks. These flow diagrams contain the structure, the relative position and the connections between the components of a system, and the quantity of matter flowing along every connection. In this paper it is shown how these ecological flow networks can be used to produce a quantitative model similar to the qualitative counterpart. Analyzed through the apparatus of loop analysis, this quantitative model yields predictions that are no longer ambiguous, elegantly solving the basic problem of qualitative analysis. The approach adopted in this work is still preliminary and we must be careful in its application.
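
    A compact way to see how quantifying link magnitudes removes the ambiguity of qualitative predictions is the standard press-perturbation calculation on a community matrix. The sketch below uses an invented three-species matrix (values are not from the paper); the signs of -inv(A) give the now-unambiguous directions of change.

```python
import numpy as np

# Invented community matrix A: a_ij = per-capita effect of species j on i
A = np.array([
    [-0.5,  0.0, -0.3],   # resource
    [ 0.4, -0.2,  0.0],   # consumer
    [ 0.0,  0.3, -0.1],   # predator
])

# Equilibrium response to a sustained (press) input to each species: -inv(A).
# With magnitudes specified, the signs are unambiguous predictions.
S = -np.linalg.inv(A)
print(np.sign(S).astype(int))
```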

  18. A quantitative comparison of Calvin-Benson cycle models.

    Science.gov (United States)

    Arnold, Anne; Nikoloski, Zoran

    2011-12-01

    The Calvin-Benson cycle (CBC) provides the precursors for biomass synthesis necessary for plant growth. The dynamic behavior and yield of the CBC depend on the environmental conditions and regulation of the cellular state. Accurate quantitative models hold the promise of identifying the key determinants of the tightly regulated CBC function and their effects on the responses in future climates. We provide an integrative analysis of the largest compendium of existing models for photosynthetic processes. Based on the proposed ranking, our framework facilitates the discovery of best-performing models with regard to metabolomics data and of candidates for metabolic engineering.

  19. Quantitative Analysis of Yu Ebao Based on a Linear Regression Model

    Institute of Scientific and Technical Information of China (English)

    刘冬青

    2014-01-01

    Yu Ebao, with its low entry threshold, has given many more people access to money-market funds. After introducing Yu Ebao's main framework and its 2013 income and expenses, this paper builds a linear regression model to analyze Yu Ebao's returns quantitatively. The study finds that the high early returns raise the suspicion that the fund may have subsidized interest in its initial period to attract customers; that by moving money into negotiated bank deposits Yu Ebao transfers profit from the banks to its customers while, to some extent, also passing risk on to them; and that while Yu Ebao has undeniably brought revolutionary innovation to the financial industry, it has also introduced potential systemic risk.
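
    As a sketch of the kind of model used, a simple one-variable linear regression fitted to a synthetic yield series (the numbers are illustrative, not Yu Ebao's actual 2013 data):

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(180)                                  # trading days
yld = 6.0 - 0.01 * days + rng.normal(0, 0.1, 180)      # 7-day annualized yield, %

b1, b0 = np.polyfit(days, yld, 1)                      # slope, intercept
pred = b0 + b1 * days
r2 = 1 - ((yld - pred)**2).sum() / ((yld - yld.mean())**2).sum()
print(f"yield ~ {b0:.2f} + ({b1:.4f})*day, R^2 = {r2:.2f}")
```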

  20. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  1. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the use of space as a medium of social communication).

  2. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
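
    The two-wavelength ratio idea can be sketched as follows; the images and the linear calibration coefficients are synthetic placeholders, since the real system uses a measured phosphor calibration curve.

```python
import numpy as np

rng = np.random.default_rng(7)
I_a = rng.uniform(50, 200, size=(64, 64))              # brightness at wavelength A
I_b = I_a * rng.uniform(0.8, 1.2, size=(64, 64))       # brightness at wavelength B

ratio = I_b / I_a                                      # ratio cancels illumination level
a, b = 150.0, 80.0                                     # hypothetical calibration T = a + b*ratio
T_map = a + b * ratio
print(f"temperature range: {T_map.min():.0f} to {T_map.max():.0f} (calibration units)")
```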

  3. Towards a quantitative OCT image analysis.

    Directory of Open Access Journals (Sweden)

    Marina Garcia Garrido

    Full Text Available Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions. Our results highlight the...
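
    A minimal sketch of the profile-extraction step (a synthetic layered image stands in for a real B-scan; with real data the image would be loaded from file, as in the ImageJ workflow):

```python
import numpy as np

rng = np.random.default_rng(8)
depth, width = 240, 512
bands = np.repeat([30, 120, 60, 140, 50, 90], depth // 6)   # layered reflectivity
img = bands[:, None] + rng.normal(0, 5, (depth, width))     # synthetic 8-bit-like B-scan

profile = img[:, 100:400].mean(axis=1)    # mean reflectivity vs. retinal depth (rows)
top = np.argsort(profile)[-3:]            # the most reflective depths
print("high-reflectivity rows:", sorted(int(r) for r in top))
```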

  4. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations ... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  5. Qualitative and quantitative stability analysis of penta-rhythmic circuits

    Science.gov (United States)

    Schwabedal, Justus T. C.; Knapper, Drake E.; Shilnikov, Andrey L.

    2016-12-01

    Inhibitory circuits of relaxation oscillators are often-used models for the dynamics of biological networks. We present a qualitative and quantitative stability analysis of such a circuit constituted by three generic oscillators (of FitzHugh-Nagumo type) as its nodes, coupled reciprocally. Depending on the inhibitory strengths and the parameters of the individual oscillators, the circuit exhibits polyrhythmicity of up to five simultaneously stable rhythms. With methods of bifurcation analysis and phase reduction, we investigate qualitative changes in the stability of these circuit rhythms over a wide range of parameters. Furthermore, we quantify the robustness of the rhythms maintained under random perturbations by monitoring phase diffusion in the circuit. Our findings allow us to describe how circuit dynamics relate to the dynamics of individual nodes. We also find that quantitative and qualitative stability properties of polyrhythmicity do not always align.
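
    A minimal simulation sketch of such a circuit, with generic FitzHugh-Nagumo parameters and a uniform reciprocal inhibition strength (values are illustrative, not those of the study):

```python
import numpy as np
from scipy.integrate import solve_ivp

eps, a, g = 0.08, 0.7, 0.5          # timescale separation, excitability, inhibition

def rhs(t, y):
    v, w = y[:3], y[3:]
    inh = g * (v.sum() - v)          # reciprocal inhibition from the other two nodes
    dv = v - v**3 / 3 - w - inh
    dw = eps * (v + a - 0.8 * w)
    return np.concatenate([dv, dw])

y0 = [0.1, 0.5, -0.5, 0.0, 0.0, 0.0]
sol = solve_ivp(rhs, (0.0, 500.0), y0, max_step=0.5)
print("final membrane variables:", np.round(sol.y[:3, -1], 3))
```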

  6. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and general public safety. If the individual or societal risks are judged intolerable by comparison with international standards, mitigation measures are recommended until the risk associated with the operation reaches levels compatible with best industry practice. Quantitative risk analysis calculates the probability of occurrence of an event from its frequency of occurrence and requires a complex mathematical treatment. The present work develops a calculation methodology based on the previously mentioned publication. The methodology centres on defining the frequencies of occurrence of events according to databases representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possible interferences with the pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of distance from the pipe. (author)
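
    The frequency-times-consequence structure of such a calculation can be sketched as below; the scenario frequencies, the ignition-probability curve and the fatality probabilities are invented placeholders, not values from the paper.

```python
import numpy as np

def p_ignition(distance_m: float, d0: float = 50.0) -> float:
    """Hypothetical ignition-probability curve decaying with distance."""
    return float(np.exp(-distance_m / d0))

# (scenario, frequency per km-year, P(fatality | ignition)) - placeholder values
scenarios = [
    ("pinhole", 1e-4, 0.01),
    ("hole",    5e-5, 0.10),
    ("rupture", 1e-5, 0.50),
]
distance = 30.0                      # receptor distance from the pipe, metres
risk = sum(f * p_ignition(distance) * pf for _, f, pf in scenarios)
print(f"individual risk at {distance:.0f} m: {risk:.2e} per year")
```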

  7. An infinitesimal model for quantitative trait genomic value prediction.

    Directory of Open Access Journals (Sweden)

    Zhiqiu Hu

    Full Text Available We developed a marker-based infinitesimal model for quantitative trait analysis. In contrast to the classical infinitesimal model, we now have new information about the segregation of every individual locus of the entire genome. Under this new model, we propose that the genetic effect of an individual locus is a function of the genome location (a continuous quantity). The overall genetic value of an individual is the weighted integral of the genetic effect function along the genome. Numerical integration is performed to find the integral, which requires partitioning the entire genome into a finite number of bins. Each bin may contain many markers. The integral is approximated by the weighted sum of all the bin effects. We thus turn the problem of marker analysis into bin analysis, so that the model dimension has decreased from a virtual infinity to a finite number of bins. This new approach can efficiently handle a virtually unlimited number of markers without marker selection. The marker-based infinitesimal model requires high linkage disequilibrium of all markers within a bin. For populations with low or no linkage disequilibrium, we develop an adaptive infinitesimal model. Both the original and the adaptive models are tested using simulated data as well as beef cattle data. The simulated data analysis shows that there is always an optimal number of bins at which the predictability of the bin model is much greater than that of the original marker analysis. Results of the beef cattle data analysis indicate that the bin model can increase the predictability from 10% (multiple marker analysis) to 33% (multiple bin analysis). The marker-based infinitesimal model paves a way towards the solution of genetic mapping and genomic selection using whole-genome sequence data.
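
    A conceptual sketch of the binning idea, with ridge regression standing in (as an assumption) for the paper's estimation machinery:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, n_bins = 200, 5000, 50
X = rng.integers(0, 3, size=(n, m)).astype(float)       # marker genotypes (0/1/2)
y = X[:, ::100] @ rng.normal(0, 0.5, n_bins) + rng.normal(0, 1, n)

# Collapse markers into contiguous bins by averaging within each bin
B = X.reshape(n, n_bins, m // n_bins).mean(axis=2)

lam = 1.0                                               # ridge penalty
beta = np.linalg.solve(B.T @ B + lam * np.eye(n_bins), B.T @ y)
print("first five estimated bin effects:", np.round(beta[:5], 3))
```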

  8. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.

    Science.gov (United States)

    Moray, Neville; Groeger, John; Stanton, Neville

    2017-02-01

    This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement, and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.

  9. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  10. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  11. Quantitative color analysis for capillaroscopy image segmentation.

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of color space and contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted that highlights the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, yields average accuracy values higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. These results are particularly encouraging for future developments in the classification of capillary patterns with respect to dermatologic and rheumatic diseases.

  12. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns arise with the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  13. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    Science.gov (United States)

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating the fundamental cell biology principles that govern tissue morphogenesis is critical to a better understanding of developmental biology and to engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however, elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern-recognition pipeline consisting of: the conversion of cellular images into networks, the extraction of novel features via network analysis, and the generation of morphogenic trajectories. This methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatio-temporal pattern kinetic differences in ESC aggregates of different sizes.
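
    The first pipeline step, converting a cellular image into a network, can be sketched by building an adjacency graph from cell centroids; Delaunay triangulation and the three features below are illustrative choices, not necessarily those of the paper.

```python
import networkx as nx
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)
pts = rng.uniform(0, 100, size=(150, 2))       # synthetic cell centroids

tri = Delaunay(pts)
G = nx.Graph()
for s in tri.simplices:                        # connect cells sharing a triangle edge
    G.add_edges_from([(s[0], s[1]), (s[1], s[2]), (s[2], s[0])])

features = {
    "mean_degree": float(np.mean([d for _, d in G.degree()])),
    "clustering": nx.average_clustering(G),
    "diameter": nx.diameter(G),
}
print(features)
```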

  14. Financial indicators for municipalities: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    Srečko Devjak

    2009-12-01

    Full Text Available From the characterization of Local Authority financing models and structures in Portugal and Slovenia, a set of financial and generic budget indicators has been established. These indicators may be used in a comparative analysis considering the Bragança District in Portugal and municipalities of similar population size in Slovenia. The research identified significant differences in terms of financing sources, due to discrepancies between the financial models and competences of municipalities in each country. The results show that, for the economy indicator, Portuguese and Slovenian municipalities had similar ranking behaviour in 2003, but that this behaviour changed in 2004.

  15. Simulation Analysis of a Quantitative Model for Personal Credit Risk Assessment in Commercial Banks

    Institute of Scientific and Technical Information of China (English)

    宋强; 刘洋

    2015-01-01

    Personal credit business in commercial banks suffers from inadequate management and lagging risk-information systems, so a quantitative analysis model for personal credit risk assessment is needed to improve banks' credit risk management capability. A quantitative analysis model for personal credit risk assessment in commercial banks based on multilayer stepwise dynamic assessment is therefore proposed. The article first analyzes the types of personal credit risk in commercial banks and their influencing factors, and then establishes a parameter evaluation system for personal credit risk assessment. Based on preliminary data analysis, the rank gradient values in the personal credit risk standard are calculated, and the quantitative analysis model is optimized accordingly. Simulation results show that quantitative analysis with this model yields accurate data-fitting results and effectively guards against credit risk; by establishing effective loss-compensation mechanisms through multiple channels, it improves commercial banks' capability for credit risk prevention and control.

  16. Toward quantitative modeling of silicon phononic thermocrystals

    Energy Technology Data Exchange (ETDEWEB)

    Lacatena, V. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France); IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Haras, M.; Robillard, J.-F., E-mail: jean-francois.robillard@isen.iemn.univ-lille1.fr; Dubois, E. [IEMN UMR CNRS 8520, Institut d' Electronique, de Microélectronique et de Nanotechnologie, Avenue Poincaré, F-59652 Villeneuve d' Ascq (France); Monfray, S.; Skotnicki, T. [STMicroelectronics, 850, rue Jean Monnet, F-38926 Crolles (France)

    2015-03-16

    The wealth of patterning technologies with deca-nanometer resolution brings opportunities to artificially modulate thermal transport properties. A promising example is given by the recent concepts of 'thermocrystals' or 'nanophononic crystals' that introduce regular nano-scale inclusions with a pitch between the thermal phonon mean free path and the electron mean free path. In such structures, the lattice thermal conductivity is reduced by up to two orders of magnitude with respect to its bulk value. Beyond the promise held by these materials to overcome the well-known “electron crystal-phonon glass” dilemma faced in thermoelectrics, the quantitative prediction of their thermal conductivity poses a challenge. This work paves the way toward understanding and designing silicon nanophononic membranes by means of molecular dynamics simulation. Several systems are studied in order to distinguish the shape contribution: bulk material, ultra-thin membranes (8 to 15 nm), 2D phononic crystals, and finally 2D phononic membranes. After discussing the equilibrium properties of these structures from 300 K to 400 K, the Green-Kubo methodology is used to quantify the thermal conductivity. The results account for several experimental trends and models. It is confirmed that both the thin-film geometry and the phononic structure act to reduce the thermal conductivity. The further decrease in the phononic-engineered membrane clearly demonstrates that the two phenomena are cumulative. Finally, limitations of the model and further perspectives are discussed.
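
    The Green-Kubo step can be sketched as the integral of the heat-flux autocorrelation function; the flux series below is synthetic noise for illustration, where in practice J(t) comes from an equilibrium MD trajectory.

```python
import numpy as np

kB = 1.380649e-23                      # J/K
T, V, dt = 300.0, 1e-24, 1e-15         # K, m^3, s (illustrative values)

rng = np.random.default_rng(4)
J = rng.normal(0.0, 1e9, 50_000)       # one component of the heat flux, W/m^2

n_corr = 1000                          # correlation window (time steps)
acf = np.array([np.mean(J[: len(J) - k] * J[k:]) for k in range(n_corr)])

# kappa = V / (kB * T^2) * integral of <J(0) J(t)> dt  (rectangle rule)
kappa = V / (kB * T**2) * acf.sum() * dt
print(f"kappa ~ {kappa:.3e} W/(m*K)")
```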

  17. Identification of character-impact odorants in a cola-flavored carbonated beverage by quantitative analysis and omission studies of aroma reconstitution models.

    Science.gov (United States)

    Lorjaroenphon, Yaowapa; Cadwallader, Keith R

    2015-01-28

    Thirty aroma-active components of a cola-flavored carbonated beverage were quantitated by stable isotope dilution assays, and their odor activity values (OAVs) were calculated. The OAV results revealed that 1,8-cineole, (R)-(-)-linalool, and octanal made the greatest contribution to the overall aroma of the cola. A cola aroma reconstitution model was constructed by adding 20 high-purity standards to an aqueous sucrose-phosphoric acid solution. The results of headspace solid-phase microextraction and sensory analyses were used to adjust the model to better match authentic cola. The rebalanced model was used as a complete model for the omission study. Sensory results indicated that omission of a group consisting of methyleugenol, (E)-cinnamaldehyde, eugenol, and (Z)- and (E)-isoeugenols differed from the complete model, while omission of the individual components of this group did not differ from the complete model. These results indicate that a balance of numerous odorants is responsible for the characteristic aroma of cola-flavored carbonated beverages.
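
    OAV computation itself is a simple ratio of concentration to odor threshold; the values below are placeholders, not the paper's measurements.

```python
# name: (concentration in ug/L, odor threshold in ug/L) - placeholder values
compounds = {
    "1,8-cineole":      (120.0, 1.1),
    "(R)-(-)-linalool": (80.0, 0.6),
    "octanal":          (15.0, 0.7),
}
for name, (conc, thresh) in sorted(
        compounds.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True):
    print(f"{name:18s} OAV = {conc / thresh:7.1f}")   # OAV >> 1 suggests aroma impact
```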

  18. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 2. Performance, Emissions, and Cost of Combustion-Based NOx Controls for Wall and Tangential Furnace Coal-Fired Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Tran, Loan K. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 2 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  19. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure.

    Science.gov (United States)

    Chen, Wen Hao; Yang, Sam Y S; Xiao, Ti Qiao; Mayo, Sherry C; Wang, Yu Dan; Wang, Hai Peng

    2014-05-01

    Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials-characterization challenge, particularly in samples where compositions are distributed across a range of length scales and have similar X-ray absorption properties, such as in coal. Consequently, conventional approaches may not provide the desired resolution and level of detail within sub-regions of a multi-length-scale sample. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach can dramatically improve the spatial resolution, revealing finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach, with the optimal experimental parameters analyzed beforehand. The quantitative results demonstrate that the approach reveals significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of the minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials.

  20. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
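
    A sketch of the multi-parameter idea: standardize a cells-by-features matrix (four synthetic microtubule features here) and inspect group separation along the principal components.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
control = rng.normal([1.0, 0.5, 2.0, 0.3], 0.1, size=(30, 4))   # 4 features per cell
treated = rng.normal([0.7, 0.8, 1.5, 0.5], 0.1, size=(30, 4))
X = np.vstack([control, treated])

Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize features
pca = PCA(n_components=2)
scores = pca.fit_transform(Z)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("PC1 group means:", scores[:30, 0].mean().round(2), scores[30:, 0].mean().round(2))
```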

  1. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given... time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow an understanding of some genetic mechanisms related to the length of the productive life...

  2. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    Directory of Open Access Journals (Sweden)

    Peterhansel Christoph

    2007-09-01

    Full Text Available Abstract Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishment of the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data is often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We experienced that the use of quantitative PCR (QPCR) is crucial for obtaining high quality ChIP data and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and present comprehensive insights into data normalization.
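
    As an illustration of why quantitative analysis of the precipitate matters, a common ChIP-qPCR normalization is the percent-of-input calculation, sketched below with invented Ct values; the paper discusses several normalization strategies, of which this is only one.

```python
import math

def percent_input(ct_input: float, ct_ip: float, input_fraction: float = 0.01) -> float:
    """Adjust the input Ct for its dilution, then compare with the IP Ct:
    percent input = 100 * 2**(adjusted_input_Ct - IP_Ct)."""
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)   # e.g. 1% input
    return 100.0 * 2.0 ** (ct_input_adj - ct_ip)

print(f"{percent_input(ct_input=24.0, ct_ip=28.5):.3f} % of input")
```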

  3. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of thinking critically about how to analyze data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  4. The Curriculum in Quantitative Analysis: Results of a Survey.

    Science.gov (United States)

    Locke, David C.; Grossman, William E. L.

    1987-01-01

    Reports on the results of a survey of college level instructors of quantitative analysis courses. Discusses what topics are taught in such courses, how much weight is given to these topics, and which experiments are used in the laboratory. Poses some basic questions about the curriculum in quantitative analysis. (TW)

  5. Qualitative vs. quantitative software process simulation modelling: conversion and comparison

    OpenAIRE

    Zhang, He; Kitchenham, Barbara; Jeffery, Ross

    2009-01-01

    Software Process Simulation Modeling (SPSM) research has increased in the past two decades. However, most of these models are quantitative, requiring detailed understanding and accurate measurement. Continuing our previous studies on qualitative modeling of software processes, this paper aims to investigate the structural equivalence and model conversion between quantitative and qualitative process modeling, and to compare the characteristics and performance o...

  6. Quantitative modelling of the biomechanics of the avian syrinx

    NARCIS (Netherlands)

    Elemans, C.P.H.; Larsen, O.N.; Hoffmann, M.R.; Leeuwen, van J.L.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  7. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  8. Qualitative and quantitative analysis of the students’ perceptions to the use of 3D electronic models in problem-based learning

    Directory of Open Access Journals (Sweden)

    Hai Ming Wong

    2017-06-01

    Full Text Available The Faculty of Dentistry of the University of Hong Kong has introduced innovative blended problem-based learning (PBL) with the aid of 3D electronic models (e-models) into the Bachelor of Dental Surgery (BDS) curriculum. Statistical results of pre- and post-semester questionnaire surveys illustrated the compatibility of e-models in PBL settings. The students' importance ratings of two objectives, "Complete assigned tasks on time" and "Active listener", and of twenty-two facilitator evaluation items, including critical thinking and group problem-solving skills, increased significantly. The students' PBL preparation behavior and their attention to problem understanding, problem analysis, and learning-resource quality were also found to be related to the online support of e-models and the associated software. Qualitative analysis of open-ended questions with the visual text-analytics software "Leximancer" improved the validity of the statistical results. Using e-model functions in treatment planning, problem analysis and giving instructions provided a method of informative communication. It is therefore critical for the faculty to continuously provide facilitator training and quality online e-model resources to the students.

  9. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually constructed based on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification was proposed based on a hierarchical model-structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for the formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe a complex system more comprehensively, and that introducing qualitative models into quantitative simulation yields a higher survival probability for the target.

  10. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  11. A Team Mental Model Perspective of Pre-Quantitative Risk

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  12. Some Epistemological Considerations Concerning Quantitative Analysis

    Science.gov (United States)

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  13. Quantitative analysis of tumor burden in mouse lung via MRI.

    Science.gov (United States)

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.

  14. A mixed-model quantitative trait loci (QTL) analysis for multiple-environment trial data using environmental covariables for QTL-by-environment interactions, with an example in maize.

    Science.gov (United States)

    Boer, Martin P; Wright, Deanne; Feng, Lizhi; Podlich, Dean W; Luo, Lang; Cooper, Mark; van Eeuwijk, Fred A

    2007-11-01

    Complex quantitative traits of plants as measured on collections of genotypes across multiple environments are the outcome of processes that depend in intricate ways on genotype and environment simultaneously. For a better understanding of the genetic architecture of such traits as observed across environments, genotype-by-environment interaction should be modeled with statistical models that use explicit information on genotypes and environments. The modeling approach we propose explains genotype-by-environment interaction by differential quantitative trait locus (QTL) expression in relation to environmental variables. We analyzed grain yield and grain moisture for an experimental data set composed of 976 F5 maize testcross progenies evaluated across 12 environments in the U.S. corn belt during 1994 and 1995. The strategy we used was based on mixed models and started with a phenotypic analysis of multi-environment data, modeling genotype-by-environment interactions and associated genetic correlations between environments, while taking into account intraenvironmental error structures. The phenotypic mixed models were then extended to QTL models via the incorporation of marker information as genotypic covariables. A majority of the detected QTL showed significant QTL-by-environment interactions (QEI). The QEI were further analyzed by including environmental covariates into the mixed model. Most QEI could be understood as differential QTL expression conditional on longitude or year, both consequences of temperature differences during critical stages of the growth.

  15. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, often called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam industrial estates of Chennai faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed with Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT) and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here approximate that relationship closely. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing cost, procurement cost and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralized facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  16. Estimation of prevalence of Salmonella on pig carcasses and pork joints, using a quantitative risk assessment model aided by meta-analysis.

    Science.gov (United States)

    Barron, Ursula Gonzales; Soumpasis, Ilias; Butler, Francis; Prendergast, Deirdre; Duggan, Sharon; Duffy, Geraldine

    2009-02-01

    This risk assessment study aimed to estimate the prevalence of Salmonella on pig carcasses and pork joints produced in slaughterhouses, on the basis that within groups of slaughter there is a strong association between the proportion of Salmonella-positive animals entering the slaughter lines (x) and the resulting proportion of contaminated eviscerated pig carcasses (y). To this effect, the results of a number of published studies reporting estimates of x and y were assembled in order to model a stochastic weighted regression considering the sensitivities of the diverse Salmonella culture methods. Meta-analysis was used to assign weights to the regression and to estimate the overall effect of chilling on Salmonella incidence on pig carcasses. The model's ability to produce accurate estimates and the intrinsic effectiveness of the modeling capabilities of meta-analysis were appraised using Irish data for the input parameter of prevalence of Salmonella carrier slaughter pigs. The model approximated a Salmonella prevalence in pork joints from Irish boning halls of 4.0% (95% confidence interval, 0.3 to 12.0%) and was validated by the results of a large survey (n = 720) of Salmonella in pork joints (mean, 3.3%; 95% confidence interval, 2.0 to 4.6%) carried out in four commercial pork abattoirs as part of this research project. Sensitivity analysis reinforced the importance of final rinsing (r = -0.382) and chilling (r = -0.221) as stages that contribute to reducing considerably the occurrence of Salmonella on the final product, while hygiene practices during jointing seemed to moderate only marginally the amount of contaminated pork joints. Finally, the adequacy of meta-analysis for integrating different findings and producing distributions for use in stochastic modeling was demonstrated.
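
    The study couples a weighted regression of carcass contamination on lineage prevalence with meta-analytic weights. As a minimal illustration of the inverse-variance weighting that underlies such pooling, assuming each study contributes an estimate and a standard error (the paper's stochastic, sensitivity-adjusted regression is more involved):

        import numpy as np

        def inverse_variance_pool(estimates, std_errors):
            """Fixed-effect meta-analytic pooling: weight each study by 1/SE^2."""
            w = 1.0 / np.asarray(std_errors, dtype=float) ** 2
            est = np.asarray(estimates, dtype=float)
            pooled = np.sum(w * est) / np.sum(w)
            pooled_se = np.sqrt(1.0 / np.sum(w))
            return pooled, pooled_se

        # e.g., per-study estimates of the chilling effect (illustrative numbers)
        pooled, se = inverse_variance_pool([-0.9, -1.2, -0.7], [0.3, 0.4, 0.25])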

  17. Triaxially deformed relativistic point-coupling model for $\\Lambda$ hypernuclei: a quantitative analysis of hyperon impurity effect on nuclear collective properties

    CERN Document Server

    Xue, W X; Hagino, K; Li, Z P; Mei, H; Tanimura, Y

    2014-01-01

    The impurity effect of the hyperon on atomic nuclei has received renewed interest in nuclear physics since the first experimental observation of an appreciable reduction of the $E2$ transition strength in low-lying states of the hypernucleus $^{7}_\Lambda$Li. Many more data on low-lying states of $\Lambda$ hypernuclei will be measured soon for $sd$-shell nuclei, providing good opportunities to study the $\Lambda$ impurity effect on nuclear low-energy excitations. We carry out a quantitative analysis of the $\Lambda$ hyperon impurity effect on the low-lying states of $sd$-shell nuclei at the beyond-mean-field level, based on a relativistic point-coupling energy density functional (EDF), considering that the $\Lambda$ hyperon is injected into the lowest positive-parity ($\Lambda_s$) and negative-parity ($\Lambda_p$) states. We adopt a triaxially deformed relativistic mean-field (RMF) approach for hypernuclei and calculate the $\Lambda$ binding energies of hypernuclei as well as the potential energy surfaces (PESs) in the $(\beta, \gamma)$ deformation plane…

  18. Qualitative and quantitative high performance thin layer chromatography analysis of Calendula officinalis using high resolution plate imaging and artificial neural network data modelling.

    Science.gov (United States)

    Agatonovic-Kustrin, S; Loescher, Christine M

    2013-10-10

    Calendula officinalis, commonly known as marigold, has traditionally been used for its anti-inflammatory effects. The aim of this study was to investigate the capacity of an artificial neural network (ANN) to analyse thin layer chromatography (TLC) chromatograms as fingerprint patterns for quantitative estimation of chlorogenic acid, caffeic acid and rutin in Calendula plant extracts. By applying samples with different weight ratios of the marker compounds to the system, a database of chromatograms was constructed. One hundred and one signal intensities in each of the HPTLC chromatograms were correlated to the amounts of applied chlorogenic acid, caffeic acid and rutin using an ANN. The developed ANN correlation was used to quantify the amounts of the 3 marker compounds in Calendula plant extracts. Minimum quantifiable levels (MQL) of 610, 190 and 940 ng and limits of detection (LD) of 183, 57 and 282 ng were established for chlorogenic acid, caffeic acid and rutin, respectively. A novel method for quality control of herbal products, based on HPTLC separation, high-resolution digital plate imaging and ANN data analysis, has been developed. The proposed method can be adopted for routine evaluation of the phytochemical variability in Calendula extracts.
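
    As a rough illustration of the data flow described above, 101 signal intensities per chromatogram lane mapped to the amounts of three marker compounds, a small feed-forward network can be trained; scikit-learn's MLPRegressor stands in here for the authors' unspecified ANN architecture, and the data are synthetic placeholders.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.random((60, 101))  # 101 signal intensities per lane (synthetic)
        y = rng.random((60, 3))    # applied amounts of chlorogenic acid, caffeic acid, rutin

        ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
        ann.fit(X, y)                 # MLPRegressor supports multi-output regression
        amounts = ann.predict(X[:1])  # estimated marker amounts for a new lane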

  19. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    Science.gov (United States)

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Gas chromatography for the on-line monitoring of transformer dissolved gases has the drawbacks of requiring a carrier gas and regular calibration, as well as limited safety; we therefore attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared spectroscopy. Taking into account the small amounts of the characteristic gases, the many components involved, the detection-limit and safety requirements, and the difficulty for the degasser of eliminating interfering gases, the quantitative analysis model was established based on sparse partial least squares, piecewise section correction and a feature-variable extraction algorithm using an improved TR regularization. For the characteristic gases CH4, C2H4, C2H6 and CO2, the results show that FTIR meets the DGA requirements with a spectral resolution of 1 cm(-1) and an optical path of 10 cm.
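
    The calibration chain outlined above (feature-variable extraction followed by partial least squares on FTIR spectra) can be approximated in a few lines; scikit-learn offers no sparse PLS, so ordinary PLSRegression on pre-selected wavenumber windows is used here as a stand-in, and both the spectra and the index windows are synthetic assumptions.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        spectra = rng.random((40, 2000))  # absorbance at 2000 wavenumbers (synthetic)
        conc = rng.random((40, 4))        # concentrations of the four characteristic gases

        # Crude "feature variable extraction": keep windows around assumed gas bands.
        selected = np.r_[900:1100, 1200:1400]  # illustrative indices, not real band positions
        pls = PLSRegression(n_components=6)
        pls.fit(spectra[:, selected], conc)
        predicted = pls.predict(spectra[:1, selected])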

  20. Quantitative analysis of the ribbon synapse number of cochlear inner hair cells in C57BL/6J mice using the three-dimensional modeling method

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In mammals, the ribbon synapses of cochlear inner hair cells are the synaptic structure of the first sensory neuron in the pathway transmitting acoustic signals to the auditory center, and they are directly involved in voice coding and neurotransmitter release. It is difficult to quantitatively analyze the number of ribbon synapses using only an electron microscope, because their number is relatively limited and their location is deep. In this study, the specific presynaptic structure RIBEYE and the non-specific postsynaptic structure GluR2/3 in C57BL/6J mouse basilar membrane samples were labeled by immunofluorescence. Serial sections of the samples were acquired with a laser scanning confocal microscope, and the serial sections were used to build three-dimensional models in 3DS MAX software. Each colocalized fluorescence color pair indicates one synapse, so the number of ribbon synapses per inner hair cell can be obtained. The spatial distribution and number of ribbon synapses of cochlear inner hair cells were clearly shown in this experiment, and the mean number of ribbon synapses per inner hair cell was 16.10±1.03. Our results demonstrate that the number of ribbon synapses can be accurately calculated by double immunofluorescent labeling of presynaptic and postsynaptic structures, serial sectioning with a laser scanning confocal microscope, and three-dimensional modeling in 3DS MAX software. The method is feasible and is significant for further exploring the mechanism of sensorineural deafness.
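
    The counting rule used in this record, one synapse per colocalized presynaptic/postsynaptic fluorescence pair, can be emulated by matching RIBEYE puncta to GluR2/3 puncta within a distance tolerance. A sketch under that assumption, with hypothetical coordinates and threshold:

        import numpy as np
        from scipy.spatial import cKDTree

        def count_synapses(pre_xyz, post_xyz, max_dist=0.5):
            """Count presynaptic puncta with a postsynaptic punctum within max_dist (um)."""
            tree = cKDTree(post_xyz)
            dist, _ = tree.query(pre_xyz, k=1)  # nearest postsynaptic punctum per RIBEYE spot
            return int(np.sum(dist <= max_dist))

        # Dividing the per-field count by the number of inner hair cells in the field
        # yields synapses per IHC, the quantity reported above (16.10 ± 1.03).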

  1. Quantitative Analysis on the Energy and Environmental Impact of the Korean National Energy R&D Roadmap Using a Bottom-Up Energy System Model

    Directory of Open Access Journals (Sweden)

    Sang Jin Choi

    2017-03-01

    According to the Paris Agreement at the 21st Conference of the Parties, the 196 member states are obliged to submit their Intended Nationally Determined Contributions (INDC) every 5 years. As a member, South Korea has already proposed its reduction target and will need to report its achievement, as an outcome of its policies and efforts, in the near future. In this paper, a Korean bottom-up energy system model built to support the low-carbon national energy R&D roadmap is introduced, and through the modeling of various scenarios, the mid- to long-term impact on energy consumption and CO2 emissions is analyzed as well. The results of the analysis showed that, assuming R&D investments in the 11 types of technologies, savings of 13.7% in final energy consumption relative to the baseline scenario would be feasible by 2050. Furthermore, in the field of power generation, the share of new and renewable energy generation is expected to increase from 3.0% as of 2011 to 19.4% by 2050. This research also suggests that analysis of the Energy Technology R&D Roadmap based on the model can be used not only for overall impact analysis and R&D portfolio establishment, but also for the development of detailed R&D strategies.

  2. Structural and quantitative analysis of Equisetum alkaloids.

    Science.gov (United States)

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method, in combination with a simple sample work-up, was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by the HPLC-ESI-MS/MS method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and the alkaloid palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  3. Quantitative analysis of multiple sclerosis: a feasibility study

    Science.gov (United States)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. In the treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data are modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty reflecting neighborhood correlation within the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on the quantitative analysis of MS are presented.

  4. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best-balanced performance in resource utilization and restoration time.

  5. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best-balanced performance in resource utilization and restoration time.

  6. Quantitative and qualitative analysis of sterols/sterolins and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-06-03

    Quantitative and qualitative analysis of sterols/sterolins ... method was developed to identify and quantify sterols (especially β-sitosterol) in chloroform extracts of ... Studies with phytosterols, especially β-sitosterol, have.

  7. The Impact of Arithmetic Skills on Mastery of Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bruce K. Blaylock

    2012-01-01

    Over the past several years, math education has moved from a period when all math calculations were done by hand to an era in which most calculations are done using a calculator or computer. There are certainly benefits to this approach, but when one concomitantly recognizes the declining scores on national standardized mathematics exams, it raises the question: "Could the lack of non-technology-assisted arithmetic manipulation skills carry over to the understanding of higher-level mathematical concepts, or is it just a spurious correlation?" Eighty-seven students were tested for their ability to do simple arithmetic and algebra by hand. These scores were then regressed on three important areas of quantitative analysis: recognizing the appropriate tool to use in an analysis, creating a model to carry out the analysis, and interpreting the results of the analysis. The study revealed a significant relationship between the ability to do arithmetic calculations accurately and both the ability to recognize the appropriate tool and the ability to create a model. It found no significant relationship between arithmetic skills and results interpretation.

  8. Glioblastoma multiforme: exploratory radiogenomic analysis by using quantitative image features.

    Science.gov (United States)

    Gevaert, Olivier; Mitchell, Lex A; Achrol, Achal S; Xu, Jiajing; Echegaray, Sebastian; Steinberg, Gary K; Cheshier, Samuel H; Napel, Sandy; Zaharchuk, Greg; Plevritis, Sylvia K

    2014-10-01

    To derive quantitative image features from magnetic resonance (MR) images that characterize the radiographic phenotype of glioblastoma multiforme (GBM) lesions and to create radiogenomic maps associating these features with various molecular data. Clinical, molecular, and MR imaging data for GBMs in 55 patients were obtained from the Cancer Genome Atlas and the Cancer Imaging Archive after local ethics committee and institutional review board approval. Regions of interest (ROIs) corresponding to enhancing necrotic portions of tumor and peritumoral edema were drawn, and quantitative image features were derived from these ROIs. Robust quantitative image features were defined on the basis of an intraclass correlation coefficient of 0.6 for a digital algorithmic modification and a test-retest analysis. The robust features were visualized by using hierarchic clustering and were correlated with survival by using Cox proportional hazards modeling. Next, these robust image features were correlated with manual radiologist annotations from the Visually Accessible Rembrandt Images (VASARI) feature set and GBM molecular subgroups by using nonparametric statistical tests. A bioinformatic algorithm was used to create gene expression modules, defined as a set of coexpressed genes together with a multivariate model of cancer driver genes predictive of the module's expression pattern. Modules were correlated with robust image features by using the Spearman correlation test to create radiogenomic maps and to link robust image features with molecular pathways. Eighteen image features passed the robustness analysis and were further analyzed for the three types of ROIs, for a total of 54 image features. Three enhancement features were significantly correlated with survival, 77 significant correlations were found between robust quantitative features and the VASARI feature set, and seven image features were correlated with molecular subgroups (P < .05 for all). A radiogenomics map was

  9. Quantitative analysis of polymorphic mixtures of ranitidine hydrochloride by Raman spectroscopy and principal components analysis.

    Science.gov (United States)

    Pratiwi, Destari; Fawcett, J Paul; Gordon, Keith C; Rades, Thomas

    2002-11-01

    Ranitidine hydrochloride exists as two polymorphs, forms I and II, both of which are used to manufacture commercial tablets. Raman spectroscopy can be used to differentiate the two forms, but univariate methods of quantitative analysis of one polymorph as an impurity in the other lack sensitivity. We have applied principal components analysis (PCA) of Raman spectra to binary mixtures of the two polymorphs and to binary mixtures prepared by adding one polymorph to powdered tablets of the other. Based on absorption measurements of seven spectral regions, it was found that >97% of the spectral variation was accounted for by three principal components. Quantitative calibration models generated by multiple linear regression predicted a detection limit and quantitation limit for either form I or II in mixtures of the two of 0.6 and 1.8%, respectively. This study demonstrates that PCA of Raman spectroscopic data provides a sensitive method for the quantitative analysis of polymorphic impurities of drugs in commercial tablets, with a quantitation limit of less than 2%.
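
    The workflow of this record, PCA of Raman spectra followed by multiple linear regression on the leading components, can be sketched as follows; three components are kept, as found to carry >97% of the spectral variation, while the spectra and reference compositions are synthetic placeholders.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(2)
        spectra = rng.random((30, 700))  # intensities over the seven spectral regions
        form2_frac = rng.random(30)      # known form II content of calibration mixtures

        pca = PCA(n_components=3).fit(spectra)
        calib = LinearRegression().fit(pca.transform(spectra), form2_frac)
        estimate = calib.predict(pca.transform(spectra[:1]))  # content of an unknown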

  10. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assessing the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end, we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  11. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  12. Research on Petroleum Reservoir Diagenesis and Damage Using EDS Quantitative Analysis Method With Standard Samples

    Institute of Scientific and Technical Information of China (English)

    包书景; 陈文学; et al.

    2000-01-01

    In recent years, the X-ray spectrometer has developed not only toward higher resolution, but also toward dynamic analysis, computer modeling processing, standard-sample quantitative analysis and ultra-light element analysis. With the gradual sophistication of the quantitative analysis system software, the rationality and accuracy of the established standard-sample reference documents have become the most important guarantee of the reliability of sample quantitative analysis. This work is an important technical subject in China's petroleum reservoir research. Through two years of research and experimental work, an EDS quantitative analysis method for petroleum geology and reservoir research has been established, and reference documents for five mineral (silicate, etc.) specimen standards have been compiled. By closely combining the morphological and compositional characteristics of the minerals and applying them to reservoir diagenesis research and the prevention of damage to oil formations, we have obtained clear geological results.

  13. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Background: Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a selected part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results: Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes that annealing of complementary target strands and annealing of target and primers are both reversible reactions and reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except possibly for different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate-constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the
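
    The stepwise logic such models share, a per-cycle efficiency recomputed from the annealing chemistry rather than held constant, can be caricatured in a few lines. The saturating efficiency function below is a deliberately simple assumption, not the equilibrium solutions of Model 1 or Model 2.

        def qpcr_curve(target0=1e3, primer0=1e12, K=1e10, cycles=40):
            """Stepwise qPCR sketch: per-cycle efficiency falls as primers deplete."""
            target, primer, curve = target0, primer0, []
            for _ in range(cycles):
                efficiency = primer / (primer + K)  # toy stand-in for the annealing equilibrium
                new_copies = efficiency * target
                target += new_copies
                primer = max(primer - 2.0 * new_copies, 0.0)  # two primers consumed per copy
                curve.append(target)
            return curve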

  14. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    Science.gov (United States)

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features occurring with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < .05). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < .05). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
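
    First-order texture metrics such as the entropy values compared in this record are computed from the intensity histogram of a volume of interest. A minimal version, with an arbitrary bin count:

        import numpy as np

        def first_order_entropy(voi_values, bins=64):
            """Shannon entropy (bits) of the intensity histogram within a volume of interest."""
            hist, _ = np.histogram(voi_values, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]  # drop empty bins before taking the log
            return float(-(p * np.log2(p)).sum())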

  15. Quantitative analysis and parametric display of regional myocardial mechanics

    Science.gov (United States)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color-mapped onto a deformable heart model to obtain a better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or dyskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping; the absolute values of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggests that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and dyskinesis associated with ischemia and infarction. Similarly, disturbances in

  16. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM...... and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments...

  17. Quantitative analysis of regulatory flexibility under changing environmental conditions

    Science.gov (United States)

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops contribute the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate the flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818

  18. Quantitative analysis of thermal insulation coatings

    DEFF Research Database (Denmark)

    Kiil, Søren

    2014-01-01

    This work concerns the development of simulation tools for mapping of insulation properties of thermal insulation coatings based on selected functional filler materials. A mathematical model, which includes the underlying physics (i.e. thermal conductivity of a heterogeneous two-component coating...... and porosity and thermal conductivity of selected fillers) was recently developed. The model has been validated against data from a previous experimental investigation with hollow glass sphere-based epoxy and acrylic coatings. In this presentation, a concise introduction to the model and some of the simulation...

  19. Quantitative genetic analysis of injury liability in infants and toddlers

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on the total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low-order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher-risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  20. Quantitative image analysis of WE43-T6 cracking behavior

    Science.gov (United States)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  1. Quantitative magnetospheric models: results and perspectives.

    Science.gov (United States)

    Kuznetsova, M.; Hesse, M.; Gombosi, T.; CSEM Team

    Global magnetospheric models are an indispensable tool that allows multi-point measurements to be put into global context. Significant progress has been achieved in global MHD modeling of magnetosphere structure and dynamics. Medium-resolution simulations confirm the general topological picture suggested by Dungey. State-of-the-art global models with adaptive grids allow simulations with a highly resolved magnetopause and magnetotail current sheet. Advanced high-resolution models are capable of reproducing transient phenomena, such as FTEs associated with the formation of flux ropes or plasma bubbles embedded in the magnetopause, and demonstrate the generation of vortices at the magnetospheric flanks. On the other hand, there is still controversy about the global state of the magnetosphere predicted by MHD models, to the point of questioning the length of the magnetotail and the location of the reconnection sites within it. For example, for steady southward IMF driving conditions, resistive MHD simulations produce a steady configuration with an almost stationary near-Earth neutral line, while there is plenty of observational evidence of a periodic loading-unloading cycle during long periods of southward IMF. Successes and challenges in the global modeling of magnetospheric dynamics will be addressed. One of the major challenges is to quantify the interaction between large-scale global magnetospheric dynamics and microphysical processes in diffusion regions near reconnection sites. Possible solutions to these controversies will be discussed.

  2. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author argues that historic and current HRA have failed to inform policy makers who make risk-based decisions about the contribution of humans to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with rational debate over the weight given to subjective and empirical probabilities.

  3. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference while considering uncertainty in the genotype calls. We present QuASAR (quantitative allele-specific analysis of reads), a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu.
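
    Ignoring QuASAR's joint genotyping and its error and over-dispersion parameters, the core ASE test at a known heterozygous site reduces to a binomial test of the 1:1 allelic ratio, as sketched below with made-up read counts.

        from scipy.stats import binomtest

        ref_reads, alt_reads = 63, 37  # RNA-seq reads supporting each allele (illustrative)
        result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)
        print(result.pvalue)  # a small p-value suggests allele-specific expression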

  4. Quantitative analysis of thermal insulation coatings

    DEFF Research Database (Denmark)

    Kiil, Søren

    2014-01-01

    This work concerns the development of simulation tools for mapping of insulation properties of thermal insulation coatings based on selected functional filler materials. A mathematical model, which includes the underlying physics (i.e. thermal conductivity of a heterogeneous two-component coating...... and porosity and thermal conductivity of selected fillers) was recently developed. The model has been validated against data from a previous experimental investigation with hollow glass sphere-based epoxy and acrylic coatings. In this presentation, a concise introduction to the model and some of the simulation...... results are provided. A practical case story with an insulation coating applied to a hot water pipe is included. Further development of the simulation tool to other types of fillers will be shortly discussed....

  5. A Quantitative Model to Estimate Drug Resistance in Pathogens

    Directory of Open Access Journals (Sweden)

    Frazier N. Baker

    2016-12-01

    Pneumocystis pneumonia (PCP) is an opportunistic infection that occurs in humans and other mammals with debilitated immune systems. These infections are caused by fungi in the genus Pneumocystis, which are not susceptible to standard antifungal agents. Despite decades of research and drug development, the primary treatment and prophylaxis for PCP remains a combination of trimethoprim (TMP) and sulfamethoxazole (SMX), which targets two enzymes in folic acid biosynthesis, dihydrofolate reductase (DHFR) and dihydropteroate synthase (DHPS), respectively. There is growing evidence of emerging resistance by Pneumocystis jirovecii (the species that infects humans) to TMP-SMX, associated with mutations in the targeted enzymes. In the present study, we report the development of an accurate quantitative model to predict changes in the binding affinity of inhibitors (Ki, IC50) to the mutated proteins. The model is based on evolutionary information and amino acid covariance analysis. Predicted changes in binding affinity upon mutation correlate highly with the experimentally measured data. While trained on Pneumocystis jirovecii DHFR/TMP data, the model shows similar or better performance when evaluated on the resistance data for a different inhibitor of PjDHFR, another drug/target pair (PjDHPS/SMX) and another organism (Staphylococcus aureus DHFR/TMP). Therefore, we anticipate that the developed prediction model will be useful in the evaluation of possible resistance of newly sequenced variants of the pathogen and can be extended to other drug targets and organisms.

  6. Quantitative timed analysis of interactive Markov chains

    NARCIS (Netherlands)

    Guck, Dennis; Han, Tingting; Katoen, Joost-Pieter; Neuhausser, M.

    2012-01-01

    This paper presents new algorithms and accompanying tool support for analyzing interactive Markov chains (IMCs), a stochastic timed 1 1/2-player game in which delays are exponentially distributed. IMCs are compositional and act as a semantic model for engineering formalisms such as AADL and dynamic fa

  7. Quantitative analysis of phenylalanine, tyrosine, tryptophan and kynurenine in rat model for tauopathies by ultra-high performance liquid chromatography with fluorescence and mass spectrometry detection.

    Science.gov (United States)

    Galba, Jaroslav; Michalicova, Alena; Parrak, Vojtech; Novak, Michal; Kovac, Andrej

    2016-01-01

    We developed and validated a simple and sensitive ultra-high performance liquid chromatography (UHPLC) method for the analysis of phenylalanine (Phe), tyrosine (Tyr), tryptophan (Trp) and kynurenine (Kyn) in rat plasma. Analytes were separated on an Acquity UPLC HSS T3 column (2.1 mm×50 mm, 1.8 μm particle size) using a 4 min ammonium acetate (pH 5) gradient and detected by fluorescence and positive-ESI mass spectrometry. Sample preparation involved dilution of plasma, deproteinization with trichloroacetic acid and centrifugation. The procedure was validated in compliance with the FDA guideline. The limits of quantification (LOQ) were 0.3 μM for Kyn and from 1.5 to 3 μM for Phe, Tyr and Trp. The method showed excellent linearity, with regression coefficients higher than 0.99. The accuracy was within the range of 86-108%. The inter-day precision (n=5 days), expressed as % RSD, was in the range 1-13%. The benefit of using UHPLC is a short analysis time and thus very good sample throughput. Using this method, we analyzed plasma samples and detected significant changes of Kyn and Phe in a transgenic rat model for tauopathies.

  8. Hazard Response Modeling Uncertainty (A Quantitative Method)

    Science.gov (United States)

    1988-10-01

    … C_p is the concentration predicted by some component or model. The variance of C_o/C_p is calculated and defined as var(Model I), where Model I could be

  9. Quantitative analysis of cascade impactor samples - revisited

    Science.gov (United States)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected using a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique, and concentrations were corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. A broad-beam PIXE analysis approach is certainly not adequate in these cases.

  10. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,p'γ)7Li (γ: 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on a code, Emitted Radiation Yield Analysis (ERYA), which integrates the nuclear-reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
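
    A standard-free thick-target PIGE yield is obtained by integrating the excitation function over the proton's energy loss in the sample, Y ∝ ∫ σ(E)/S(E) dE, which is essentially what the ERYA code does. A schematic numerical version, with the cross section and stopping power supplied as callables and all names assumed:

        import numpy as np

        def thick_target_yield(sigma, stopping_power, e_beam, n_li, e_min=0.5, steps=2000):
            """Relative gamma yield: integrate n_li * sigma(E) / S(E) from e_min to e_beam.

            sigma(E)          : excitation function of the gamma-producing reaction
            stopping_power(E) : sample stopping power S(E)
            n_li              : atomic fraction of Li in the sample
            """
            energies = np.linspace(e_min, e_beam, steps)
            integrand = n_li * sigma(energies) / stopping_power(energies)
            return np.trapz(integrand, energies)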

  11. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the Space Station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social-logic considerations is suggested. The report further calls for systematic scientific investigation of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole down to an individual enclosure or work station. Taking as a point of departure the Isovist model developed by Dr. Michael Benedikt of the U. of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life-support concerns.

  12. Analysis of generalized interictal discharges using quantitative EEG.

    Science.gov (United States)

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of the absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges, and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  13. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    Science.gov (United States)

    Bruneton, Eric

    2016-10-27

    We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model, and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer simplifications and approximations are used to solve the physical equations, the more accurate the results. We conclude with a discussion of the advantages and drawbacks of each model, and of how their accuracy might be further improved.

  14. Quantitative MRI analysis of dynamic enhancement of focal liver lesions

    Directory of Open Access Journals (Sweden)

    S. S. Bagnenko

    2012-01-01

    In our study, 45 patients with different focal liver lesions (110 nodules) were examined using a high-field MR system (1.5 T). In this investigation, quantitative MRI analysis of the dynamic enhancement of various hepatic lesions and of the parenchymatous organs of the abdomen was performed. It was shown that quantitative evaluation of enhanced MRI improves the understanding of the vascular transformation processes in pathologic hepatic foci and in the liver itself, which is important for the differential diagnosis of these diseases.

  15. A quantitative model of technological catch-up

    Directory of Open Access Journals (Sweden)

    Hossein Gholizadeh

    2015-02-01

    This article presents a quantitative model for the analysis of the technological gap. The rates of development of technological leaders and followers in nanotechnology are expressed in terms of coupled equations. On the basis of this model, in a first step, the comparative technological gap and its rate of change are studied, and the dynamics of the gap between leader and follower are calculated. In a second step, we estimate the technology gap using the metafrontier approach and test the relationship between the technology gap and the quality of the catch-up dimensions identified in the previous step. The usefulness of this approach is then demonstrated in an analysis of the technological gap in nanotechnology between Iran, the leader in the Middle East, and the world, presenting the behaviors of the technological leader and followers. Finally, Iran's position is identified, and suggestions concerning the effective dimensions of catch-up are offered, which could be a foundation for Iran's long-term policies.
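
    The coupled leader/follower dynamics described in this record can be illustrated with a simple pair of growth equations in which the follower's rate rises with the remaining gap; the functional form and parameter values below are assumptions for illustration, not the paper's exact equations.

        import numpy as np
        from scipy.integrate import solve_ivp

        def catch_up(t, y, g_leader=0.03, g_follower=0.05, spillover=0.4):
            leader, follower = y
            gap = leader - follower
            # Follower grows on its own and also learns in proportion to the gap.
            return [g_leader * leader, g_follower * follower + spillover * gap]

        sol = solve_ivp(catch_up, (0.0, 30.0), [1.0, 0.2])
        gap_over_time = sol.y[0] - sol.y[1]  # dynamics of the technological gap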

  16. A Solved Model to Show Insufficiency of Quantitative Adiabatic Condition

    Institute of Scientific and Technical Information of China (English)

    LIU Long-Jiang; LIU Yu-Zhen; TONG Dian-Min

    2009-01-01

    The adiabatic theorem is a useful tool for processing slowly evolving quantum systems, but its practical application depends on the quantitative condition expressed by the Hamiltonian's eigenvalues and eigenstates, which is usually taken as a sufficient condition. Recently, the sufficiency of the condition was questioned, and several counterexamples have been reported. Here we present a new solved model to show the insufficiency of the traditional quantitative adiabatic condition.

  17. QUANTITATIVE ANALYSIS OF DRAWING TUBES MICROSTRUCTURE

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2009-05-01

    Full Text Available Final properties of forming pieces are affected by production, at first conditions of mechanical working. Application of stereology methods to statistic reconstruction of three-dimensional plastic deformed material structure by bulk forming led to detail analysis of material structure changes. The microstructure of cold drawing tubes from STN 411353 steel was analyzed. Grain boundaries orientation was measured on perpendicular and parallel section of tubes with different degree of deformation. Macroscopic deformation leads to grain boundaries deformation and these ones were compared.

  18. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  19. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    Science.gov (United States)

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  1. ASSETS MANAGEMENT - A CONCEPTUAL MODEL DECOMPOSING VALUE FOR THE CUSTOMER AND A QUANTITATIVE MODEL

    Directory of Open Access Journals (Sweden)

    Susana Nicola

    2015-03-01

    Full Text Available In this paper we describe the application of a modeling framework, the so-called Conceptual Model Decomposing Value for the Customer (CMDVC), in a footwear industry case study, to ascertain the usefulness of this approach. Value networks were used to identify the participants, the tangible and intangible deliverables, and the endogenous and exogenous assets, with the analysis of their interactions serving as the indication for an adequate value proposition. The quantitative model of benefits and sacrifices, using the Fuzzy AHP method, enables a discussion of how the CMDVC can be applied and used in the enterprise environment, and provided new relevant relations between perceived benefits (PBs).

  2. Quantitative analysis of binding sites for 9-fluoropropyl-(+)-dihydrotetrabenazine ([¹⁸F]AV-133) in a MPTP-lesioned PD mouse model.

    Science.gov (United States)

    Chao, Ko-Ting; Tsao, Hsin-Hsin; Weng, Yi-Hsin; Hsiao, Ing-Tsung; Hsieh, Chia-Ju; Wey, Shiaw-Pyng; Yen, Tzu-Chen; Kung, Mei-Ping; Lin, Kun-Ju

    2012-09-01

    [¹⁸F]AV-133 is a novel PET tracer for targeting the vesicular monoamine transporter 2 (VMAT2). The aim of this study is to characterize and quantify the loss of monoamine neurons with [¹⁸F]AV-133 in the MPTP-lesioned PD mouse model using animal PET imaging and ex vivo quantitative autoradiography (QARG). The optimal imaging time window of [¹⁸F]AV-133 was first determined in normal C57BL/6 mice (n = 3) with a 90-min dynamic scan. The reproducibility of [¹⁸F]AV-133 PET imaging was evaluated by performing a test-retest study within 1 week in the normal group (n = 6). For the MPTP-lesion studies, groups of four normal and four MPTP-treated mice [25 mg/kg once (Group A) or twice (Group B) daily for 5 days, i.p.] were used. PET imaging studies at baseline and at Day 4 post-MPTP injection were performed at the optimal time window after injection of 11.1 MBq [¹⁸F]AV-133. The specific uptake ratio (SUr) of [¹⁸F]AV-133 was calculated as [(target uptake - cerebellar uptake)/cerebellar uptake], with the cerebellum as the reference region. Ex vivo QARG and immunohistochemistry (IHC) studies with tyrosine hydroxylase antibody were carried out to confirm the abundance of dopaminergic neurons. The variability between [¹⁸F]AV-133 test-retest striatal SUr was 6.60 ± 3.61%, with less than 5% standard deviation between animals (intervariability). After MPTP lesioning the values were 0.94 ± 0.29 (-42.1%) for Group A and 0.65 ± 0.09 (-60.4%) for Group B. By QARG, specific binding of [¹⁸F]AV-133 was reduced relative to the control groups by 50.6% and 60.7% in striatum and by 30.6% and 46.4% in substantia nigra (Groups A and B, respectively). A relatively small [¹⁸F]AV-133 SUr decline was noted in the serotonin- and norepinephrine-enriched regions (7.9% and 9.4% in mid-brain). Results obtained from IHC consistently confirmed the sensitivity and selectivity of dopaminergic neuron loss after MPTP treatment. [¹⁸F]AV-133 PET SUr displayed a high
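
    The specific uptake ratio defined above is a one-line computation; a minimal illustrative sketch (the ROI values below are invented, not taken from the study):

    ```python
    def specific_uptake_ratio(target_mean, cerebellum_mean):
        """SUr = (target - cerebellum) / cerebellum, with cerebellum as reference region."""
        return (target_mean - cerebellum_mean) / cerebellum_mean

    # hypothetical ROI mean activities from a striatal and a cerebellar ROI
    print(f"SUr = {specific_uptake_ratio(target_mean=42.0, cerebellum_mean=18.0):.2f}")  # 1.33
    ```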

  3. Quantitation and gompertzian analysis of tumor growth

    DEFF Research Database (Denmark)

    Rygaard, K; Spang-Thomsen, M

    1998-01-01

    Human tumor xenografts in immune-deficient animals are used to establish tumor growth curves and for studying the effect of experimental therapy on tumor growth. In this review we describe a method for making serial measurements of tumor size in the nude mouse model as well as methods used...... to transform the experimental data into useful growth curves. A transformed Gompertz function is used as the basis for calculating relevant parameters pertaining to tumor growth and response to therapy. The calculations are facilitated by use of a computer program which performs the necessary calculations...
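
    The transformed Gompertz function itself is not reproduced in the record; a minimal sketch of one common parameterization of Gompertzian tumor growth, with invented parameter values:

    ```python
    import numpy as np

    def gompertz_volume(t, v0=50.0, a=0.3, b=0.05):
        """Gompertz growth: V(t) = V0 * exp((a/b) * (1 - exp(-b*t))); plateaus at V0*exp(a/b)."""
        return v0 * np.exp((a / b) * (1.0 - np.exp(-b * t)))

    for day in (0, 10, 20, 40, 60):
        print(f"day {day:2d}: {gompertz_volume(day):9.1f} mm^3")
    ```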

  4. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  5. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  6. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    Science.gov (United States)

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  7. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    Directory of Open Access Journals (Sweden)

    Metzenberg Stan

    2010-04-01

    Full Text Available Abstract Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  8. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are primarily located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  9. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2016-12-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are primarily located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  10. Modeling quantitative phase image formation under tilted illuminations.

    Science.gov (United States)

    Bon, Pierre; Wattellier, Benoit; Monneret, Serge

    2012-05-15

    A generalized product-of-convolution model for simulation of quantitative phase microscopy of thick heterogeneous specimens under tilted plane-wave illumination is presented. Actual simulations are checked against a much more time-consuming commercial finite-difference time-domain method. Then modeled data are compared with experimental measurements that were made with a quadriwave lateral shearing interferometer.

  11. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  12. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: To evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for the results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were the economic value of forestry resources, Contraction Factor analysis, and the definition of the extinction cost function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The proposed method may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, the economic value of the property to be protected, and the speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in forest fire containment work and the potential projection of losses, the different types of plant fuel, and the local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  13. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.
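
    The "mathematical combination" referred to is presumably the usual root-sum-of-squares of independent relative components (standard metrological practice; the record does not spell the formula out):

    $$u_{c,\mathrm{rel}}=\sqrt{\sum_i u_{i,\mathrm{rel}}^{2}},$$

    where the u_i,rel are the relative uncertainty contributions of the individual factors.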

  14. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modeled PSF kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF
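
    The contrast recovery coefficients assessed here are conventionally computed from ROI means; a minimal sketch of the hot-lesion form, with invented values (the paper's exact definition may differ):

    ```python
    def contrast_recovery(tumor_mean, background_mean, true_ratio):
        """Hot-contrast recovery coefficient: measured contrast over true contrast."""
        return (tumor_mean / background_mean - 1.0) / (true_ratio - 1.0)

    # hypothetical reconstructed ROI means for a simulated 4:1 tumour-to-liver ratio
    crc = contrast_recovery(tumor_mean=13.0, background_mean=4.0, true_ratio=4.0)
    print(f"CRC = {crc:.2f}")  # 0.75, i.e. 75% of the true contrast is recovered
    ```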

  15. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions...

  16. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to the analysis of specific failure propagation models or limited types of failure modes, or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  17. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Science.gov (United States)

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  18. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  19. Accuracy of Image Analysis in Quantitative Study of Cement Paste

    Directory of Open Access Journals (Sweden)

    Feng Shu-Xia

    2016-01-01

    Full Text Available Quantitative study of cement paste, especially blended cement paste, has been a hot and difficult issue over the years, and the technique of backscattered electron image analysis has shown unique advantages in this field. This paper compared the test results of cement hydration degree, Ca(OH)2 content and pore size distribution in pure pastes obtained by image analysis and by other methods, and then analyzed the accuracy of quantitative study by image analysis. The results showed that the image analysis technique displayed higher accuracy in quantifying cement hydration degree and Ca(OH)2 content than the non-evaporable water test and thermal analysis, respectively.

  20. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma*, Ytao; Shakiryanova*, Dinara; Vardya, Irina;

    2004-01-01

    the translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required...

  1. Insights Into Quantitative Biology: analysis of cellular adaptation

    OpenAIRE

    Agoni, Valentina

    2013-01-01

    In the last years many powerful techniques have emerged to measure protein interactions as well as gene expression. Much progress has been made since the introduction of these techniques, but little toward the quantitative analysis of the resulting data. In this paper we show how to study cellular adaptation and how to detect cellular subpopulations. Moreover, we go deeper into analyzing the dynamics of signal transduction pathways.

  2. Quantitating the subtleties of microglial morphology with fractal analysis.

    Science.gov (United States)

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.
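
    As a concrete illustration of the box-counting analysis this review focuses on, here is a minimal sketch that estimates the box-counting dimension of a binary image; the random array merely stands in for a segmented microglial silhouette, and the scale list is arbitrary:

    ```python
    import numpy as np

    def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
        """Slope of log N(s) versus log(1/s), where N(s) is the number of
        boxes of side s containing at least one foreground pixel."""
        counts = []
        for s in sizes:
            n = 0
            for i in range(0, img.shape[0], s):
                for j in range(0, img.shape[1], s):
                    if img[i:i + s, j:j + s].any():
                        n += 1
            counts.append(n)
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(0)
    toy_image = rng.random((64, 64)) > 0.7     # toy binary pattern, not real microglia
    print(f"estimated box-counting dimension: {box_counting_dimension(toy_image):.2f}")
    ```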

  3. Quantitating the Subtleties of Microglial Morphology with Fractal Analysis

    Directory of Open Access Journals (Sweden)

    Audrey eKarperien

    2013-01-01

    Full Text Available It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.

  4. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage ti

  5. Quantitative modelling in design and operation of food supply systems

    NARCIS (Netherlands)

    Beek, van P.

    2004-01-01

    During the last two decades food supply systems have attracted interest not only from food technologists but also from the field of Operations Research and Management Science. Operations Research (OR) is concerned with quantitative modelling and can be used to get insight into the optimal configuration and opera

  6. Quantitative Analysis of Complex Drug-Drug Interactions Between Repaglinide and Cyclosporin A/Gemfibrozil Using Physiologically Based Pharmacokinetic Models With In Vitro Transporter/Enzyme Inhibition Data.

    Science.gov (United States)

    Kim, Soo-Jin; Toshimoto, Kota; Yao, Yoshiaki; Yoshikado, Takashi; Sugiyama, Yuichi

    2017-09-01

    Quantitative analysis of transporter- and enzyme-mediated complex drug-drug interactions (DDIs) is challenging. Repaglinide (RPG) is transported into the liver by OATP1B1 and then is metabolized by CYP2C8 and CYP3A4. The purpose of this study was to describe the complex DDIs of RPG quantitatively based on unified physiologically based pharmacokinetic (PBPK) models using in vitro Ki values for OATP1B1, CYP3A4, and CYP2C8. Cyclosporin A (CsA) or gemfibrozil (GEM) increased the blood concentrations of RPG. The time profiles of RPG and the inhibitors were analyzed by PBPK models, considering the inhibition of OATP1B1 and CYP3A4 by CsA or OATP1B1 inhibition by GEM and its glucuronide and the mechanism-based inhibition of CYP2C8 by GEM glucuronide. RPG-CsA interaction was closely predicted using a reported in vitro Ki,OATP1B1 value in the presence of CsA preincubation. RPG-GEM interaction was underestimated compared with observed data, but the simulation was improved with the increase of fm,CYP2C8. These results based on in vitro Ki values for transport and metabolism suggest the possibility of a bottom-up approach with in vitro inhibition data for the prediction of complex DDIs using unified PBPK models and in vitro fm value of a substrate for multiple enzymes should be considered carefully for the prediction. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  7. Quantitative security analysis for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Tri Minh; Huisman, Marieke

    2013-01-01

    Quantitative theories of information flow give us an approach to relax the absolute confidentiality properties that are difficult to satisfy for many practical programs. The classical information-theoretic approaches for sequential programs, where the program is modeled as a communication channel wi

  8. Fuzzy Logic as a Computational Tool for Quantitative Modelling of Biological Systems with Uncertain Kinetic Data.

    Science.gov (United States)

    Bordon, Jure; Moskon, Miha; Zimic, Nikolaj; Mraz, Miha

    2015-01-01

    Quantitative modelling of biological systems has become an indispensable computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data that describe the system's dynamics need to be known in order to obtain relevant results with conventional modelling techniques. These data are often hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modelling approach that is able to cope with unknown kinetic data and thus produce relevant results even though kinetic data are incomplete or only vaguely defined. Moreover, the approach can be used in combination with existing state-of-the-art quantitative modelling techniques in only certain parts of the system, i.e., where kinetic data are missing. The case study of the proposed approach is performed on a model of the three-gene repressilator.
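
    A minimal sketch of the general idea, with triangular membership functions and a single hand-written rule base; none of the functions or numbers below are taken from the cited repressilator model:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def target_expression(repressor):
        """Rule base: IF repressor is LOW THEN expression is HIGH;
        IF repressor is HIGH THEN expression is LOW. Crude weighted-average defuzzification."""
        mu_low = tri(repressor, -0.6, 0.0, 0.6)    # membership of 'low repressor'
        mu_high = tri(repressor, 0.4, 1.0, 1.6)    # membership of 'high repressor'
        return (mu_low * 0.9 + mu_high * 0.1) / (mu_low + mu_high + 1e-12)

    for r in (0.0, 0.5, 1.0):
        print(f"repressor={r:.1f} -> expression ~ {target_expression(r):.2f}")
    ```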

  9. Quantitative numerical analysis of transient IR-experiments on buildings

    Science.gov (United States)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.

  10. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  11. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  12. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    Science.gov (United States)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

    The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high-speed modeling environments, which offer great potential in this regard as they allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special-purpose accelerators capable of high-speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment is designed to use an inverse method for the parameter tuning. Using an inverse method reduces the total amount of parameter space that must be searched to produce a set of parameters describing a system of good fit. Systems of good fit are determined by comparison with reference storm responses. To obtain reference storm responses we have collected data from a series of data-loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled response to the reference responses, the model parameters can be tuned to quantitatively characterize the geometry, and thus the response, of the karst system.
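
    A minimal sketch of the inverse idea described above, fitting a single parameter of a toy response model to a reference storm response; the exponential-recession stand-in and all values are illustrative, not the authors' GPGPU finite-difference code:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def storm_response(k, t):
        """Toy storm response: exponential recession with rate k."""
        return np.exp(-k * t)

    t = np.linspace(0.0, 10.0, 50)
    reference = storm_response(0.35, t)        # stand-in for data-logger measurements

    def misfit(k):
        return np.sum((storm_response(k, t) - reference) ** 2)

    fit = minimize_scalar(misfit, bounds=(0.01, 2.0), method="bounded")
    print(f"recovered recession rate: {fit.x:.3f}")  # ~0.35
    ```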

  13. Safety Quantitative Analysis Model Based on Explosion Hazard Assessment Method

    Institute of Scientific and Technical Information of China (English)

    王耀华; 王亮; 王云峰; 杨小强; 武光华

    2011-01-01

    The existing explosion hazard assessment method is not fully applicable to the safety quantitative analysis of the emergency opening system, so we put forward a safety quantitative analysis model based on the explosion hazard assessment method. The model takes into account the characteristics of the emergency opening system itself and of its aviation environment, and reasonably simplifies and transforms the existing assessment model. The re-established model analyzes both the probability of an accident and the severity of its consequences; its parameters are easy to obtain and its calculation is simple. Using the model, the hazard-degree compliance status of the emergency opening system before and after safety improvement measures were taken was calculated, showing good consistency with the actual situation. The model thus provides a quantitative analysis method for choosing the safety design scheme of the system.

  14. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350

  15. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  16. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are keys to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
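
    The record does not spell out its interaction score; a minimal sketch of the score commonly used in quantitative epistasis screens, under a multiplicative null model and with invented phenotype values:

    ```python
    def epistasis_score(w_xy, w_x, w_y):
        """epsilon = W_xy - W_x * W_y, phenotypes normalized to wild type = 1.0.
        Negative epsilon suggests an aggravating interaction, positive an alleviating one."""
        return w_xy - w_x * w_y

    # hypothetical normalized body-length phenotypes: two single and one double perturbation
    print(f"epsilon = {epistasis_score(w_xy=0.55, w_x=0.9, w_y=0.8):+.2f}")  # -0.17
    ```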

  17. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    Science.gov (United States)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunoassay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the 1st antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), one of the inflammatory cytokines. After blocking the microchannel, antigen, biotin-labeled 2nd antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. This assay enables us to determine plasma IL-6 with accuracy, high sensitivity, time savings, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
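
    A minimal sketch of the two quantitative steps implied by the abstract, calibration against standards and method comparison by linear regression; all numbers are invented:

    ```python
    import numpy as np

    # hypothetical IL-6 standards: concentration (pg/ml) versus measured signal
    conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])
    signal = np.array([0.02, 0.10, 0.19, 0.37, 0.74, 1.45])

    slope, intercept = np.polyfit(conc, signal, 1)       # calibration line
    sample_conc = (0.52 - intercept) / slope             # invert for an unknown sample
    print(f"sample IL-6 ~ {sample_conc:.1f} pg/ml")

    # method comparison: microchip versus conventional assay, R^2 of the regression
    chip = np.array([3.1, 7.8, 15.6, 24.9, 31.2])
    conventional = np.array([3.0, 8.0, 16.0, 25.0, 32.0])
    r2 = np.corrcoef(chip, conventional)[0, 1] ** 2
    print(f"R^2 = {r2:.4f}")
    ```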

  18. Reservoir Stochastic Modeling Constrained by Quantitative Geological Conceptual Patterns

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper discusses the principles of geologic constraints on reservoir stochastic modeling. By using the system science theory, two kinds of uncertainties, including random uncertainty and fuzzy uncertainty, are recognized. In order to improve the precision of stochastic modeling and reduce the uncertainty in realization, the fuzzy uncertainty should be stressed, and the "geological genesis-controlled modeling" is conducted under the guidance of a quantitative geological pattern. An example of the Pingqiao horizontal-well division of the Ansai Oilfield in the Ordos Basin is taken to expound the method of stochastic modeling.

  19. Lessons Learned from Quantitative Dynamical Modeling in Systems Biology

    Science.gov (United States)

    Bachmann, Julie; Matteson, Andrew; Schelke, Max; Kaschek, Daniel; Hug, Sabine; Kreutz, Clemens; Harms, Brian D.; Theis, Fabian J.; Klingmüller, Ursula; Timmer, Jens

    2013-01-01

    Due to the high complexity of biological data it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and increasing amount of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling using two previously established examples, for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach data generated by different measurement techniques and even in single replicates can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations in combination with a multi-start strategy based on Latin hypercube sampling outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here. PMID:24098642
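
    A minimal sketch of the winning strategy reported here, derivative-based local optimization started from Latin hypercube sample points; the toy objective (Himmelblau's function) merely stands in for an ODE-model fit criterion:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import qmc

    def objective(p):
        """Toy multi-modal objective standing in for a model-fit criterion."""
        x, y = p
        return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

    lower, upper = [-5.0, -5.0], [5.0, 5.0]
    starts = qmc.scale(qmc.LatinHypercube(d=2, seed=1).random(20), lower, upper)

    # run a local, derivative-based optimizer from each start and keep the best result
    best = min((minimize(objective, s, method="L-BFGS-B", bounds=list(zip(lower, upper)))
                for s in starts), key=lambda res: res.fun)
    print(f"best objective {best.fun:.2e} at parameters {best.x}")
    ```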

  20. Lessons learned from quantitative dynamical modeling in systems biology.

    Directory of Open Access Journals (Sweden)

    Andreas Raue

    Full Text Available Due to the high complexity of biological data it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and increasing amount of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling using two previously established examples, for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach data generated by different measurement techniques and even in single replicates can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations in combination with a multi-start strategy based on Latin hypercube sampling outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here.

  1. Quantitative comparisons of analogue models of brittle wedge dynamics

    Science.gov (United States)

    Schreurs, Guido

    2010-05-01

    Analogue model experiments are widely used to gain insights into the evolution of geological structures. In this study, we present a direct comparison of experimental results of 14 analogue modelling laboratories using prescribed set-ups. A quantitative analysis of the results will document the variability among models and will allow an appraisal of reproducibility and limits of interpretation. This has direct implications for comparisons between structures in analogue models and natural field examples. All laboratories used the same frictional analogue materials (quartz and corundum sand) and prescribed model-building techniques (sieving and levelling). Although each laboratory used its own experimental apparatus, the same type of self-adhesive foil was used to cover the base and all the walls of the experimental apparatus in order to guarantee identical boundary conditions (i.e. identical shear stresses at the base and walls). Three experimental set-ups using only brittle frictional materials were examined. In each of the three set-ups the model was shortened by a vertical wall, which moved with respect to the fixed base and the three remaining sidewalls. The minimum width of the model (dimension parallel to mobile wall) was also prescribed. In the first experimental set-up, a quartz sand wedge with a surface slope of ~20° was pushed by a mobile wall. All models conformed to the critical taper theory, maintained a stable surface slope and did not show internal deformation. In the next two experimental set-ups, a horizontal sand pack consisting of alternating quartz sand and corundum sand layers was shortened from one side by the mobile wall. In one of the set-ups a thin rigid sheet covered part of the model base and was attached to the mobile wall (i.e. a basal velocity discontinuity distant from the mobile wall). In the other set-up a basal rigid sheet was absent and the basal velocity discontinuity was located at the mobile wall. In both types of experiments

  2. [Study of infrared spectroscopy quantitative analysis method for methane gas based on data mining].

    Science.gov (United States)

    Zhang, Ai-Ju

    2013-10-01

    Monitoring of methane gas is one of the important factors affecting coal mine safety, and online real-time monitoring of methane is used for mine safety protection. To improve the accuracy of model analysis, the author uses infrared spectroscopy to study a quantitative analysis algorithm for gases. Applying data mining technology to the multi-component infrared spectroscopic quantitative analysis algorithm, it was found that a cluster-analysis partial least squares algorithm is clearly superior in accuracy to using partial least squares alone. In addition, to reduce the influence of errors in individual calibration samples on model accuracy, cluster analysis was used for data preprocessing; this denoising method was found to improve the analysis accuracy.
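
    A minimal sketch of the cluster-then-calibrate idea on synthetic spectra, using scikit-learn stand-ins (KMeans plus PLS regression) rather than the paper's own algorithm; all data are simulated:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 30))                            # synthetic absorbance spectra
    y = X[:, :5].sum(axis=1) + 0.05 * rng.normal(size=60)    # synthetic CH4 concentration

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    # calibrate one PLS model per cluster instead of a single global model
    for c in (0, 1):
        mask = labels == c
        pls = PLSRegression(n_components=5).fit(X[mask], y[mask])
        print(f"cluster {c}: n={mask.sum()}, R^2={pls.score(X[mask], y[mask]):.3f}")
    ```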

  3. Quantitative analysis of organic additive content in a polymer by ToF-SIMS with PCA

    Science.gov (United States)

    Ito, Hidemi; Kono, Teiichiro

    2008-12-01

    Quantitative analysis of organic materials by ToF-SIMS is intrinsically difficult because of their tendency to decompose under ion irradiation. In this study, we applied principal component analysis (PCA) as a means of compensating for the spectral degradation caused by this decomposition, and thus improving the accuracy of the quantitative analysis, using as models two organic additives in polystyrene of quite different composition and vulnerability to decomposition under ion irradiation. This enables the extraction of a principal component related to additive content that is independent of the decomposition. The effectiveness of this approach for quantitative analysis of organic additive content in polymers, without loss of accuracy due to spectral degradation, will be discussed.

  4. Quantitative Analysis of Quality Assessment of Mathematical Modeling

    Institute of Scientific and Technical Information of China (English)

    王浩华; 罗婷

    2012-01-01

    Based on the actual situation of the mathematical modeling contest at Hainan University, the Analytic Hierarchy Process (AHP) and dynamic programming theory were used to model and analyze the selection of mathematical modeling team members. On the basis of the principle of fair selection, the students' qualities were comprehensively evaluated and the best grouping principles were proposed.

  5. Quantitative nanoscale analysis in 3D using electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kuebel, Christian [Karlsruhe Institute of Technology, INT, 76344 Eggenstein-Leopoldshafen (Germany)]

    2011-07-01

    State-of-the-art electron tomography has been established as a powerful tool to image complex structures with nanometer resolution in 3D. STEM tomography in particular is used extensively in materials science, in areas as diverse as catalysis, semiconductor materials, and polymer composites, mainly providing qualitative information on the morphology, shape and distribution of materials. However, an increasing number of studies need quantitative information, e.g. surface area, fractal dimensions, particle distribution or porosity. A quantitative analysis is typically performed after segmenting the tomographic data, which is one of the main sources of error for the quantification. In addition to noise, systematic errors due to the missing wedge and artifacts from the reconstruction algorithm itself are responsible for these segmentation errors, and improved algorithms are needed. This presentation will provide an overview of the possibilities and limitations of quantitative nanoscale analysis by electron tomography. Using catalysts and nanocomposites as application examples, intensities and intensity variations observed for 3D volumes reconstructed by WBP and SIRT will be quantitatively compared to alternative reconstruction algorithms; implications for the quantification of electron (or X-ray) tomographic data will be discussed and illustrated for the quantification of particle size distributions, particle correlations, surface area, and fractal dimensions in 3D.
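    The kind of post-segmentation quantification mentioned here, e.g. extracting a particle size distribution from a labeled 3D volume, can be sketched with SciPy. The synthetic spheres below stand in for a segmented tomographic reconstruction; all sizes are illustrative assumptions.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)
        volume = np.zeros((64, 64, 64), dtype=bool)
        zz, yy, xx = np.indices(volume.shape)
        for _ in range(15):  # place random spherical "particles"
            c = rng.integers(8, 56, size=3)
            r = int(rng.integers(3, 7))
            volume |= ((zz - c[0])**2 + (yy - c[1])**2
                       + (xx - c[2])**2) <= r**2

        labels, n = ndimage.label(volume)  # connected-component segmentation
        voxels = ndimage.sum_labels(volume, labels, index=np.arange(1, n + 1))
        diameters = 2 * (3 * voxels / (4 * np.pi)) ** (1 / 3)
        print(f"{n} particles, mean equivalent diameter "
              f"{diameters.mean():.2f} voxels")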

  6. Quantitative Analysis Model of a Multi-Component Complex Oil Spill Source Based on Near Infrared Spectroscopy

    Institute of Scientific and Technical Information of China (English)

    谈爱玲; 毕卫红

    2012-01-01

    Near infrared spectroscopy was used for the quantitative analysis of a simulated complex oil spill source. Three light petroleum products, i.e. gasoline, diesel fuel and kerosene, were selected and mixed in different concentration proportions into 40 simulated oil spill samples, and their near infrared spectra in the range of 4 000-12 000 cm-1 were collected by a Fourier transform near infrared spectrometer. After processing the NIR spectra with different pretreatment methods, the partial least squares method was used to establish quantitative analysis models for the mixed oil spill samples. For gasoline, diesel fuel and kerosene, the second derivative method was the optimal pretreatment method; for these three oil components, modeled in the ranges of 8 501.3-7 999.8 and 6 102.1-4 597.8 cm-1, 6 549.5-4 597.8 cm-1, and 7 999.8-7 498.4 and 6 102.1-4 597.8 cm-1, respectively, the correlation coefficients R2 of the prediction models are 0.998 2, 0.990 2 and 0.993 6, and the RMSEP indicators are 0.474 7, 0.936 1 and 1.013 1, respectively. The experimental results show that near infrared spectroscopy can quantitatively determine the content of each component in the simulated mixed oil spill samples, and the method can thus provide an effective means for the quantitative detection and analysis of complex marine oil spill sources.
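    The second-derivative pretreatment followed by PLS calibration can be sketched as below, assuming synthetic NIR-like spectra; the Savitzky-Golay window, polynomial order and number of PLS components are illustrative choices, not the settings used in the study.

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        wn = np.linspace(4000, 12000, 400)            # wavenumber axis, cm-1
        conc = rng.uniform(0, 1, size=(60, 1))        # e.g. gasoline fraction
        band = np.exp(-((wn - 5800) / 150) ** 2)      # toy absorption band
        X = conc * band + rng.normal(0, 0.01, (60, 400))

        # second-derivative pretreatment (Savitzky-Golay)
        X_d2 = savgol_filter(X, window_length=15, polyorder=3, deriv=2, axis=1)

        X_tr, X_te, y_tr, y_te = train_test_split(X_d2, conc, random_state=0)
        pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
        rmsep = float(np.sqrt(np.mean((pls.predict(X_te) - y_te) ** 2)))
        print(f"R2 = {pls.score(X_te, y_te):.3f}, RMSEP = {rmsep:.4f}")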

  7. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of a foot thermal image.)

  8. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520), and the Progenesis and MaxQuant outputs are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and allows a proper number of replicates for future quantitative analyses to be defined.

  9. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520), and the Progenesis and MaxQuant outputs are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and allows a proper number of replicates for future quantitative analyses to be defined.

  10. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
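    GLCM texture parameters of the kind used here can be computed with scikit-image; the random stand-in image and the four properties chosen below are illustrative assumptions, not the exact parameters of the paper.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(0)
        image = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

        # grey-level co-occurrence matrix at distance 1, two directions
        glcm = graycomatrix(image, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        for prop in ("contrast", "homogeneity", "energy", "correlation"):
            print(prop, float(graycoprops(glcm, prop).mean()))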

  11. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were evident. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.

  12. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate the effect of seismic action quantitatively are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
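    Crossing a fragility curve with a seismic hazard curve, the core step described in this abstract, can be sketched as follows; the lognormal fragility parameters and the toy hazard curve are illustrative assumptions only.

        import numpy as np
        from scipy.stats import norm
        from scipy.integrate import trapezoid

        def fragility(pga, median=0.4, beta=0.5):
            # P(failure | PGA), lognormal fragility curve (PGA in g)
            return norm.cdf(np.log(pga / median) / beta)

        # toy hazard: annual exceedance rate lambda(pga) = k0 * pga**(-k)
        pga = np.linspace(0.01, 2.0, 500)
        rate = 1e-3 * pga ** (-2.0)

        # annual failure rate: integrate fragility against the hazard density
        density = np.abs(np.gradient(rate, pga))
        annual_failure = trapezoid(fragility(pga) * density, pga)
        print(f"annual probability of seismic failure = {annual_failure:.2e}")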

  13. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    Science.gov (United States)

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.

  14. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H;

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm

  15. Quantitative and qualitative analysis and interpretation of CT perfusion imaging.

    Science.gov (United States)

    Valdiviezo, Carolina; Ambrose, Marietta; Mehra, Vishal; Lardo, Albert C; Lima, Joao A C; George, Richard T

    2010-12-01

    Coronary artery disease (CAD) remains the leading cause of death in the United States. Rest and stress myocardial perfusion imaging has an important role in the non-invasive risk stratification of patients with CAD. However, diagnostic accuracies have been limited, which has led to the development of several myocardial perfusion imaging techniques. Among them, myocardial computed tomography perfusion imaging (CTP) is especially interesting as it has the unique capability of providing anatomic- as well as coronary stenosis-related functional data when combined with computed tomography angiography (CTA). The primary aim of this article is to review the qualitative, semi-quantitative, and quantitative analysis approaches to CTP imaging. In doing so, we will describe the image data required for each analysis and discuss the advantages and disadvantages of each approach.

  16. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for the international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical st...

  17. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott; Morris, Richard V.; Ehlmann, Bethany; Dyar, M. Darby

    2017-03-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the laser-induced breakdown spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares (PLS) regression, is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
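    The sub-model blending idea can be sketched with scikit-learn; the synthetic spectra, the split at 25 wt.% and the linear blending ramp are illustrative assumptions, not the ChemCam calibration itself.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 100))      # stand-in LIBS spectra
        y = 50 * rng.uniform(size=200)       # element concentration, wt.%
        X[:, 10] += 0.05 * y                 # toy emission-line response

        full = PLSRegression(n_components=5).fit(X, y)
        low = PLSRegression(n_components=5).fit(X[y < 25], y[y < 25])
        high = PLSRegression(n_components=5).fit(X[y >= 25], y[y >= 25])

        def blended_predict(x):
            # use the full model's estimate to weight the two sub-models
            ref = full.predict(x.reshape(1, -1)).item()
            w = float(np.clip((ref - 20) / 10, 0, 1))  # ramp from 20 to 30
            lo = low.predict(x.reshape(1, -1)).item()
            hi = high.predict(x.reshape(1, -1)).item()
            return (1 - w) * lo + w * hi

        print(blended_predict(X[0]))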

  18. Quantitative modelling in cognitive ergonomics: predicting signals passed at danger

    OpenAIRE

    Moray, Neville; Groeger, John; Stanton, Neville

    2016-01-01

    This paper shows how to combine field observations, experimental data, and mathematical modeling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example we consider a major railway accident. In 1999 a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, "black box" data, and accident and engineering reports, to construct a case history of the accident. We show how t...

  19. The correlation of contrast-enhanced ultrasound and MRI perfusion quantitative analysis in rabbit VX2 liver cancer.

    Science.gov (United States)

    Xiang, Zhiming; Liang, Qianwen; Liang, Changhong; Zhong, Guimian

    2014-12-01

    Our objective is to explore the value of contrast-enhanced ultrasound (CEUS) and MRI perfusion quantitative analysis in liver cancer, and the correlation between these two analysis methods. A rabbit VX2 liver cancer model was established in this study. CEUS with SonoVue, administered to the rabbits via the ear vein, was applied to dynamically observe and record blood perfusion and its changes in the VX2 liver cancer and surrounding tissue. MRI perfusion quantitative analysis was used to analyze the mean enhancement time (MTE) and the maximal slope of increase (MSI), which were further compared with the pathological examination results. Quantitative indicators of liver cancer from CEUS and MRI perfusion quantitative analysis were compared, and the correlation between them was analyzed by correlation analysis. The rabbit VX2 liver cancer model was successfully established. CEUS showed that the time-intensity curve of rabbit VX2 liver cancer followed a "fast in, fast out" pattern, while MRI perfusion quantitative analysis showed that the quantitative parameter MTE of tumor tissue increased and MSI decreased; the difference was statistically significant (P < 0.05). However, the quantitative parameters of the two methods were significantly positively correlated (P < 0.05). Both CEUS and MRI perfusion quantitative analysis can evaluate the blood perfusion of the liver cancer lesion and the surrounding liver parenchyma, and their quantitative parameters are correlated. The combined application of both is of importance in the early diagnosis of liver cancer.

  20. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    Science.gov (United States)

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  1. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    Science.gov (United States)

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, of which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  2. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies

    Directory of Open Access Journals (Sweden)

    Siu-Leung Chau

    2016-08-01

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, of which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  3. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, the environmental magnetic field, and measurement noise make MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. It was found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated, showing that K_vs obeys a Gaussian distribution. K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, a quantitative MMM reliability model is presented for the first time, based on improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases as the residual life ratio T decreases, and that the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
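    The stress-strength interference computation underlying such a reliability model reduces, for Gaussian variables, to a single normal probability; the means and standard deviations below are illustrative assumptions.

        from math import sqrt
        from scipy.stats import norm

        mu_s, sigma_s = 300.0, 25.0  # strength distribution (e.g. MPa)
        mu_l, sigma_l = 220.0, 30.0  # stress distribution (e.g. MPa)

        # S - L is Gaussian, so R = P(S > L) = Phi((mu_s - mu_l) / sigma)
        z = (mu_s - mu_l) / sqrt(sigma_s**2 + sigma_l**2)
        print(f"reliability degree R = {norm.cdf(z):.4f}")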

  4. Quantitative analysis for nonlinear fluorescent spectra based on edges matching

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A novel spectra-edge-matching approach is proposed for the quantitative analysis of the nonlinear fluorescence spectra of air impurities excited by a femtosecond laser. The fluorescence spectra are first denoised and compressed, both by wavelet transform, and several peak groups are then picked from each spectrum according to an intensity threshold and used to extract the spectral features through principal component analysis. It is shown that the first two principal components cover up to 98% of the total information and are sufficient for the final concentration analysis. The analysis reveals a monotonic relationship between the spectral intensity and the concentration of the air impurities, suggesting that femtosecond-laser-induced fluorescence spectroscopy, along with the proposed spectral analysis method, can become a powerful tool for monitoring environmental pollutants.
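    The wavelet-denoise-then-PCA pipeline can be sketched with PyWavelets and scikit-learn; the synthetic spectra, the db4 wavelet, the universal threshold and the two retained components are illustrative assumptions.

        import numpy as np
        import pywt
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        x = np.linspace(0, 1, 512)
        spectra = np.array([c * np.exp(-((x - 0.5) / 0.02) ** 2)
                            + rng.normal(0, 0.05, 512)
                            for c in np.linspace(0.2, 2.0, 30)])

        def denoise(s, wavelet="db4", level=4):
            coeffs = pywt.wavedec(s, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise estimate
            thr = sigma * np.sqrt(2 * np.log(s.size))       # universal threshold
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft")
                                    for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[:s.size]

        clean = np.apply_along_axis(denoise, 1, spectra)
        scores = PCA(n_components=2).fit_transform(clean)
        print(scores[:3])  # first two principal components per spectrum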

  5. Quantitative gait analysis following hemispherotomy for Rasmussen's encephalitis

    Directory of Open Access Journals (Sweden)

    Santhosh George Thomas

    2007-01-01

    Peri-insular hemispherotomy is a form of disconnective hemispherectomy involving complete disconnection of all ascending/descending and commissural connections of one hemisphere. We report the case of a seven-and-a-half-year-old child with intractable epilepsy due to Rasmussen's encephalitis who underwent peri-insular hemispherotomy and achieved complete freedom from seizures. Quantitative gait analysis with surface electromyographs was used to describe the changes in the kinematic and kinetic parameters of gait 18 months after surgery. The focus of this paper is to highlight the utility of gait analysis following hemispherotomy with a view to directing post-surgical motor training and rehabilitation.

  6. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan for Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Given the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  7. Quantitative magnetospheric models derived from spacecraft magnetometer data

    Science.gov (United States)

    Mead, G. D.; Fairfield, D. H.

    1973-01-01

    Quantitative models of the external magnetospheric field were derived by making least-squares fits to magnetic field measurements from four IMP satellites. The data were fit to a power series expansion in the solar magnetic coordinates and the solar wind-dipole tilt angle, and thus the models contain the effects of seasonal north-south asymmetries. The expansions are divergence-free, but unlike the usual scalar potential expansions, the models contain a nonzero curl representing currents distributed within the magnetosphere. Characteristics of four models are presented, representing different degrees of magnetic disturbance as determined by the range of Kp values. The latitude at the earth separating open polar cap field lines from field lines closing on the dayside is about 5 deg lower than that determined by previous theoretically-derived models. At times of high Kp, additional high latitude field lines are drawn back into the tail.

  8. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    Science.gov (United States)

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the relatively low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for the identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters that optimizes the yield of identified sites is also provided. We also provide guidelines for the bioinformatic analysis of this type of data, to assess the quality of the data and to comply with proteomic data reporting requirements.

  9. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  10. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes; high values for both traits together with high oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits, with the final number of clusters determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that perform similarly with respect to the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, while those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  11. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    Science.gov (United States)

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  12. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

    Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We used a method based on the analysis of quantitative pore features, which differs from traditional qualitative methods. We applied mathematical morphology methods such as dilation and erosion, opening and closing transformations of wood cross-sections, image repairing, noise filtering and edge detection to segment the pores from their background. The mean square errors (MSE) of the pores were then computed to describe the distribution of pores. Our experiment shows that it is easy to classify the pore features into three basic types, just as in traditional qualitative methods, but using the MSE of pores. This quantitative method improves wood identification considerably.

  13. Quantitative data analysis methods for bead-based DNA hybridization assays using generic flow cytometry platforms.

    Science.gov (United States)

    Corrie, S R; Lawrie, G A; Battersby, B J; Ford, K; Rühmann, A; Koehler, K; Sabath, D E; Trau, M

    2008-05-01

    Bead-based assays are in demand for rapid genomic and proteomic assays for both research and clinical purposes. Standard quantitative procedures addressing raw data quality and analysis are required to ensure the data are consistent and reproducible across laboratories, independent of flow platform. Quantitative procedures have been introduced spanning raw histogram analysis through to absolute target quantitation. These included models developed to estimate the absolute number of sample molecules bound per bead (Langmuir isotherm), relative quantitative comparisons (two-sided t-tests), and statistical analyses investigating the quality of raw fluorescence data. The absolute target quantitation method revealed a concentration range (below probe saturation) of Cy5-labeled synthetic cytokeratin 19 (K19) RNA of ca. 1 × 10^4 to 500 × 10^4 molecules/bead, with a binding constant of ca. 1.6 nM. Raw hybridization frequency histograms were observed to be highly reproducible across 10 triplex assay replicates, and only three assay replicates were required to distinguish overlapping peaks representing small sequence mismatches. This study provides a quantitative scheme for determining the absolute target concentration in nucleic acid hybridization reactions and the equilibrium binding constants for individual probe/target pairs. It is envisaged that such studies will form the basis of standard analytical procedures for bead-based cytometry assays to ensure reproducibility in inter- and intra-platform comparisons of data between laboratories.
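    Fitting the Langmuir isotherm used for absolute quantitation is a one-line curve fit; the synthetic titration data and starting values below are illustrative assumptions, not the measured values.

        import numpy as np
        from scipy.optimize import curve_fit

        def langmuir(c, b_max, kd):
            # molecules bound per bead at target concentration c (nM)
            return b_max * c / (kd + c)

        conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0])  # nM
        rng = np.random.default_rng(0)
        bound = langmuir(conc, 5e6, 1.6) * rng.normal(1, 0.05, conc.size)

        popt, _ = curve_fit(langmuir, conc, bound, p0=[1e6, 1.0])
        print(f"b_max = {popt[0]:.3g} molecules/bead, Kd = {popt[1]:.2f} nM")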

  14. A novel rapid quantitative analysis of drug migration on tablets using laser induced breakdown spectroscopy.

    Science.gov (United States)

    Yokoyama, Makoto; Tourigny, Martine; Moroshima, Kenji; Suzuki, Junsuke; Sakai, Miyako; Iwamoto, Kiyoshi; Takeuchi, Hirofumi

    2010-11-01

    Few reports to date have quantitatively analyzed drug migration from the interior to the surface of a tablet. In this paper, we propose a novel, rapid, quantitative analysis of drug migration in tablets using laser induced breakdown spectroscopy (LIBS). To evaluate drug migration, model tablets containing nicardipine hydrochloride as the active pharmaceutical ingredient (API) were prepared by a conventional wet granulation method. Since the color of this API is pale yellow and all excipients are white, the degree of drug migration in these model tablets can be observed by visual inspection. In order to prepare tablets with different degrees of drug migration, the temperature of the drying process after tableting was varied between 50 and 80 °C. Using these tablets, visual inspection, Fourier transform (FT)-IR mapping and LIBS analysis were carried out to evaluate drug migration. While drug migration could be observed using all methods, only LIBS analysis provided quantitative results, in which the average LIBS intensity correlated with the degree of drug migration induced by the drying temperature. Moreover, we compared the sample preparation, data analysis process and measurement time of visual inspection, FT-IR mapping and LIBS analysis. The comparison demonstrated that LIBS analysis is the simplest and fastest method for migration monitoring. From the results obtained, we conclude that LIBS analysis is one of the most useful process analytical technology (PAT) tools for solving the universal migration problem.

  15. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.

  16. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil)]; Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)]

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries, since environmental accidents affect both the environment and the communities neighboring industrial facilities. The present study aims to support and provide guidance for developing emergency plans from the data obtained in quantitative risk analyses elaborated according to Technical Standard P4.261/03 of CETESB (the Sao Paulo State Environmental Agency). It was observed during the development of the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study were the re-evaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from quantitative risk analysis for developing emergency plans. (author)

  17. Quantitative analysis of in vivo confocal microscopy images: a review.

    Science.gov (United States)

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  18. Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.

    Science.gov (United States)

    Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain

    2014-10-01

    P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular Calcein retention and a routine monolayer fluorescence assay. Intercellular and sub-cellular distributions in the expression of the p-glycoprotein transporter between parent and MDR1-transfected Madin-Darby Canine Kidney cell lines were examined. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species and, in conjunction with live cell imaging and immunofluorescence staining, may be applied to the determination of pharmacological parameters or the analysis of biopsies, providing a rapid prognostic tool.

  19. Asynchronous adaptive time step in quantitative cellular automata modeling

    Directory of Open Access Journals (Sweden)

    Sun Yan

    2004-06-01

    Background: The behaviors of cells in metazoans are context dependent, thus large-scale multi-cellular modeling is often necessary, for which cellular automata are natural candidates. Two related issues are involved in cellular automata based multi-cellular modeling: how to introduce differential equation based quantitative computing to precisely describe cellular activity, and, building on that, how to solve the heavy time consumption issue in simulation. Results: Based on a modified, language-based cellular automata system that we extended to allow ordinary differential equations in models, we introduce a method implementing an asynchronous adaptive time step in simulation that can considerably improve efficiency without a significant sacrifice of accuracy. An average speedup of 4-5 times is achieved in the given example. Conclusions: Strategies for reducing time consumption in simulation are indispensable for large-scale, quantitative multi-cellular models, because even a small 100 × 100 × 100 tissue slab contains one million cells. A distributed and adaptive time step is a practical solution in a cellular automata environment.
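    An asynchronous adaptive time step can be sketched with an event queue in which each cell advances with its own step size, smaller for faster-changing cells; the linear relaxation ODE and the step-size rule below are illustrative assumptions, not the system described in the paper.

        import heapq
        import math

        def step_size(rate, dt_min=0.01, dt_max=1.0, tol=0.05):
            # faster-changing cells get smaller steps
            return max(dt_min, min(dt_max, tol / (abs(rate) + 1e-12)))

        # each cell relaxes toward a target: dx/dt = k * (target - x)
        cells = [{"x": 0.0, "k": k, "target": 1.0} for k in (0.1, 1.0, 10.0)]
        queue = [(0.0, i) for i in range(len(cells))]  # (event time, cell id)
        heapq.heapify(queue)

        t_end = 5.0
        last_t = [0.0] * len(cells)
        while queue and queue[0][0] < t_end:
            t, i = heapq.heappop(queue)
            c, dt = cells[i], t - last_t[i]
            # exact update of the linear ODE over the elapsed interval
            c["x"] = c["target"] + (c["x"] - c["target"]) * math.exp(-c["k"] * dt)
            last_t[i] = t
            rate = c["k"] * (c["target"] - c["x"])
            heapq.heappush(queue, (t + step_size(rate), i))

        print([round(c["x"], 3) for c in cells])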

  20. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Background: Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results: Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions: The established universal papillation assay platform should be widely applicable to a

  1. Quantitative phosphoproteomic analysis using iTRAQ method.

    Science.gov (United States)

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of and reaction to developmental and environmental cues. Phosphoproteomics is useful for identifying target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We performed quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of the MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified with a Pro-Q® Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified on a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose abundance was decreased in the mapkk mutant compared with wild type.

  2. A quantitative analysis of IRAS maps of molecular clouds

    Science.gov (United States)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  3. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
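
    The following sketch illustrates the general idea of constructing a pseudo-predator signature by bootstrap-sampling a prey library and mixing the samples in known diet proportions; the prey types, signature dimensions, and the fixed n_boot are invented for illustration and do not reproduce Bromaghin's sample-size algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy prey library: rows are individual prey, columns are fatty acid
    # proportions (each signature sums to 1). Three prey types, 5 FAs.
    prey = {
        "seal":   rng.dirichlet(np.ones(5), size=30),
        "walrus": rng.dirichlet(np.ones(5), size=30),
        "whale":  rng.dirichlet(np.ones(5), size=30),
    }
    true_diet = {"seal": 0.6, "walrus": 0.3, "whale": 0.1}

    def pseudo_predator(prey, diet, n_boot=20):
        """Mix bootstrap means of prey signatures in known diet proportions."""
        sig = np.zeros(5)
        for species, prop in diet.items():
            sample = prey[species][rng.integers(0, len(prey[species]), n_boot)]
            sig += prop * sample.mean(axis=0)
        return sig / sig.sum()

    # n_boot is exactly the bootstrap sample size the abstract argues should
    # be chosen objectively rather than arbitrarily.
    print(pseudo_predator(prey, true_diet))
    ```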

  4. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
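
    FociQuant itself is freely available; as a rough illustration of what foci quantitation involves, the sketch below segments a synthetic image and reports the integrated, background-subtracted intensity per focus. The thresholding rule and test image are assumptions, not the tool's actual algorithm.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)

    # Synthetic fluorescence image: background noise plus one bright focus.
    image = rng.normal(100, 5, size=(64, 64))
    image[30:34, 30:34] += 400  # the "kinetochore" focus

    # Segment foci as pixels well above background, then label connected regions.
    mask = image > image.mean() + 5 * image.std()
    labels, n_foci = ndimage.label(mask)

    # Integrated, background-subtracted intensity per focus as a proxy for
    # local protein abundance.
    background = np.median(image)
    for i in range(1, n_foci + 1):
        focus = image[labels == i]
        print(f"focus {i}: area={focus.size} px, "
              f"integrated intensity={(focus - background).sum():.0f}")
    ```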

  5. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths were strictly dependent on the working treatment, thus providing an important analytical element for investigating ancient fabrication techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements highlighted some promising prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  6. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, widely applied in industry, are multi-degree-of-freedom nonlinear dynamic systems. Modern concepts of design and maintenance call for quantitative stability analysis. Using a trajectory-based, stability-preserving dimensional reduction, a quantitative stability analysis method for rotor systems is presented. First, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  7. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    OpenAIRE

    A. O. Domanov

    2014-01-01

    The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St-Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial chara...

  8. Quantitative analysis of sideband coupling in photoinduced force microscopy

    Science.gov (United States)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  9. Quantitative and comparative analysis of hyperspectral data fusion performance

    Institute of Scientific and Technical Information of China (English)

    王强; 张晔; 李硕; 沈毅

    2002-01-01

    Hyperspectral data fusion technique is the key to hyperspectral data processing in recent years. Many fusion methods have been proposed, but little research has been done to evaluate the performances of different data fusion methods. In order to meet the urgent need, quantitative correlation analysis (QCA) is proposed to analyse and compare the performances of different fusion methods directly from data before and after fusion. Experiment results show that the new method is effective and the results of comparison are in agreement with the results of application.

  10. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment was a major advance of modern biotechnology, successfully used in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons, since enzyme replacement is the most expensive treatment and must be administered continuously and without interruption. Since 2001, enzyme replacement therapy with Cerezyme*Genzyme has been formally introduced in Bulgaria, but after some time it was interrupted for 1-2 months, and the dose the patients received was not optimal. The aim of our work is to find a mathematical model for the quantitative evaluation of ERT in Gaucher disease. The model was implemented in the software package "Statistika 6", taking as input the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The model output enabled quantitative evaluation of the individual trends in each child's disease development and their correlations. On the basis of these results, we can recommend suitable changes in ERT.

  11. anNET: a tool for network-embedded thermodynamic analysis of quantitative metabolome data

    Directory of Open Access Journals (Sweden)

    Zamboni Nicola

    2008-04-01

    Full Text Available Abstract Background Compared to other omics techniques, quantitative metabolomics is still in its infancy. Complex sample preparation and analytical procedures render exact quantification extremely difficult. Furthermore, not only the actual measurement but also the subsequent interpretation of quantitative metabolome data to obtain mechanistic insights still lags behind current expectations. Recently, the method of network-embedded thermodynamic (NET) analysis was introduced to address some of these open issues. Building upon principles of thermodynamics, this method allows for a quality check of measured metabolite concentrations and makes it possible to spot metabolic reactions where active regulation potentially controls metabolic flux. So far, however, widespread application of NET analysis in metabolomics labs was hindered by the absence of suitable software. Results We have developed in Matlab a generalized software tool called 'anNET' that affords a user-friendly implementation of the NET analysis algorithm. anNET supports the analysis of any metabolic network for which a stoichiometric model can be compiled. The model size can span from a single reaction to a complete genome-wide network reconstruction including compartments. anNET can (i) test quantitative data sets for thermodynamic consistency, (ii) predict metabolite concentrations beyond the actually measured data, (iii) identify putative sites of active regulation in the metabolic reaction network, and (iv) help in localizing errors in data sets that were found to be thermodynamically infeasible. We demonstrate the application of anNET with three published Escherichia coli metabolome data sets. Conclusion Our user-friendly and generalized implementation of the NET analysis method in the software anNET allows users to rapidly integrate quantitative metabolome data obtained from virtually any organism. We envision that use of anNET in labs working on quantitative metabolomics will provide the
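
    A minimal sketch of the thermodynamic consistency test underlying NET analysis: a reaction carrying flux in the assumed direction must have a negative transformed Gibbs energy. The example reaction, standard Gibbs energy, and concentrations below are hypothetical; anNET performs this check network-wide in Matlab.

    ```python
    import numpy as np

    R, T = 8.314e-3, 298.15  # kJ/(mol*K), K

    def reaction_dG(dG0, stoich, conc):
        """dG' = dG'0 + RT * sum(n_i * ln c_i) for metabolite concentrations (M)."""
        return dG0 + R * T * sum(n * np.log(conc[m]) for m, n in stoich.items())

    # Hypothetical example: phosphoglucose isomerase, G6P -> F6P.
    conc = {"G6P": 1.2e-3, "F6P": 0.3e-3}       # measured concentrations, M
    stoich = {"G6P": -1, "F6P": +1}             # substrate -1, product +1
    dG = reaction_dG(2.5, stoich, conc)         # dG'0 ~ +2.5 kJ/mol (assumed)
    print(f"dG' = {dG:.2f} kJ/mol ->", "feasible" if dG < 0 else "infeasible")
    ```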

  12. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    Science.gov (United States)

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels with such nonuniform characteristics are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels because the existing QRA models for road tunnels are inapplicable to assess the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogenous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. This article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.

  13. Quantitative modeling of a gene's expression from its intergenic sequence.

    Directory of Open Access Journals (Sweden)

    Md Abul Hassan Samee

    2014-03-01

    Full Text Available Modeling a gene's expression from its intergenic locus and trans-regulatory context is a fundamental goal in computational biology. Owing to the distributed nature of cis-regulatory information and the poorly understood mechanisms that integrate such information, gene locus modeling is a more challenging task than modeling individual enhancers. Here we report the first quantitative model of a gene's expression pattern as a function of its locus. We model the expression readout of a locus in two tiers: (1) combinatorial regulation by transcription factors bound to each enhancer is predicted by a thermodynamics-based model, and (2) independent contributions from multiple enhancers are linearly combined to fit the gene expression pattern. The model does not require any prior knowledge about enhancers contributing toward a gene's expression. We demonstrate that the model captures the complex multi-domain expression patterns of anterior-posterior patterning genes in the early Drosophila embryo. Altogether, we model the expression patterns of 27 genes; these include several gap genes, pair-rule genes, and anterior, posterior, trunk, and terminal genes. We find that the model-selected enhancers for each gene overlap strongly with its experimentally characterized enhancers. Our findings also suggest the presence of sequence-segments in the locus that would contribute ectopic expression patterns and hence were "shut down" by the model. We applied our model to identify the transcription factors responsible for forming the stripe boundaries of the studied genes. The resulting network of regulatory interactions exhibits a high level of agreement with known regulatory influences on the target genes. Finally, we analyzed whether and why our assumption of enhancer independence was necessary for the genes we studied. We found a deterioration of expression when binding sites in one enhancer were allowed to influence the readout of another enhancer. Thus, interference
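
    The second tier, linearly combining enhancer readouts, can be illustrated with a non-negative least-squares fit; the per-enhancer predictions here are random stand-ins for the thermodynamic model's output, and the weights and noise level are invented.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical per-enhancer expression predictions along 50 A-P positions
    # (e.g., from a thermodynamic model of TF binding), rows = positions.
    rng = np.random.default_rng(2)
    n_pos, n_enh = 50, 4
    enhancer_readout = rng.random((n_pos, n_enh))

    # Observed gene expression pattern to be explained.
    true_w = np.array([1.0, 0.0, 0.5, 0.0])      # enhancers 2 and 4 "shut down"
    observed = enhancer_readout @ true_w + rng.normal(0, 0.02, n_pos)

    # Tier 2 of the model: non-negative weights combining enhancer contributions;
    # near-zero weights correspond to enhancers the fit effectively shuts down.
    weights, residual = nnls(enhancer_readout, observed)
    print("fitted weights:", np.round(weights, 2), "residual:", round(residual, 3))
    ```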

  14. Quantitative characterization of surface topography using spectral analysis

    Science.gov (United States)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
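
    A minimal sketch of estimating a 1D PSD from a measured line scan; note that normalization conventions vary between papers, so the prefactor below is one common choice rather than the only one.

    ```python
    import numpy as np

    # Synthetic 1D line scan: rough profile from a random walk, heights in m.
    N, dx = 4096, 1e-9                       # points, sampling interval (m)
    rng = np.random.default_rng(3)
    h = np.cumsum(rng.normal(0, 1e-10, N))
    h -= h.mean()

    # One-sided 1D PSD in wavevector q; with this normalization the integral
    # of C over q approximates the height variance (check the convention of
    # whatever paper you compare against).
    H = np.fft.rfft(h)
    q = 2 * np.pi * np.fft.rfftfreq(N, d=dx)      # wavevector (1/m)
    C = (dx / N) * np.abs(H) ** 2 / np.pi         # PSD estimate
    print(q[1:4], C[1:4])
    ```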

  15. Models of Economic Analysis

    OpenAIRE

    Adrian Ioana; Tiberiu Socaciu

    2013-01-01

    The article presents specific aspects of management and models for economic analysis. We present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, and psychological analysis. We also present the main objects of analysis: the technological activity analysis of a company, the analysis of the production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...

  16. Preoperative Prediction of Microvascular Invasion in Hepatocellular Carcinoma using Quantitative Image Analysis.

    Science.gov (United States)

    Zheng, Jian; Chakraborty, Jayasree; Chapman, William C; Gerst, Scott; Gonen, Mithat; Pak, Linda M; Jarnagin, William R; DeMatteo, Ronald P; Do, Richard Kg; Simpson, Amber L; Allen, Peter J; Balachandran, Vinod P; D'Angelica, Michael I; Kingham, T Peter; Vachharajani, Neeta

    2017-09-20

    Microvascular invasion (MVI) is a significant risk factor for early recurrence after resection or transplantation for hepatocellular carcinoma (HCC). Knowledge of MVI status would help guide treatment recommendations but is generally obtained only after surgery. This study aims to predict MVI preoperatively using quantitative image analysis. From 2 institutions, 120 patients who underwent resection of HCC between 2003 and 2015 were included. The largest tumor from preoperative CT was subjected to quantitative image analysis, which uses an automated computer algorithm to capture regional variation in CT enhancement patterns. Quantitative imaging features from automatic analysis, qualitative radiographic descriptors from 2 radiologists, and preoperative clinical variables were included in multivariate analysis to predict histologic MVI. Histologic MVI was identified in 19 (37%) patients with tumors ≤ 5 cm and 34 (49%) patients with tumors > 5 cm. Among patients with ≤ 5 cm tumors, none of the clinical findings or radiographic descriptors were associated with MVI; however, a quantitative feature based on the angle co-occurrence matrix predicted MVI with an area under the curve (AUC) of 0.80, positive predictive value (PPV) of 63%, and negative predictive value (NPV) of 85%. In patients with > 5 cm tumors, higher α-fetoprotein (AFP) level, larger tumor size, and viral hepatitis history were associated with MVI, whereas radiographic descriptors were not. However, a multivariate model combining AFP, tumor size, hepatitis status, and a quantitative feature based on the local binary pattern predicted MVI with an AUC of 0.88, PPV of 72%, and NPV of 96%. This study reveals the potential importance of quantitative image analysis as a predictor of MVI. Copyright © 2017. Published by Elsevier Inc.

  17. Quantitative chemical analysis of ocular melanosomes in the TEM.

    Science.gov (United States)

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses. The material was mounted on Al grids and samples were analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with an ultrathin window. Melanosomes consist of C and O as main components, with mole fractions of about 90 and 3-10 at.%, respectively, and small mole fractions, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl and Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction of less than 0.1 at.% was found; this gives the melanin its paramagnetic properties. Its mole fraction is, however, close to or below the minimum detectable mass fraction of the equipment used. Only in the human eye, and only in the retinal pigment epithelium (RPE), were the mole fractions of Zn (0.1 at.% or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eye the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The obtained results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium (RPE) of the three different species. The results of the chemical analysis are discussed by means of mole fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation, indicating that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases. The chemical analysis also

  18. Quantitative T2 combined with texture analysis of nuclear magnetic resonance images identify different degrees of muscle involvement in three mouse models of muscle dystrophy: mdx, Largemyd and mdx/Largemyd.

    Directory of Open Access Journals (Sweden)

    Aurea B Martins-Bach

    Full Text Available Quantitative nuclear magnetic resonance imaging (MRI) has been considered a promising non-invasive tool for monitoring therapeutic trials in small-size mouse models of muscular dystrophies. Here, we combined MRI (anatomical images and transverse relaxation time constant, T2, measurements) with texture analyses in the study of four mouse strains covering a wide range of dystrophic phenotypes. Two still unexplored mouse models of muscular dystrophies were analyzed: the severely affected Largemyd mouse and the recently generated, worst-affected double mutant mdx/Largemyd mouse, as compared to the mildly affected mdx and normal mice. The results were compared to histopathological findings. MRI showed increased intermuscular fat and higher muscle T2 in the three dystrophic mouse models when compared to the wild-type mice (T2: mdx/Largemyd: 37.6±2.8 ms; mdx: 35.2±4.5 ms; Largemyd: 36.6±4.0 ms; wild-type: 29.1±1.8 ms; p<0.05), in addition to higher muscle T2 in the mdx/Largemyd mice when compared to mdx (p<0.05). The areas with increased muscle T2 in the MRI correlated spatially with the identified histopathological alterations such as necrosis, inflammation, degeneration and regeneration foci. Nevertheless, muscle T2 values were not correlated with the severity of the phenotype in the 3 dystrophic mouse strains, since the severely affected Largemyd showed values similar to both the mild mdx and worst-affected mdx/Largemyd lineages. On the other hand, all studied mouse strains could be unambiguously identified with texture analysis, which reflected the observed differences in the distribution of signals in muscle MRI. Thus, combined T2 intensity maps and texture analysis is a powerful approach for the characterization and differentiation of dystrophic muscles with diverse genotypes and phenotypes. These new findings provide important noninvasive tools in the evaluation of the efficacy of new therapies, and most importantly, can be directly applied in human

  19. Quantitative T2 Combined with Texture Analysis of Nuclear Magnetic Resonance Images Identify Different Degrees of Muscle Involvement in Three Mouse Models of Muscle Dystrophy: mdx, Largemyd and mdx/Largemyd

    Science.gov (United States)

    Martins-Bach, Aurea B.; Malheiros, Jackeline; Matot, Béatrice; Martins, Poliana C. M.; Almeida, Camila F.; Caldeira, Waldir; Ribeiro, Alberto F.; Loureiro de Sousa, Paulo; Azzabou, Noura; Tannús, Alberto; Carlier, Pierre G.; Vainzof, Mariz

    2015-01-01

    Quantitative nuclear magnetic resonance imaging (MRI) has been considered a promising non-invasive tool for monitoring therapeutic trials in small size mouse models of muscular dystrophies. Here, we combined MRI (anatomical images and transverse relaxation time constant—T2—measurements) to texture analyses in the study of four mouse strains covering a wide range of dystrophic phenotypes. Two still unexplored mouse models of muscular dystrophies were analyzed: The severely affected Largemyd mouse and the recently generated, worst-affected double mutant mdx/Largemyd mouse, as compared to the mildly affected mdx and normal mice. The results were compared to histopathological findings. MRI showed increased intermuscular fat and higher muscle T2 in the three dystrophic mouse models when compared to the wild-type mice (T2: mdx/Largemyd: 37.6±2.8 ms; mdx: 35.2±4.5 ms; Largemyd: 36.6±4.0 ms; wild-type: 29.1±1.8 ms, p<0.05), in addition to higher muscle T2 in the mdx/Largemyd mice when compared to mdx (p<0.05). The areas with increased muscle T2 in the MRI correlated spatially with the identified histopathological alterations such as necrosis, inflammation, degeneration and regeneration foci. Nevertheless, muscle T2 values were not correlated with the severity of the phenotype in the 3 dystrophic mouse strains, since the severely affected Largemyd showed values similar to both the mild mdx and worst-affected mdx/Largemyd lineages. On the other hand, all studied mouse strains could be unambiguously identified with texture analysis, which reflected the observed differences in the distribution of signals in muscle MRI. Thus, combined T2 intensity maps and texture analysis is a powerful approach for the characterization and differentiation of dystrophic muscles with diverse genotypes and phenotypes. These new findings provide important noninvasive tools in the evaluation of the efficacy of new therapies, and most importantly, can be directly applied in human translational research.
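
    As a small illustration of how a muscle T2 value like those above is obtained, the sketch below fits a monoexponential decay to simulated multi-echo signals; the echo times, noise level, and log-linear fitting shortcut are assumptions for illustration.

    ```python
    import numpy as np

    # Simulated multi-echo spin-echo signal from a dystrophic muscle voxel:
    # S(TE) = S0 * exp(-TE / T2), T2 in ms (values here are illustrative).
    TE = np.array([10., 20., 30., 40., 60., 80.])      # echo times, ms
    T2_true, S0 = 37.6, 1000.0
    rng = np.random.default_rng(4)
    S = S0 * np.exp(-TE / T2_true) * (1 + rng.normal(0, 0.01, TE.size))

    # Log-linear least squares: ln S = ln S0 - TE/T2.
    slope, intercept = np.polyfit(TE, np.log(S), 1)
    print(f"fitted T2 = {-1 / slope:.1f} ms (true {T2_true} ms)")
    ```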

  20. Modeling Error in Quantitative Macro-Comparative Research

    Directory of Open Access Journals (Sweden)

    Salvatore J. Babones

    2015-08-01

    Full Text Available Much quantitative macro-comparative research (QMCR) relies on a common set of published data sources to answer similar research questions using a limited number of statistical tools. Since all researchers have access to much the same data, one might expect quick convergence of opinion on most topics. In reality, of course, differences of opinion abound and persist. Many of these differences can be traced, implicitly or explicitly, to the different ways researchers choose to model error in their analyses. Much careful attention has been paid in the political science literature to the error structures characteristic of time-series cross-sectional (TSCS) data, but much less attention has been paid to the modeling of error in broadly cross-national research involving large panels of countries observed at limited numbers of time points. Here, and especially in the sociology literature, multilevel modeling has become a hegemonic – but often poorly understood – research tool. I argue that widely-used types of multilevel models, commonly known as fixed effects models (FEMs) and random effects models (REMs), can produce wildly spurious results when applied to trended data due to mis-specification of error. I suggest that in most commonly-encountered scenarios, difference models are more appropriate for use in QMCR.
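
    A compact numpy illustration of the article's core warning (not of Babones's actual models): two causally unrelated trended series correlate strongly in levels, while a difference model removes the spurious association.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 50

    # Two independent random walks: trended but causally unrelated series.
    x = np.cumsum(rng.normal(0, 1, n))
    y = np.cumsum(rng.normal(0, 1, n))

    # Levels regression: trends alone can produce a large, spurious correlation.
    r_levels = np.corrcoef(x, y)[0, 1]

    # Difference model: correlating period-to-period changes removes the trend
    # and recovers the (null) underlying relationship.
    r_diff = np.corrcoef(np.diff(x), np.diff(y))[0, 1]
    print(f"levels r = {r_levels:.2f}, differences r = {r_diff:.2f}")
    ```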

  1. A quantitative model for integrating landscape evolution and soil formation

    Science.gov (United States)

    Vanwalleghem, T.; Stockmann, U.; Minasny, B.; McBratney, Alex B.

    2013-06-01

    Landscape evolution is closely related to soil formation. Quantitative modeling of the dynamics of soils and landscapes should therefore be integrated. This paper presents a model, named Model for Integrated Landscape Evolution and Soil Development (MILESD), which describes the interaction between pedogenetic and geomorphic processes. This mechanistic model includes the most significant soil formation processes, ranging from weathering to clay translocation, and combines these with the lateral redistribution of soil particles through erosion and deposition. The model is spatially explicit and simulates the vertical variation in soil horizon depth as well as basic soil properties such as texture and organic matter content. In addition, sediment export and its properties are recorded. This model is applied to a 6.25 km2 area in the Werrikimbe National Park, Australia, simulating soil development over a period of 60,000 years. Comparison with field observations shows how the model accurately predicts trends in total soil thickness along a catena. Soil texture and bulk density are predicted reasonably well, with errors of the order of 10%; however, field observations show a much higher organic carbon content than predicted. At the landscape scale, different scenarios with varying erosion intensity result in only small changes of landscape-averaged soil thickness, while the response of the total organic carbon stored in the system is larger. Rates of sediment export show a highly nonlinear response to soil development stage and the presence of a threshold, corresponding to the depletion of the soil reservoir, beyond which sediment export drops significantly.

  2. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K, obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  3. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that can capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while five subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.
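
    A minimal sketch of one standard way to quantify such recordings: velocity-threshold saccade detection on a synthetic EOG trace. The sampling rate, noise level, and 30 deg/s threshold are assumptions, not parameters from the study.

    ```python
    import numpy as np

    # Synthetic horizontal EOG trace (degrees) sampled at 250 Hz: fixation,
    # a rapid saccade at t = 1 s, then fixation at a new position.
    fs = 250.0
    t = np.arange(0, 2, 1 / fs)
    pos = np.where(t < 1.0, 0.0, 10.0)
    pos = pos + np.random.default_rng(6).normal(0, 0.02, t.size)

    # Velocity-threshold saccade detection (a common heuristic).
    vel = np.gradient(pos, 1 / fs)
    saccade_samples = np.abs(vel) > 30.0
    onsets = np.flatnonzero(np.diff(saccade_samples.astype(int)) == 1)
    print("saccade onsets (s):", t[onsets])
    ```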

  4. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    Science.gov (United States)

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2 cases). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  5. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  6. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attributes of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was solved with the help of the Rietveld QPA method is also described. This method will allow forensic investigators to acquire detailed information from material evidence, which could point out the direction for case detection and court proceedings.
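
    Once the Rietveld scale factors are refined, phase weight fractions follow from the standard Hill and Howard relation; the sketch below applies it with illustrative (not measured) numbers for a potassium nitrate/sulfur mixture.

    ```python
    def rietveld_weight_fractions(phases):
        """Hill & Howard relation: W_i = S_i*(Z*M*V)_i / sum_j S_j*(Z*M*V)_j,
        where S is the refined Rietveld scale factor, Z the formula units per
        cell, M the formula mass, and V the unit-cell volume."""
        zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
        total = sum(zmv.values())
        return {name: val / total for name, val in zmv.items()}

    # Illustrative refinement output: (scale, Z, M in g/mol, V in A^3).
    phases = {
        "KNO3":   (1.2e-6, 4, 101.10, 484.2),
        "sulfur": (0.8e-6, 16, 256.50, 3299.4),
    }
    for name, w in rietveld_weight_fractions(phases).items():
        print(f"{name}: {100 * w:.1f} wt%")
    ```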

  7. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    Science.gov (United States)

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out with a 5% SE-30 on Chromosorb AW DMCS, 2 m x 3 mm i.d., glass column at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard is di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min, respectively. This method has a recovery of 98.62%-100.77%, and the coefficients of variation of the analysis of butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57%, respectively. All coefficients of linear correlation were higher than 0.999.
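
    A small sketch of the internal-standard quantitation implied above: a response factor is determined from a calibration mixture and then applied to the unknown. All peak areas and masses are invented for illustration.

    ```python
    def amount_by_internal_standard(area_analyte, area_istd, m_istd, rf):
        """Internal-standard quantitation: the analyte amount is the peak-area
        ratio times the added internal-standard mass, corrected by the response
        factor rf determined from a calibration mixture."""
        return (area_analyte / area_istd) * m_istd / rf

    # Hypothetical calibration: known masses give the response factor
    # rf = (A_analyte / A_istd) * (m_istd / m_analyte).
    rf = (5230 / 4980) * (10.0 / 10.0)

    # Unknown sample spiked with 10.0 mg di-n-butyl sebacate.
    m_butachlor = amount_by_internal_standard(6110, 5020, 10.0, rf)
    print(f"butachlor found: {m_butachlor:.2f} mg")
    ```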

  8. Qualitative versus quantitative radiographic analysis of foot deformities in children with hemiplegic cerebral palsy.

    Science.gov (United States)

    Westberry, David E; Davids, Jon R; Roush, Thomas F; Pugh, Linda I

    2008-01-01

    Qualitative assessments of standing plain radiographs are frequently used to determine treatment strategies and assess outcomes for the management of a wide range of foot and ankle conditions in children. A quantitative technique for such analyses would presumably be more precise and reliable. The goal of this study was to compare qualitative and quantitative techniques for the assessment of plain radiographs of the foot and ankle in children with hemiplegic type cerebral palsy (CP). Standing anteroposterior and lateral radiographs of the foot and ankle of the involved side for 49 children with hemiplegic CP were analyzed qualitatively by 2 pediatric orthopaedists, based upon a 3-segment (hindfoot, midfoot, and forefoot) foot model. Quantitative assessment of the same radiographs was performed by 2 examiners, using 6 radiographic measurements developed to describe the alignment of the foot based upon the same 3-segment model. Intraobserver and interobserver reliability was determined for both the qualitative and the quantitative techniques. The qualitative and quantitative techniques were compared to determine agreement. The qualitative technique demonstrated poor-to-fair interobserver reliability (percent agreement range, 23%-31%; weighted kappa range, 0.291-0.568). The quantitative technique demonstrated good-to-excellent intraobserver (correlation coefficient range, 0.81-0.99) and interobserver (correlation coefficient range, 0.81-0.97) reliability. Percent agreement between the quantitative and the qualitative techniques for the assessment of foot segmental alignment for each examiner ranged from 22.2% to 100% (mean agreement for examiner 1 was 51% [correlation coefficient range, 0.04-0.48]; mean agreement for examiner 2 was 65.3% [correlation coefficient range, 0.22-0.85]). Percent agreement between the quantitative technique and both observers ranged from 11.1% to 83.3% (mean agreement was 36.7% [correlation coefficient range, 0.17-0.94]). Reliable

  9. Quantitative Analysis of Moisture Effect on Black Soil Reflectance

    Institute of Scientific and Technical Information of China (English)

    LIU Huan-Jun; ZHANG Yuan-Zhi; ZHANG Xin-Le; ZHANG Bai; SONG Kai-Shan; WANG Zong-Ming; TANG Na

    2009-01-01

    Several studies have demonstrated that soil reflectance decreases with increasing soil moisture content, or increases when the soil moisture reaches a certain content; however, there are few analyses of the quantitative relationship between soil reflectance and moisture, especially in the case of black soils in northeast China. A new moisture adjusting method was developed to obtain soil reflectance with a smaller moisture interval to describe the quantitative relationship between soil reflectance and moisture. For soil samples with moisture contents ranging from air-dry to saturated, the changes in soil reflectance with soil moisture can be depicted using a cubic equation. Both the moisture threshold (MT) and moisture inflexion (MI) of soil reflectance can also be determined by the equation. When the moisture range was smaller than MT, soil reflectance could be simulated with a linear model. However, for samples with different soil organic matter (OM) contents, the parameters of the linear model varied regularly with the OM content. Based on their relationship, soil moisture can be estimated from soil reflectance in the black soil region.
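
    The cubic relationship and its threshold/inflexion points can be illustrated as follows; the moisture-reflectance values are invented, and MT/MI are located here as the stationary points of the fitted cubic, which is an assumption about how the original equation was analyzed.

    ```python
    import numpy as np

    # Illustrative reflectance (one band) vs. gravimetric moisture content:
    # reflectance falls with wetting, then rises again near saturation.
    moisture = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40])
    reflect  = np.array([0.28, 0.24, 0.20, 0.17, 0.15, 0.14, 0.15, 0.17])

    # Cubic model R(m) = a*m^3 + b*m^2 + c*m + d, as in the abstract.
    a, b, c, d = np.polyfit(moisture, reflect, 3)

    # Candidate threshold/inflexion moistures follow from dR/dm = 0, i.e. the
    # real roots of 3a*m^2 + 2b*m + c = 0.
    roots = np.roots([3 * a, 2 * b, c])
    print("stationary points (moisture):", np.round(roots[np.isreal(roots)].real, 3))
    ```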

  10. Issues in qualitative and quantitative risk analysis for developmental toxicology.

    Science.gov (United States)

    Kimmel, C A; Gaylor, D W

    1988-03-01

    The qualitative and quantitative evaluation of risk in developmental toxicology has been discussed in several recent publications. A number of issues still are to be resolved in this area. The qualitative evaluation and interpretation of end points in developmental toxicology depends on an understanding of the biological events leading to the end points observed, the relationships among end points, and their relationship to dose and to maternal toxicity. The interpretation of these end points is also affected by the statistical power of the experiments used for detecting the various end points observed. The quantitative risk assessment attempts to estimate human risk for developmental toxicity as a function of dose. The current approach is to apply safety (uncertainty) factors to the no observed effect level (NOEL). An alternative presented and discussed here is to model the experimental data and apply a safety factor to an estimated risk level to achieve an "acceptable" level of risk. In cases where the dose-response curves upward, this approach provides a conservative estimate of risk. This procedure does not preclude the existence of a threshold dose. More research is needed to develop appropriate dose-response models that can provide better estimates for low-dose extrapolation of developmental effects.
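
    A minimal sketch of the modeling alternative described above: assume a fitted dose-response model, invert it at an "acceptable" risk level, and apply a safety factor. The one-hit model form, slope, risk level, and factor of 10 are all illustrative assumptions.

    ```python
    import numpy as np

    # Illustrative one-hit dose-response model for extra risk over background:
    #   P(d) = 1 - exp(-q * d)
    # The slope q stands in for a fit to experimental data.
    q = 0.02                      # per mg/kg/day, hypothetical
    acceptable_risk = 1e-4

    # Invert the model: d = -ln(1 - risk) / q; at low dose this is nearly
    # linear, so the estimate is conservative for upward-curving responses.
    d_at_risk = -np.log1p(-acceptable_risk) / q
    safety_factor = 10.0
    print(f"dose at 1e-4 extra risk: {d_at_risk:.4f} mg/kg/day; "
          f"guidance value: {d_at_risk / safety_factor:.5f} mg/kg/day")
    ```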

  11. Quantitative Raman spectroscopy for the analysis of carrot bioactives.

    Science.gov (United States)

    Killeen, Daniel P; Sansom, Catherine E; Lill, Ross E; Eason, Jocelyn R; Gordon, Keith C; Perry, Nigel B

    2013-03-20

    Rapid quantitative near-infrared Fourier transform Raman analyses of the key phytonutrients in carrots, polyacetylenes and carotenoids, are reported here for the first time. Solvent extracts of 31 carrot lines were analyzed for these phytonutrients by conventional methods, polyacetylenes by GC-FID and carotenoids by visible spectrophotometry. Carotenoid concentrations were 0-5586 μg g(-1) dry weight (DW). Polyacetylene concentrations were 74-4846 μg g(-1) DW, highest in wild carrots. The polyacetylenes were falcarinol, 6-1237 μg g(-1) DW; falcarindiol, 42-3475 μg g(-1) DW; and falcarindiol 3-acetate, 27-649 μg g(-1) DW. Strong Raman bands for carotenoids gave good correlation to results by visible spectrophotometry. A chemometric model capable of quantitating carotenoids from Raman data was developed. A classification model for rapidly distinguishing carrots with high and low polyacetylene (limit of detection = 1400 μg g(-1)) concentrations based on Raman spectral intensity in the region of 2250 cm(-1) was produced.
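
    A common way to build such a chemometric calibration is partial least squares (PLS) regression; the sketch below uses synthetic "spectra" with a Gaussian carotenoid band, an assumption for illustration rather than the authors' actual model.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(7)

    # Toy "Raman spectra": 40 samples x 200 wavenumber channels, with the
    # carotenoid signal concentrated in a band around channel 100.
    n, p = 40, 200
    conc = rng.uniform(0, 5586, n)                 # carotenoid, ug/g DW
    band = np.exp(-0.5 * ((np.arange(p) - 100) / 5) ** 2)
    X = np.outer(conc, band) + rng.normal(0, 10, (n, p))

    # Chemometric calibration: PLS maps full spectra to concentrations.
    pls = PLSRegression(n_components=3).fit(X[:30], conc[:30])
    pred = pls.predict(X[30:]).ravel()
    rmsep = np.sqrt(np.mean((pred - conc[30:]) ** 2))
    print(f"RMSEP = {rmsep:.0f} ug/g DW")
    ```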

  12. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results point to a significant seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
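
    One of the morphometric tools mentioned, the hypsometric integral, reduces to a one-line formula; the two example catchments below are hypothetical.

    ```python
    import numpy as np

    def hypsometric_integral(elevations):
        """HI = (mean - min) / (max - min); values near 1 indicate youthful,
        uplift-dominated relief, values near 0 a strongly eroded landscape."""
        z = np.asarray(elevations, dtype=float)
        return (z.mean() - z.min()) / (z.max() - z.min())

    # Illustrative DEM elevations (m) for two hypothetical catchments.
    uplifted = np.array([200, 450, 480, 500, 510, 520, 530, 540])
    eroded   = np.array([200, 210, 215, 220, 230, 260, 300, 540])
    print(f"HI uplifted: {hypsometric_integral(uplifted):.2f}, "
          f"HI eroded: {hypsometric_integral(eroded):.2f}")
    ```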

  13. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.

  14. Quantitative model studies for interfaces in organic electronic devices

    Science.gov (United States)

    Gottfried, J. Michael

    2016-11-01

    In organic light-emitting diodes and similar devices, organic semiconductors are typically contacted by metal electrodes. Because the resulting metal/organic interfaces have a large impact on the performance of these devices, their quantitative understanding is indispensable for the further rational development of organic electronics. A study by Kröger et al (2016 New J. Phys. 18 113022) of an important single-crystal based model interface provides detailed insight into its geometric and electronic structure and delivers valuable benchmark data for computational studies. In view of the differences between typical surface-science model systems and real devices, a ‘materials gap’ is identified that needs to be addressed by future research to make the knowledge obtained from fundamental studies even more beneficial for real-world applications.

  15. Quantitative identification of technological discontinuities using simulation modeling

    CERN Document Server

    Park, Hyunseok

    2016-01-01

    The aim of this paper is to develop and test metrics to quantitatively identify technological discontinuities in a knowledge network. We developed five metrics based on innovation theories and tested the metrics on a simulation-model-based knowledge network with a hypothetically designed discontinuity. The designed discontinuity is modeled as a node which combines two different knowledge streams and whose knowledge is dominantly persistent in the knowledge network. The performance of the proposed metrics was evaluated by how well the metrics can distinguish the designed discontinuity from other nodes in the knowledge network. The simulation results show that the persistence time combined with the number of converging main paths provides the best performance in identifying the designed discontinuity: the designed discontinuity was identified as one of the top 3 patents with 96~99% probability by Metric 5 and, depending on the size of the domain, this is 12~34% better than the performance of the second-best metric. Beyond the simulation ...

  16. A Quantitative Theory Model of a Photobleaching Mechanism

    Institute of Scientific and Technical Information of China (English)

    陈同生; 曾绍群; 周炜; 骆清铭

    2003-01-01

    A photobleaching model, combining D-P (dye-photon interaction) and D-O (dye-oxygen oxidative reaction) photobleaching theory, is proposed. The quantitative power dependences of photobleaching rates with both one- and two-photon excitation (1PE and TPE) are obtained. This photobleaching model can be used to elucidate our own and other published experimental results well. Experimental studies of the photobleaching rates for rhodamine B with TPE under non-saturating conditions reveal that the power dependence of the photobleaching rate increases with increasing dye concentration, and that the photobleaching rate of a single molecule increases as the second power of the excitation intensity, which is different from the high-order (>3) nonlinear dependence of ensemble molecules.
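
    The reported power dependences are exponents of a power law, which can be extracted by a log-log fit; the intensities, rates, and exponent below are illustrative, not the paper's data.

    ```python
    import numpy as np

    # Illustrative measurement: photobleaching rate k vs. excitation intensity I.
    # For a power law k = A * I^n, the exponent n is the slope in log-log space.
    I = np.array([1.0, 2.0, 4.0, 8.0, 16.0])          # relative intensity
    k = 0.05 * I ** 2.1 * (1 + np.random.default_rng(9).normal(0, 0.03, 5))

    n, logA = np.polyfit(np.log(I), np.log(k), 1)
    print(f"fitted power dependence n = {n:.2f}")     # ~2 for single molecules
    ```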

  17. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    Science.gov (United States)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  18. Quantitative Analysis of the Nanopore Translocation Dynamics of Simple Structured Polynucleotides

    Science.gov (United States)

    Schink, Severin; Renner, Stephan; Alim, Karen; Arnaut, Vera; Simmel, Friedrich C.; Gerland, Ulrich

    2012-01-01

    Nanopore translocation experiments are increasingly applied to probe the secondary structures of RNA and DNA molecules. Here, we report two vital steps toward establishing nanopore translocation as a tool for the systematic and quantitative analysis of polynucleotide folding: (1) Using α-hemolysin pores and a diverse set of different DNA hairpins, we demonstrate that backward nanopore force spectroscopy is particularly well suited for quantitative analysis. In contrast to forward translocation from the vestibule side of the pore, backward translocation times do not appear to be significantly affected by pore-DNA interactions. (2) We develop and verify experimentally a versatile mesoscopic theoretical framework for the quantitative analysis of translocation experiments with structured polynucleotides. The underlying model is based on sequence-dependent free energy landscapes constructed using the known thermodynamic parameters for polynucleotide basepairing. This approach limits the adjustable parameters to a small set of sequence-independent parameters. After parameter calibration, the theoretical model predicts the translocation dynamics of new sequences. These predictions can be leveraged to generate a baseline expectation even for more complicated structures where the assumptions underlying the one-dimensional free energy landscape may no longer be satisfied. Taken together, backward translocation through α-hemolysin pores combined with mesoscopic theoretical modeling is a promising approach for label-free single-molecule analysis of DNA and RNA folding. PMID:22225801

  19. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    Sujay Rakshit; Arunita Rakshit; J. V. Patil

    2011-04-01

    Most traits of interest to medical, agricultural and animal scientists show continuous variation and a complex mode of inheritance. DNA-based markers are being deployed to analyse such complex traits, known as quantitative trait loci (QTL). In conventional QTL analysis, F2, backcross populations, recombinant inbred lines, backcross inbred lines and double haploids from biparental crosses are commonly used. Introgression lines and near-isogenic lines are also being used for QTL analysis. However, such populations have major limitations, such as relying predominantly on the recombination events taking place in the F1 generation and mapping only the allelic pairs present in the two parents. Second-generation mapping resources like association mapping, nested association mapping and multiparent intercross populations potentially address the major limitations of available mapping resources. The potential of multiparent intercross populations in gene mapping is discussed here. In such populations both linkage and association analysis can be conducted without encountering the limitations of structured populations. In such populations, larger genetic variation in the germplasm is accessed and various allelic and cytoplasmic interactions are assessed. For all practical purposes, across crop species, the use of eight founders and a fixed population of 1000 individuals is most appropriate. A limitation of multiparent intercross populations is that they require more time and resources to generate and are likely to show extensive segregation for developmental traits, limiting their use in the analysis of complex traits. However, multiparent intercross population resources are likely to bring a paradigm shift towards QTL analysis in plant species.

  20. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    Science.gov (United States)

    Doerner, Peter

    2008-03-01

    INTRODUCTION: The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, which are taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.
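
    For the simplest ruler-style measurements, the growth-rate arithmetic the protocol describes can be done in a few lines; the sketch below (Python rather than the Excel workflow mentioned above, with made-up numbers) computes per-interval and mean growth rates from recorded tip positions.

      import numpy as np

      # Hypothetical measurements: root tip position (mm) at each time point (h)
      times = np.array([0, 24, 48, 72, 96], dtype=float)   # hours
      lengths = np.array([2.1, 7.8, 14.0, 20.5, 26.9])     # mm, one seedling

      rates = np.diff(lengths) / np.diff(times)            # mm per hour
      print("growth rate per interval (mm/h):", rates.round(3))
      print("mean rate (mm/h):", rates.mean().round(3))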

  1. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard
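
    The skill metrics named above are straightforward to compute once modeled and observed departures are paired; a minimal Python sketch with invented values:

      import numpy as np

      def skill(obs, model):
          """Daily standard deviation, RMSE and correlation, as in the text."""
          err = model - obs
          return {"std_obs": np.std(obs),
                  "rmse": np.sqrt(np.mean(err ** 2)),
                  "corr": np.corrcoef(obs, model)[0, 1]}

      # Hypothetical F-region departures from the climatological mean at one station
      obs = np.array([0.3, -0.5, 0.1, -1.2, 0.8, -0.2])
      mod = np.array([0.2, -0.4, 0.3, -0.9, 0.5, 0.0])
      print(skill(obs, mod))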

  2. Modeling the Earth's radiation belts. A review of quantitative data based electron and proton models

    Science.gov (United States)

    Vette, J. I.; Teague, M. J.; Sawyer, D. M.; Chan, K. W.

    1979-01-01

    The evolution of quantitative models of the trapped radiation belts is traced to show how the knowledge of the various features has developed, or been clarified, by performing the required analysis and synthesis. The Starfish electron injection introduced problems in the time behavior of the inner zone, but this residue decayed away, and a good model of this depletion now exists. The outer zone electrons were handled statistically by a log normal distribution such that above 5 Earth radii there are no long term changes over the solar cycle. The transition region between the two zones presents the most difficulty, therefore the behavior of individual substorms as well as long term changes must be studied. The latest corrections to the electron environment based on new data are outlined. The proton models have evolved to the point where the solar cycle effect at low altitudes is included. Trends for new models are discussed; the feasibility of predicting substorm injections and solar wind high-speed streams make the modeling of individual events a topical activity.

  4. An integrated qualitative and quantitative modeling framework for computer‐assisted HAZOP studies

    DEFF Research Database (Denmark)

    Wu, Jing; Zhang, Laibin; Hu, Jinqiu

    2014-01-01

    ... and validated on a case study concerning a three-phase separation process. The multilevel flow modeling (MFM) methodology is used to represent the plant goals and functions. First, means-end analysis is used to identify and formulate the intention of the process design in terms of components, functions ... safety critical operations, its causes and consequences. The outcome is a qualitative hazard analysis of selected process deviations from normal operations and their consequences as input to a traditional HAZOP table. The list of unacceptable high-risk deviations identified by the qualitative HAZOP analysis is used as input for rigorous analysis and evaluation by the quantitative analysis part of the framework. To this end, dynamic first-principles modeling is used to simulate the system behavior and thereby complement the results of the qualitative analysis part. The practical framework for computer ...

  5. Quantitative model of the growth of floodplains by vertical accretion

    Science.gov (United States)

    Moody, J.A.; Troutman, B.M.

    2000-01-01

    A simple one-dimensional model is developed to quantitatively predict the change in elevation, over a period of decades, for vertically accreting floodplains. This unsteady model approximates the monotonic growth of a floodplain as an incremental but constant increase of net sediment deposition per flood for those floods of a partial duration series that exceed a threshold discharge corresponding to the elevation of the floodplain. Sediment deposition from each flood increases the elevation of the floodplain, and consequently the magnitude of the threshold discharge, resulting in a decrease in the number of floods and in the growth rate of the floodplain. Floodplain growth curves predicted by this model are compared to empirical growth curves based on dendrochronology and to direct field measurements at five floodplain sites. The model was used to predict the value of net sediment deposition per flood which best fits (in a least squares sense) the empirical and field measurements; these values fall within the range of independent estimates of the net sediment deposition per flood based on empirical equations. These empirical equations permit the application of the model to the estimation of growth for other floodplains throughout the world which do not have detailed data on sediment deposition during individual floods. Copyright (C) 2000 John Wiley and Sons, Ltd.
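
    A toy version of the model's feedback loop (constant net deposition per overtopping flood, with the threshold stage rising as the surface accretes) can be written in a few lines; the flood-stage distribution and parameter values below are invented for illustration, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def accrete(years=100, floods_per_year=3, deposit=0.02, z0=0.0):
          """Toy accretion model: a constant net deposit per flood that exceeds
          the threshold stage, which rises with the floodplain surface."""
          z = z0                      # floodplain elevation (m)
          history = []
          for _ in range(years):
              stages = rng.gumbel(loc=0.5, scale=0.4, size=floods_per_year)
              z += deposit * np.sum(stages > z)  # only overtopping floods deposit
              history.append(z)
          return np.array(history)

      elev = accrete()
      print(elev[[0, 9, 49, 99]])   # growth slows as the threshold rises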

  6. Quantitative Model for Estimating Soil Erosion Rates Using 137Cs

    Institute of Scientific and Technical Information of China (English)

    YANG Hao; CHANG Qing; et al.

    1998-01-01

    A quantitative model was developed to relate the amount of 137Cs lost from the soil profile to the rate of soil erosion. Following the mass balance model, the depth distribution pattern of 137Cs in the soil profile, the radioactive decay of 137Cs, the sampling year, and the difference in 137Cs fallout amount among years were taken into consideration. By introducing typical depth distribution functions of 137Cs into the model, detailed equations were obtained for different soils. The model shows that the rate of soil erosion is mainly controlled by the depth distribution pattern of 137Cs, the year of sampling, and the percentage reduction in total 137Cs. The relationship between the rate of soil loss and 137Cs depletion is neither linear nor logarithmic. The depth distribution pattern of 137Cs is a major factor for estimating the rate of soil loss, and the soil erosion rate is directly related to the fraction of 137Cs content near the soil surface. The influences of the radioactive decay of 137Cs, the sampling year, and the 137Cs input fraction are small compared with the other factors.
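
    As a hedged sketch of the simplest member of this model family, the code below converts a percentage reduction in total 137Cs inventory into an annual soil loss assuming an exponential depth profile, one of the typical depth distribution functions mentioned above. It omits the decay and fallout-input corrections of the full model (which are what make the full relationship neither linear nor logarithmic), and all parameter values are placeholders.

      import numpy as np

      def erosion_rate(pct_loss, h0=4.0, bulk_density=1.3, years=40):
          """Annual soil loss (kg m^-2 yr^-1) from the percentage reduction in
          total 137Cs, assuming a profile A(x) ~ exp(-x/h0) with h0 in cm for an
          uncultivated soil. Decay and year-to-year fallout differences, which
          the full model includes, are ignored here."""
          frac_remaining = 1.0 - pct_loss / 100.0
          depth_cm = -h0 * np.log(frac_remaining)   # eroded depth of topsoil
          # depth (m) * bulk density (kg/m^3), spread over the sampling period
          return depth_cm / 100.0 * bulk_density * 1000.0 / years

      print(round(erosion_rate(30.0), 2))   # kg per m^2 per year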

  7. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and the mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and the mean parenchymal CT value were still greater for LAM than for COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  8. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    Science.gov (United States)

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards for quantitation. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. Such a value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph on which the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as the measuring instrument, and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and can be readily accessed worldwide. This can be especially advantageous for countries where available resources for analytical equipment investments are scant. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation. Copyright © 2010 Elsevier B.V. All rights reserved.
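
    The calibration step reduces to fitting a line through the standards' colour signals and interpolating the sample; a minimal Python sketch with invented intensities:

      import numpy as np

      # Hypothetical colour signals (e.g. a mean channel intensity from ImageJ)
      # for nickel-dimethylglyoxime standards, plus one meteorite sample.
      conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])       # mg/L Ni standards
      signal = np.array([3.1, 21.0, 39.2, 56.8, 75.1])

      slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration
      sample_signal = 47.5
      print("Ni (mg/L):", round((sample_signal - intercept) / slope, 2))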

  9. Quantitative 3D investigation of Neuronal network in mouse spinal cord model

    Science.gov (United States)

    Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.

    2017-01-01

    The investigation of the neuronal network in mouse spinal cord models represents the basis for research on neurodegenerative diseases. In this framework, the quantitative analysis of single elements in different regions is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow for a quantitative investigation of the neuronal network. Exploiting the high coherence and high flux of synchrotron sources, X-ray phase-contrast multiscale tomography allows for the 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis on the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mouse spinal cord. Then, we compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a "database" characterizing the main features of the neuronal network for a comparative investigation of neurodegenerative diseases and therapies.
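
    One common ingredient of such spatial statistics is the nearest-neighbour distance distribution of the segmented motor-neuron positions; the sketch below computes it for hypothetical 3D coordinates (the paper's actual statistical analysis may differ).

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(1)
      neurons = rng.uniform(0, 500, size=(200, 3))   # hypothetical 3D coords (um)

      tree = cKDTree(neurons)
      d, _ = tree.query(neurons, k=2)                # k=1 is the point itself
      nn = d[:, 1]                                   # nearest-neighbour distances
      print("mean nearest-neighbour distance (um):", round(nn.mean(), 1))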

  10. Goal relevance as a quantitative model of human task relevance.

    Science.gov (United States)

    Tanner, James; Itti, Laurent

    2017-03-01

    The concept of relevance is used ubiquitously in everyday life. However, a general quantitative definition of relevance has been lacking, especially as pertains to quantifying the relevance of sensory observations to one's goals. We propose a theoretical definition for the information value of data observations with respect to a goal, which we call "goal relevance." We consider the probability distribution of an agent's subjective beliefs over how a goal can be achieved. When new data are observed, their goal relevance is measured as the Kullback-Leibler divergence between belief distributions before and after the observation. Theoretical predictions about the relevance of different obstacles in simulated environments agreed with the majority response of 38 human participants in 83.5% of trials, beating multiple machine-learning models. Our new definition of goal relevance is general, quantitative, explicit, and allows one to put a number onto the previously elusive notion of relevance of observations to a goal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
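
    The core computation is a single KL divergence between the belief distributions before and after an observation; a minimal sketch with an invented four-route example (the direction of the divergence and the use of bits are choices made here for illustration):

      import numpy as np

      def kl(p, q):
          """Kullback-Leibler divergence D(p || q) for discrete distributions."""
          p, q = np.asarray(p, float), np.asarray(q, float)
          mask = p > 0
          return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

      prior = [0.25, 0.25, 0.25, 0.25]        # beliefs over 4 routes to the goal
      posterior = [0.70, 0.10, 0.10, 0.10]    # after observing an obstacle
      print("goal relevance (bits):", round(kl(posterior, prior), 3))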

  11. Quantitative analysis of biopolymers by matrix-assisted laser desorption

    Energy Technology Data Exchange (ETDEWEB)

    Tang, K.; Allman, S.L.; Jones, R.B.; Chen, C.H. (Oak Ridge National Lab., TN (United States))

    1993-08-01

    During the past few years, major efforts have been made to use mass spectrometry to measure biopolymers because of the great potential benefit to biological and medical research. Although the theoretical details of the laser desorption and ionization mechanisms of MALDI are not yet fully understood, several models have been presented to explain the production of large biopolymer ions. In practice, it is very difficult to obtain reliable measurements of the absolute quantity of analytes by MALDI. If MALDI is to become a routine analytical tool, quantitative measurement capability must clearly be pursued. The oligonucleotides and protein samples used in this work were purchased from commercial sources. Nicotinic acid was used as the matrix for both types of biopolymer. This experiment shows that it is difficult to obtain absolute quantitative measurements of biopolymers using MALDI; however, internal calibration with molecules having similar chemical properties can be used to resolve these difficulties. Chemical reactions between biopolymers must be avoided to prevent the destruction of the analyte materials. 10 refs., 8 figs.

  12. Review on modelling aspects in reversed-phase liquid chromatographic quantitative structure-retention relationships

    Energy Technology Data Exchange (ETDEWEB)

    Put, R. [FABI, Department of Analytical Chemistry and Pharmaceutical Technology, Pharmaceutical Institute, Vrije Universiteit Brussel (VUB), Laarbeeklaan 103, B-1090 Brussels (Belgium); Vander Heyden, Y. [FABI, Department of Analytical Chemistry and Pharmaceutical Technology, Pharmaceutical Institute, Vrije Universiteit Brussel (VUB), Laarbeeklaan 103, B-1090 Brussels (Belgium)], E-mail: yvanvdh@vub.ac.be

    2007-10-29

    In the literature an increasing interest in quantitative structure-retention relationships (QSRR) can be observed. After a short introduction on QSRR and other strategies proposed to deal with the starting-point selection problem prior to method development in reversed-phase liquid chromatography, a number of papers dealing with QSRR models for reversed-phase liquid chromatography are reviewed. The main focus of this review is on the different modelling methodologies applied and the molecular descriptors used in the QSRR approaches. Besides two semi-quantitative approaches (i.e. principal component analysis and decision trees), these methodologies include artificial neural networks, partial least squares, uninformative variable elimination partial least squares, stochastic gradient boosting for tree-based models, random forests, genetic algorithms, multivariate adaptive regression splines, and two-step multivariate adaptive regression splines.
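
    As a generic illustration of the regression side of QSRR (not any specific model from the reviewed papers), the sketch below fits retention against synthetic molecular descriptors with partial least squares using scikit-learn:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      X = rng.normal(size=(40, 12))    # hypothetical molecular descriptors
      # synthetic retention driven by a few descriptors plus noise
      y = X[:, 0] * 2.0 + X[:, 3] - X[:, 7] + rng.normal(scale=0.3, size=40)

      pls = PLSRegression(n_components=3).fit(X, y)   # retention ~ descriptors
      print("R^2:", round(pls.score(X, y), 3))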

  13. Quantitative analysis of multiple components based on liquid chromatography with mass spectrometry in full scan mode.

    Science.gov (United States)

    Xu, Min Li; Li, Bao Qiong; Wang, Xue; Chen, Jing; Zhai, Hong Lin

    2016-08-01

    Although liquid chromatography with mass spectrometry in full scan mode can acquire all signals simultaneously over a large range and at low cost, it is rarely used in quantitative analysis due to several problems such as chromatographic drifts and peak overlap. In this paper, we propose a Tchebichef moment method for the simultaneous quantitative analysis of three active compounds in Qingrejiedu oral liquid based on three-dimensional spectra in full scan mode of liquid chromatography with mass spectrometry. After the Tchebichef moments were calculated directly from the spectra, quantitative linear models for the three active compounds were established by stepwise regression. All correlation coefficients were above 0.9978. The limits of detection and limits of quantitation were less than 0.11 and 0.49 μg/mL, respectively. The intra- and interday precisions were less than 6.54 and 9.47%, while recovery ranged from 102.56 to 112.15%. Owing to their multi-resolution and inherent invariance properties, Tchebichef moments provide favorable results even in the presence of peak shifting and overlap, unknown interferences and noise, so the method can be applied to the analysis of three-dimensional spectra in full scan mode of liquid chromatography with mass spectrometry.

  14. Spatial Quantitation of Drugs in Tissues Using Liquid Extraction Surface Analysis Mass Spectrometry Imaging

    Science.gov (United States)

    Swales, John G.; Strittmatter, Nicole; Tucker, James W.; Clench, Malcolm R.; Webborn, Peter J. H.; Goodwin, Richard J. A.

    2016-11-01

    Liquid extraction surface analysis mass spectrometry imaging (LESA-MSI) has been shown to be an effective tissue profiling and imaging technique, producing robust and reliable qualitative distribution images of one or more analytes in tissue sections. Here, we expand the use of LESA-MSI beyond qualitative analysis to a quantitative analytical technique by employing a mimetic tissue model previously shown to be applicable to MALDI-MSI quantitation. Liver homogenate was used to generate a viable and molecularly relevant control matrix for spiked drug standards, which can be frozen, sectioned and subsequently analyzed to generate calibration curves for quantifying unknown tissue section samples. The effects of extraction solvent composition, tissue thickness and solvent/tissue contact time were explored prior to any quantitative studies in order to optimize the LESA-MSI method across several different chemical entities. The use of an internal standard to normalize regional differences in ionization response across tissue sections was also investigated. Data are presented comparing quantitative results generated by LESA-MSI to LC-MS/MS. Subsequent analysis of adjacent tissue sections using DESI-MSI is also reported.

  16. Quantitative Modeling of Human-Environment Interactions in Preindustrial Time

    Science.gov (United States)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-04-01

    Quantifying human-environment interactions and anthropogenic influences on the environment prior to the Industrial Revolution is essential for understanding the current state of the earth system. This is particularly true for the terrestrial biosphere, but marine ecosystems and even climate were likely modified by human activities centuries to millennia ago. Direct observations are however very sparse in space and time, especially as one considers prehistory. Numerical models are therefore essential to produce a continuous picture of human-environment interactions in the past. Agent-based approaches, while widely applied to quantifying human influence on the environment in localized studies, are unsuitable for global spatial domains and Holocene timescales because of computational demands and large parameter uncertainty. Here we outline a new paradigm for the quantitative modeling of human-environment interactions in preindustrial time that is adapted to the global Holocene. Rather than attempting to simulate agency directly, the model is informed by a suite of characteristics describing those things about society that cannot be predicted on the basis of environment, e.g., diet, presence of agriculture, or range of animals exploited. These categorical data are combined with the properties of the physical environment in a coupled human-environment model. The model is, at its core, a dynamic global vegetation model with a module for simulating crop growth that is adapted for preindustrial agriculture. This allows us to simulate yield and calories for feeding both humans and their domesticated animals. We couple this basic caloric availability with a simple demographic model to calculate potential population, and, constrained by labor requirements and land limitations, we create scenarios of land use and land cover on a moderate-resolution grid. We further implement a feedback loop where anthropogenic activities lead to changes in the properties of the physical

  17. Quantitative microstructure analysis of polymer-modified mortars.

    Science.gov (United States)

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in the formation of the mortar microstructure, the phase distribution in the mortar was quantified using phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging-related topics are discussed. As a case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during the different stages from the early fresh mortar to the final hardened mortar. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  18. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    A comparison of ozone quantitative analysis by spectrophotometric and volumetric methods has been studied. The aim of this research is to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/minute. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results of this research show that the concentration and volume of the reagent affect the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.

  19. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  20. qfasar: quantitative fatty acid signature analysis with R

    Science.gov (United States)

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended; however, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
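
    The package itself is in R; as a language-neutral illustration of the basic QFASA idea, the Python sketch below recovers diet proportions by non-negative least squares on invented fatty acid signatures. Real QFASA applies calibration coefficients and other distance measures, so this is only a caricature of the estimator.

      import numpy as np
      from scipy.optimize import nnls

      # Hypothetical fatty acid signatures (columns sum to 1): 3 prey, 5 FAs
      prey = np.array([[0.30, 0.10, 0.05],
                       [0.20, 0.40, 0.10],
                       [0.25, 0.20, 0.50],
                       [0.15, 0.20, 0.20],
                       [0.10, 0.10, 0.15]])
      predator = prey @ np.array([0.6, 0.3, 0.1])   # synthetic "true" diet

      est, _ = nnls(prey, predator)                 # non-negative least squares
      print("diet proportions:", (est / est.sum()).round(3))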

  1. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of paintings has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  2. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten-thousand-hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shapes, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  3. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 were significantly up-regulated and 53 significantly down-regulated. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as proteins secreted from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  4. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    Science.gov (United States)

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process, and quantitative models were generated by the PLS (partial least squares) method for the quantification of the concentration of glycyrrhizin. The PLS quantitative model had good calibration and prediction performances (r(cal) = 0.9985, RMSEC = 0.044 mg·g(-1); r(val) = 0.9474, RMSEP = 0.124 mg·g(-1)), indicating that NIR spectroscopy can be used as a rapid determination method for the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content-γ-confidence tolerance intervals. Then the uncertainty was calculated and the uncertainty profile was drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin is above 1.56 mg·g(-1), since the uncertainty fell within the acceptable limits (λ = ±20%). The results showed that uncertainty assessment can be used with NIR quantitative models of glycyrrhizin at different concentrations, and they provide a reference for the uncertainty assessment of NIR quantitative analysis of other traditional Chinese medicines.

  5. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    Science.gov (United States)

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates pept

  7. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and, for each position within the myocardium, a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected, and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored so that they can later be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
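
    Typical parameters extracted from such a time-intensity profile include the baseline, the time to peak and the maximum upslope; a minimal Python sketch on a synthetic curve (the curve shape and parameter choices are illustrative only):

      import numpy as np

      t = np.arange(0, 60, 2.0)                       # s, one frame per heartbeat
      si = 10 + 50 / (1 + np.exp(-(t - 20) / 3.0))    # hypothetical myocardial curve

      baseline = si[:5].mean()                        # pre-contrast signal level
      time_to_peak = t[int(np.argmax(si))]
      max_upslope = np.max(np.gradient(si, t))        # common perfusion parameter

      print(f"baseline={baseline:.1f}, TTP={time_to_peak:.0f}s, "
            f"upslope={max_upslope:.2f}/s")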

  8. QTL analysis for some quantitative traits in bread wheat

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits, using data from four different mapping populations and several approaches of QTL analysis. Analysis of grain protein content (GPC) suggested that the major part of the genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main-effect QTL (M-QTL) with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL detected by the different methods for four growth-related traits taken together ranged from 37 to 40; the nine QTL that were detected by both single-locus and two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield-contributing traits in two populations allowed the detection of 25 and 50 QTL, respectively, by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM), and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and in wheat improvement through marker-aided selection.

  9. Mechanics of neutrophil phagocytosis: experiments and quantitative models.

    Science.gov (United States)

    Herant, Marc; Heinrich, Volkmar; Dembo, Micah

    2006-05-01

    To quantitatively characterize the mechanical processes that drive phagocytosis, we observed the FcγR-driven engulfment of antibody-coated beads of diameters 3 μm to 11 μm by initially spherical neutrophils. In particular, the time course of cell morphology, of bead motion and of cortical tension were determined. Here, we introduce a number of mechanistic models for phagocytosis and test their validity by comparing the experimental data with finite element computations for multiple bead sizes. We find that the optimal models involve two key mechanical interactions: a repulsion or pressure between cytoskeleton and free membrane that drives protrusion, and an attraction between cytoskeleton and membrane newly adherent to the bead that flattens the cell into a thin lamella. Other models such as cytoskeletal expansion or swelling appear to be ruled out as main drivers of phagocytosis because of the characteristics of bead motion during engulfment. We finally show that the protrusive force necessary for the engulfment of large beads points towards storage of strain energy in the cytoskeleton over a large distance from the leading edge (approximately 0.5 μm), and that the flattening force can plausibly be generated by the known concentrations of unconventional myosins at the leading edge.

  10. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

    This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into its equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  11. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    ... problem was the basis for an inverse problem, i.e., revealing the depth of a body, its location in space, and its physical properties. At the same time, this method has not received wide practical application, given the complexity of real geological media. Careful analysis of piezo- and seismoelectric anomalies shows that advanced methodologies developed in magnetic prospecting for complex physical-geological conditions can be applied to the quantitative analysis of these effects (Eppelbaum et al., 2000, 2001, 2010; Eppelbaum, 2010, 2011, 2015). Employment of these methodologies (improved modifications of the tangents, characteristic points and areal methods) for obtaining quantitative characteristics of ore bodies, environmental features and archaeological targets (models of a horizontal circular cylinder, sphere, thin bed, thick bed and thin horizontal plate were utilized) has demonstrated their effectiveness. Case study at the archaeological site Tel Kara Hadid: field piezoelectric observations were conducted at this ancient archaeological site, with gold-quartz mineralization, in southern Israel within the Precambrian terrain at the northern extension of the Arabian-Nubian Shield (Neishtadt et al., 2006). The site is located eight kilometers north of the town of Eilat, in an area of strong industrial noise. Ancient alluvial river terraces (extremely heterogeneous at a local scale, varying from boulders to silt) cover the quartz veins and complicate their identification. Piezoelectric measurements conducted over a quartz vein covered by surface sediments (approximately 0.4 m thick) produced a sharp (500 μV) piezoelectric anomaly. Values recorded over the host rocks (clays and shales of basic composition) were close to zero. The observed piezoelectric anomaly was successfully interpreted by the use of methodologies developed in magnetic prospecting. For effective integration of piezo- and ...

  12. A quantitative approach for comparing modeled biospheric carbon flux estimates across regional scales

    Directory of Open Access Journals (Sweden)

    D. N. Huntzinger

    2010-10-01

    Given the large differences between biospheric model estimates of regional carbon exchange, there is a need to understand and reconcile the predicted spatial variability of fluxes across models. This paper presents a set of quantitative tools that can be applied for comparing flux estimates in light of the inherent differences in model formulation. The presented methods include variogram analysis, variable selection, and geostatistical regression. These methods are evaluated in terms of their ability to assess and identify differences in spatial variability in flux estimates across North America among a small subset of models, as well as differences in the environmental drivers that appear to have the greatest control over the spatial variability of predicted fluxes. The examined models are the Simple Biosphere model (SiB 3.0), the Carnegie Ames Stanford Approach (CASA), and CASA coupled with the Global Fire Emissions Database (CASA GFEDv2), and the analyses are performed on model-predicted net ecosystem exchange, gross primary production, and ecosystem respiration. Variogram analysis reveals consistent seasonal differences in spatial variability among modeled fluxes at a 1°×1° spatial resolution. However, significant differences are observed in the overall magnitude of the carbon flux spatial variability across models, in both net ecosystem exchange and component fluxes. Results of the variable selection and geostatistical regression analyses suggest fundamental differences between the models in terms of the factors that control the spatial variability of predicted flux. For example, carbon flux is more strongly correlated with percent land cover in CASA GFEDv2 than in SiB or CASA. Some of these factors can be linked back to model formulation, and would have been difficult to identify simply by comparing net fluxes between models. Overall, the quantitative approach presented here provides a set of tools for comparing predicted grid-scale fluxes across
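
    Of the tools listed, the variogram is the most self-contained; the Python sketch below computes an isotropic empirical semivariogram for a synthetic flux field (the binning choices and data are invented for illustration):

      import numpy as np

      def empirical_variogram(coords, values, bins):
          """Isotropic empirical semivariogram: gamma(h) = 0.5 * mean squared
          difference of values for point pairs separated by distance ~h."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          sq = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)       # each pair counted once
          d, sq = d[iu], sq[iu]
          gamma = [sq[(d >= lo) & (d < hi)].mean()
                   for lo, hi in zip(bins[:-1], bins[1:])]
          return np.array(gamma)

      rng = np.random.default_rng(5)
      xy = rng.uniform(0, 10, size=(150, 2))                 # grid-cell centroids
      flux = np.sin(xy[:, 0]) + 0.3 * rng.normal(size=150)   # synthetic NEE field
      print(empirical_variogram(xy, flux, np.linspace(0, 5, 6)).round(3))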

  13. Quantitative Modeling of the Alternative Pathway of the Complement System.

    Science.gov (United States)

    Zewde, Nehemiah; Gorham, Ronald D; Dorado, Angel; Morikis, Dimitrios

    2016-01-01

    The complement system is an integral part of innate immunity that detects and eliminates invading pathogens through a cascade of reactions. The destructive effects of complement activation on host cells are inhibited through versatile regulators that are present in plasma and bound to membranes. Impairment in the capacity of these regulators to function properly results in autoimmune diseases. To better understand the delicate balance between complement activation and regulation, we have developed a comprehensive quantitative model of the alternative pathway. Our model incorporates a system of ordinary differential equations that describes the dynamics of the four steps of the alternative pathway under physiological conditions: (i) initiation (fluid phase), (ii) amplification (surfaces), (iii) termination (pathogen), and (iv) regulation (host cell and fluid phase). We have examined complement activation and regulation on different surfaces, using the cellular dimensions of a characteristic bacterium (E. coli) and host cell (human erythrocyte). In addition, we have incorporated neutrophil-secreted properdin into the model, highlighting the cross talk of neutrophils with the alternative pathway in coordinating innate immunity. Our study yields a series of time-dependent response data for all alternative pathway proteins, fragments, and complexes. We demonstrate the robustness of the alternative pathway on the surface of pathogens, where complement components were able to saturate the entire region in about 54 minutes, while occupying less than one percent of host-cell surfaces over the same time period. Our model reveals that tight regulation of complement starts in the fluid phase, in which propagation of the alternative pathway was inhibited through the dismantlement of fluid-phase convertases. Our model also depicts the intricate role that properdin released from neutrophils plays in initiating and propagating the alternative pathway during bacterial infection.
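
    The flavor of such an ODE formulation can be conveyed with a two-species caricature (an autocatalytic convertase dismantled by a fluid-phase regulator), integrated with SciPy; the species, rate constants and initial values below are invented and vastly simpler than the published model.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Toy caricature: convertase C is produced autocatalytically from
      # precursor P and dismantled by a fluid-phase regulator at rate kr.
      kf, kr = 0.8, 0.5

      def rhs(t, y):
          p, c = y
          return [-kf * p * c, kf * p * c - kr * c]

      sol = solve_ivp(rhs, (0, 50), [1.0, 0.01], dense_output=True)
      print("convertase at t=10, 50:", sol.sol([10, 50])[1].round(4))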

  14. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2. A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG site per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003. Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  15. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    Science.gov (United States)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system is a combination of a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed, and both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement of the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all the steps necessary to extract from the PA raw data the significant information required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms, regarding their influence on lateral resolution as well as on the signal-to-noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection, and on the classical delay-and-sum approach implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source combining diode and flash-lamp pumping. Compared with all-flash-lamp pumped systems it features a significantly higher
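
    Of the four reconstruction algorithms compared, delay-and-sum is the simplest to sketch; the Python fragment below forms an image by summing one-way-delayed channel data (one-way because the optoacoustic source emits directly, with no transmit path), with the array geometry and sampling values chosen arbitrarily for illustration.

      import numpy as np

      def delay_and_sum(rf, x_elems, z_grid, x_grid, c=1540.0, fs=40e6):
          """Minimal delay-and-sum reconstruction for optoacoustic data.
          rf has shape (n_elements, n_samples); delays are one-way."""
          img = np.zeros((len(z_grid), len(x_grid)))
          for iz, z in enumerate(z_grid):
              for ix, x in enumerate(x_grid):
                  d = np.hypot(x_elems - x, z)           # element-to-pixel distance
                  idx = np.round(d / c * fs).astype(int)  # delay in samples
                  valid = idx < rf.shape[1]
                  img[iz, ix] = rf[np.where(valid)[0], idx[valid]].sum()
          return img

      rf = np.random.default_rng(3).normal(size=(64, 2048))   # placeholder data
      elems = np.linspace(-0.0095, 0.0095, 64)                # 64 elements, 19 mm
      img = delay_and_sum(rf, elems, np.linspace(0.005, 0.03, 50),
                          np.linspace(-0.009, 0.009, 50))
      print(img.shape)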

  17. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models, including accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
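
    A minimal sketch of one of the models named above, a log-normal AFT model with left-censoring at a detection limit, fitted by maximum likelihood with SciPy on simulated data (all values hypothetical; the authors provide their own R code as supplemental material):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def lognormal_aft_nll(params, logy, censored, X):
    """Negative log-likelihood of a log-normal AFT model in which
    intensities below the detection limit are left-censored."""
    beta, sigma = params[:-1], np.exp(params[-1])
    z = (logy - X @ beta) / sigma
    ll_obs = norm.logpdf(z[~censored]) - np.log(sigma)  # observed peaks
    ll_cen = norm.logcdf(z[censored])                   # P(below the limit)
    return -(ll_obs.sum() + ll_cen.sum())

# simulated log peak intensities for two groups (hypothetical values)
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.integers(0, 2, n)])  # intercept + group
logy = X @ np.array([2.0, 0.5]) + rng.normal(0, 0.3, n)
censored = logy < 1.9                       # detection limit
logy = np.where(censored, 1.9, logy)

res = minimize(lognormal_aft_nll, np.zeros(3), args=(logy, censored, X))
print(res.x[:2])   # estimated intercept and group effect (true: 2.0, 0.5)
```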

  18. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    Science.gov (United States)

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, their relevance to disease pathogenesis and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none for another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved, and their inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, 7 specifically bind TDP-43-linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons during the depletion of TDP-43 but not of FUS, providing new insight into RNA-processing dysfunction as a cause or consequence of ALS/FTD.
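
    Cryptic exon inclusion is commonly quantified as a percent-spliced-in (PSI) ratio of junction reads; a minimal sketch of that standard ratio (not necessarily the exact statistic of the authors' pipeline):

```python
def percent_spliced_in(inclusion_reads, skipping_reads):
    """PSI of a cryptic exon: junction reads supporting inclusion
    relative to junction reads that splice past the cryptic exon."""
    total = inclusion_reads + skipping_reads
    return float("nan") if total == 0 else inclusion_reads / total

print(percent_spliced_in(42, 158))   # 0.21 -> 21% inclusion
```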

  19. An efficient approach to the quantitative analysis of humic acid in water.

    Science.gov (United States)

    Wang, Xue; Li, Bao Qiong; Zhai, Hong Lin; Xiong, Meng Yi; Liu, Ying

    2016-01-01

    Rayleigh and Raman scattering inevitably appear in fluorescence measurements, making quantitative analysis more difficult, especially when target signals and scattering signals overlap. Based on grayscale images of three-dimensional fluorescence spectra, a linear model with two selected Zernike moments was established for the determination of humic acid and applied to the quantitative analysis of a real sample taken from the Yellow River. The correlation coefficient (R^2) and leave-one-out cross-validation correlation coefficient (R^2cv) were up to 0.9994 and 0.9987, respectively. The average recovery was 96.28%. Compared with the N-way partial least squares and alternating trilinear decomposition methods, our approach was immune to the scattering and noise signals owing to its powerful multi-resolution characteristic, and the obtained results were more reliable and accurate; the approach could also be applied in food analyses.
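
    A minimal sketch of the approach, assuming the grayscale images are available as 2-D arrays: extract two Zernike-moment magnitudes (here via the mahotas library; the choice of moment indices is hypothetical) and fit a linear model:

```python
import numpy as np
import mahotas
from sklearn.linear_model import LinearRegression

def two_zernike_features(gray_img, radius=64, degree=8, idx=(3, 8)):
    """Two Zernike-moment magnitudes of a grayscale image rendered from
    a 3-D fluorescence spectrum ('idx' is a hypothetical choice)."""
    zm = mahotas.features.zernike_moments(gray_img, radius, degree=degree)
    return zm[list(idx)]

# toy stand-ins for grayscale EEM images and known humic acid levels
rng = np.random.default_rng(0)
images = [rng.random((128, 128)) for _ in range(6)]
conc = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])

X = np.array([two_zernike_features(im) for im in images])
model = LinearRegression().fit(X, conc)
print(model.score(X, conc))   # calibration R^2
```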

  20. Quantitative analysis of surface characteristics and morphology in Death Valley, California using AIRSAR data

    Science.gov (United States)

    Kierein-Young, K. S.; Kruse, F. A.; Lefkoff, A. B.

    1992-01-01

    The Jet Propulsion Laboratory Airborne Synthetic Aperture Radar (JPL-AIRSAR) is used to collect full polarimetric measurements at P-, L-, and C-bands. These data are analyzed using the radar analysis and visualization environment (RAVEN). The AIRSAR data are calibrated using in-scene corner reflectors to allow for quantitative analysis of the radar backscatter. RAVEN is used to extract surface characteristics, and inversion models are used to calculate quantitative surface-roughness values and fractal dimensions. These values are used to generate synthetic surface plots that represent the small-scale surface structure of areas in Death Valley. These procedures are applied to playa, smooth salt-pan, and alluvial-fan surfaces in Death Valley. Field measurements of surface roughness are used to verify the accuracy of the inversion results.

  1. Geothermal Power Plant Maintenance: Evaluating Maintenance System Needs Using Quantitative Kano Analysis

    Directory of Open Access Journals (Sweden)

    Reynir S. Atlason

    2014-07-01

    Full Text Available A quantitative Kano model is used in this study to identify which features top-level maintenance engineers within Icelandic geothermal power plants would prefer to see implemented in a maintenance tool or software. Visits were conducted to the largest Icelandic energy companies operating geothermal power plants. Thorough interviews with chiefs of operations and maintenance were used as a basis for a quantitative Kano analysis. Thirty-seven percent of all maintenance engineers at Reykjavik Energy and Landsvirkjun, responsible for 71.5% of the total energy production from geothermal resources in Iceland, answered the Kano questionnaire. Findings show that solutions focusing on (1) planning maintenance according to condition, (2) shortening documentation times, and (3) risk analysis are sought after by the energy companies but not provided for the geothermal sector specifically.
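
    The customer-satisfaction and dissatisfaction coefficients of a quantitative Kano analysis (after Berger et al.) can be computed directly from category counts; a minimal sketch with hypothetical responses:

```python
def kano_coefficients(counts):
    """Berger-style satisfaction (CS) and dissatisfaction (DS) coefficients
    from Kano category counts: A(ttractive), O(ne-dimensional),
    M(ust-be), I(ndifferent).  CS lies in [0, 1], DS in [-1, 0]."""
    A, O, M, I = (counts[k] for k in "AOMI")
    total = A + O + M + I
    return (A + O) / total, -(O + M) / total

# hypothetical responses for a "condition-based planning" feature
print(kano_coefficients({"A": 5, "O": 7, "M": 3, "I": 2}))
```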

  2. Quantitative proteomic analysis of human lung tumor xenografts treated with the ectopic ATP synthase inhibitor citreoviridin.

    Directory of Open Access Journals (Sweden)

    Yi-Hsuan Wu

    Full Text Available ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provide a comprehensive insight into the complex regulation by citreoviridin in a lung cancer xenograft model. With highly reproducible quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins, selected by their relative abundance, revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and glucose storage indicated that citreoviridin may reduce the glycolytic intermediates available for macromolecule synthesis and inhibit cell proliferation. The results of this comprehensive proteomic analysis identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.

  3. Imaging with low-voltage scanning transmission electron microscopy: A quantitative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Felisari, L. [TASC, INFM-CNR, S.S. 14, km 163.5, 34149 Trieste (Italy); Grillo, V., E-mail: vincenzo.grillo@unimore.it [Istituto Nanoscienze-S3 CNR, via Campi 213/A, 41125 Modena (Italy); IMEM-CNR Parco Area delle Scienze 37/A, 43124 Parma (Italy); Jabeen, F.; Rubini, S. [TASC, INFM-CNR, S.S. 14, km 163.5, 34149 Trieste (Italy); Menozzi, C. [Istituto Nanoscienze-S3 CNR, via Campi 213/A, 41125 Modena (Italy); Dipartimento di Fisica, Universita di Modena e Reggio Emilia Via G. Campi 213/A, 41100 Modena (Italy); Rossi, F. [IMEM-CNR Parco Area delle Scienze 37/A, 43124 Parma (Italy); Martelli, F. [TASC, INFM-CNR, S.S. 14, km 163.5, 34149 Trieste (Italy); IMM-CNR, via del Fosso del Cavaliere 100, 00133 Roma (Italy)

    2011-07-15

    A dedicated specimen holder has been designed to perform low-voltage scanning transmission electron microscopy in dark field mode. Different test samples, namely InGaAs/GaAs quantum wells, InGaAs nanowires and thick InGaAs layers, have been analysed to test the reliability of the model based on proportionality to the specimen mass-thickness, generally used for image intensity interpretation of scattering contrast processes. We found that the size of the probe, absorption and channelling must be taken into account to give a quantitative interpretation of image intensity. We develop a simple procedure to evaluate the probe-size effect and to obtain a quantitative indication of the absorption coefficient. Possible artefacts induced by channelling are pointed out. With the developed procedure, the low-voltage approach can be successfully applied for quantitative compositional analysis. The method is then applied to the estimation of the In content in the core of InGaAs/GaAs core-shell nanowires. Highlights: quantitative analysis of the composition by low-voltage STEM annular dark field; first evidence of channelling effects in low-voltage STEM in SEM; comparison between low-voltage and high-voltage STEM; evaluation of the absorption effects on the STEM intensity.

  4. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
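
    A minimal sketch of a branching-index computation in the spirit of AngioTool's metric, using skeletonization and a neighbour count; this is a simplified stand-in, not AngioTool's code:

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branching_index(vessel_mask, unit_area_px):
    """Branch points per unit area from a binary vessel mask."""
    skel = skeletonize(vessel_mask.astype(bool))
    # count the skeleton neighbours of every skeleton pixel
    neigh = convolve(skel.astype(int), np.ones((3, 3), int),
                     mode="constant") - skel
    branch_points = int(((neigh >= 3) & skel).sum())
    return branch_points / (vessel_mask.size / unit_area_px)

mask = np.zeros((7, 7), bool)
mask[3, :] = mask[:, 3] = True                  # a one-pixel-wide cross
print(branching_index(mask, unit_area_px=49))   # one branch point / 49 px
```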

  5. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  6. Correlation between two methods of florbetapir PET quantitative analysis.

    Science.gov (United States)

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated the performance of a commercially available standardized software program for the calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid-positive and amyloid-negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r^2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
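
    The reported regression makes the scale conversion and cut-point check easy to reproduce; a minimal sketch:

```python
def research_to_commercial_suvr(suvr_research):
    """Map a research-pipeline SUVr onto the commercial scale using
    the regression reported above."""
    return 0.9757 * suvr_research + 0.0299

# the published amyloid-positivity cutoff transfers almost unchanged
print(research_to_commercial_suvr(1.1))   # ~1.10
```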

  7. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    Science.gov (United States)

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective: The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for the evaluation of midpalatal suture maturation. Methods: The study included 131 subjects over 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic (ROC) curve. Results: The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under the curve = 0.794). Using fractal dimension to split the maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions: There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
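
    The fractal dimension of a skeletonized suture image is typically estimated by box counting; a minimal sketch (box sizes are illustrative, and the paper's exact procedure may differ):

```python
import numpy as np

def box_counting_dimension(skel, sizes=(2, 4, 8, 16, 32)):
    """Fractal dimension of a binary image as the slope of
    log N(s) versus log(1/s), where N(s) counts occupied s-by-s boxes."""
    counts = []
    for s in sizes:
        h = skel.shape[0] // s * s
        w = skel.shape[1] // s * s
        blocks = skel[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

print(round(box_counting_dimension(np.eye(64, dtype=bool)), 2))  # line -> ~1.0
```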

  8. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    Science.gov (United States)

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  9. From classical genetics to quantitative genetics to systems biology: modeling epistasis.

    Directory of Open Access Journals (Sweden)

    David L Aylor

    2008-03-01

    Full Text Available Gene expression data have been used in lieu of phenotype in both classical and quantitative genetic settings. These two disciplines have separate approaches to measuring and interpreting epistasis, which is the interaction between alleles at different loci. We propose a framework for estimating and interpreting epistasis from a classical experiment that combines the strengths of each approach. A regression analysis step accommodates the quantitative nature of expression measurements by estimating the effect of gene deletions plus any interaction. Effects are selected by significance such that a reduced model describes each expression trait. We show how the resulting models correspond to specific hierarchical relationships between two regulator genes and a target gene. These relationships are the basic units of genetic pathways and genomic system diagrams. Our approach can be extended to analyze data from a variety of experiments, multiple loci, and multiple environments.
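
    A minimal sketch of the regression step described above, fitting main effects of two regulator deletions plus their interaction (the epistasis term) with statsmodels on hypothetical expression data:

```python
import pandas as pd
import statsmodels.formula.api as smf

# expression of a target gene under deletions of regulators A and B
# (hypothetical data; 1 = gene deleted, 0 = wild type)
df = pd.DataFrame({
    "delA": [0, 0, 1, 1] * 6,
    "delB": [0, 1, 0, 1] * 6,
    "expr": [10.0, 7.9, 8.1, 8.0] * 6,
})
# a significant delA:delB coefficient is the epistasis estimate; the
# reduced (significant-terms-only) model implies the regulatory hierarchy
fit = smf.ols("expr ~ delA + delB + delA:delB", data=df).fit()
print(fit.params)
```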

  10. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten human health, so fast and sensitive techniques for detecting such residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was combined in this paper with an improved partial least-squares regression (PLSR) model for the quantitative determination of AO. The absorbance of herbal samples with different concentrations was obtained by THz time-domain spectroscopy (THz-TDS) in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for input interval selection, and improved the accuracy of the detection results. The experimental results indicated that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
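
    A minimal sketch of the idea, assuming absorbance spectra in a samples-by-frequencies matrix: compute Noda's synchronous 2-D correlation spectrum, use its diagonal to select an input interval, and fit a PLSR model (random stand-in data; not the authors' implementation):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def synchronous_2dcos(A):
    """Noda's synchronous 2-D correlation spectrum of a mean-centred
    absorbance matrix A (samples x frequencies)."""
    Ad = A - A.mean(axis=0)
    return Ad.T @ Ad / (A.shape[0] - 1)

rng = np.random.default_rng(1)
A = rng.random((20, 140))     # stand-in THz absorbance spectra
y = rng.random(20)            # stand-in AO concentrations

phi = synchronous_2dcos(A)
keep = np.argsort(np.diag(phi))[-30:]   # strongest autocorrelation bands
pls = PLSRegression(n_components=3).fit(A[:, keep], y)
print(pls.score(A[:, keep], y))         # calibration R^2
```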

  11. A quantitative model for assessing community dynamics of pleistocene mammals.

    Science.gov (United States)

    Lyons, S Kathleen

    2005-06-01

    Previous studies have suggested that species responded individualistically to the climate change of the last glaciation, expanding and contracting their ranges independently. Consequently, many researchers have concluded that community composition is plastic over time. Here I quantitatively assess changes in community composition over broad timescales and assess the effect of range shifts on community composition. Data on Pleistocene mammal assemblages from the FAUNMAP database were divided into four time periods (preglacial, full glacial, postglacial, and modern). Simulation analyses were designed to determine whether the degree of change in community composition is consistent with independent range shifts, given the distribution of range shifts observed. Results indicate that many of the communities examined in the United States were more similar through time than expected if individual range shifts were completely independent. However, in each time transition examined, there were areas of nonanalogue communities. I conducted sensitivity analyses to explore how the results were affected by the assumptions of the null model. Conclusions about changes in mammalian distributions and community composition are robust with respect to the assumptions of the model. Thus, whether because of biotic interactions or because of common environmental requirements, community structure through time is more complex than previously thought.
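
    A minimal sketch of a null model in this spirit: compare observed between-period community similarity to a distribution obtained by shuffling each species' occupancy independently (a simplified null, not the paper's exact simulation):

```python
import numpy as np

def jaccard(a, b):
    """Similarity of two species presence/absence vectors."""
    a, b = a.astype(bool), b.astype(bool)
    return (a & b).sum() / (a | b).sum()

def null_similarity(t1, t2, n_iter=1000, seed=0):
    """Null distribution of mean site similarity between two time
    periods when each species' occupancy shifts independently."""
    rng = np.random.default_rng(seed)
    sims = []
    for _ in range(n_iter):
        shuffled = np.column_stack([rng.permutation(col) for col in t2.T])
        sims.append(np.mean([jaccard(a, b) for a, b in zip(t1, shuffled)]))
    return np.array(sims)

rng = np.random.default_rng(2)
t1 = rng.random((10, 15)) < 0.4     # sites x species, period 1
t2 = rng.random((10, 15)) < 0.4     # sites x species, period 2
observed = np.mean([jaccard(a, b) for a, b in zip(t1, t2)])
null = null_similarity(t1, t2)
print(observed, (null >= observed).mean())   # one-sided p-value
```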

  12. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
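
    For two predictors, commonality analysis reduces to a small set of R-squared differences; a minimal sketch on simulated data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def r2(X, y):
    return LinearRegression().fit(X, y).score(X, y)

def commonality_two_predictors(x1, x2, y):
    """Partition R^2 into unique and common components, the
    two-predictor special case of commonality analysis."""
    r2_full = r2(np.column_stack([x1, x2]), y)
    u1 = r2_full - r2(x2[:, None], y)    # unique to x1
    u2 = r2_full - r2(x1[:, None], y)    # unique to x2
    return u1, u2, r2_full - u1 - u2     # common variance

rng = np.random.default_rng(3)
x1 = rng.normal(size=100)
x2 = 0.6 * x1 + rng.normal(size=100)     # correlated predictors
y = x1 + x2 + rng.normal(size=100)
print(commonality_two_predictors(x1, x2, y))
```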

  13. Quantitative risk analysis offshore-Human and organizational factors

    Energy Technology Data Exchange (ETDEWEB)

    Espen Skogdalen, Jon, E-mail: jon.espen.skogdalen@gmail.co [Department of Industrial Economics, Risk Management and Planning, University of Stavanger, 4036 Stavanger (Norway); Vinnem, Jan Erik, E-mail: jev@preventor.n [Department of Industrial Economics, Risk Management and Planning, University of Stavanger, 4036 Stavanger (Norway)

    2011-04-15

    Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models, in particular that the QRAs do not include human and organizational factors (HOF-factors). Norwegian and UK offshore legislation and guidelines require that the HOF-factors be included in the QRAs. A study of 15 QRAs shows that the factors are included to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF-factors at all. Relevant research projects have been conducted to fulfill the requirements of Level 3 analyses; at this level, there is a systematic collection of data related to HOF, the methods are systematic and documented, and the QRAs are adjusted accordingly. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include the HOF-factors in the model, describe them, and explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRAs and to speed up this development.

  14. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less-than-ideal speedup on our platform.

  15. Complex Politics: A Quantitative Semantic and Topological Analysis of UK House of Commons Debates

    CERN Document Server

    Gurciullo, Stefano; Pereda, María; Battiston, Federico; Patania, Alice; Poledna, Sebastian; Hedblom, Daniel; Oztan, Bahattin Tolga; Herzog, Alexander; John, Peter; Mikhaylov, Slava

    2015-01-01

    This study is a first, exploratory attempt to use quantitative semantics techniques and topological analysis to analyze systemic patterns arising in a complex political system. In particular, we use a rich data set covering all speeches and debates in the UK House of Commons between 1975 and 2014. By the use of dynamic topic modeling (DTM) and topological data analysis (TDA) we show that both members and parties feature specific roles within the system, consistent over time, and extract global patterns indicating levels of political cohesion. Our results provide a wide array of novel hypotheses about the complex dynamics of political systems, with valuable policy applications.

  16. Developmental quantitative genetic analysis of body weights and morphological traits in the turbot, Scophthalmus maximus

    Institute of Scientific and Technical Information of China (English)

    WANG Xinan; MA Aijun; MA Deyou

    2015-01-01

    In order to elucidate the genetic mechanism of growth traits in turbot during ontogeny, a developmental genetic analysis of the body weights, total lengths, standard lengths and body heights of turbots was conducted using mixed genetic models with additive-dominance effects, based on complete diallel crosses among four different strains of Scophthalmus maximus from Denmark, Norway, Britain, and France. Unconditional genetic analysis revealed that the unconditional additive effects for the four traits were more significant than the unconditional dominance effects; alternating expression of the additive and dominance effects was also observed for body weights, total lengths and standard lengths. Conditional analysis showed that the developmental periods with active gene expression for body weights, total lengths, standard lengths and body heights were 15–18, 15 and 21–24, 15 and 24, and 21 and 27 months of age, respectively. The proportions of unconditional/conditional variances indicated that the narrow-sense heritabilities of body weights, total lengths and standard lengths all increased systematically. The cumulative effects of the genes controlling the four quantitative traits were mainly additive, suggesting that selection is efficient for the genetic improvement of turbot. The conditional genetic procedure is a useful tool for understanding the expression of genes controlling developmental quantitative traits at a specific developmental period (t-1→t) during ontogeny. It is also important to determine the appropriate developmental period (t-1→t) for trait measurement in developmental quantitative genetic analysis in fish.

  17. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  18. Quantitative phase-field modeling of nonisothermal solidification in dilute multicomponent alloys with arbitrary diffusivities.

    Science.gov (United States)

    Ohno, Munekazu

    2012-11-01

    A quantitative phase-field model is developed for simulating microstructural pattern formation in nonisothermal solidification in dilute multicomponent alloys with arbitrary thermal and solutal diffusivities. By performing a matched asymptotic analysis, it is shown that the present model with antitrapping current terms reproduces the free-boundary problem of interest in the thin-interface limit. Convergence of the simulation outcome with decreasing interface thickness is demonstrated for nonisothermal free dendritic growth in binary alloys and for isothermal and nonisothermal free dendritic growth in a ternary alloy.

  19. A quantitative comparison of the TERA modeling and DFT magnetic resonance image reconstruction techniques.

    Science.gov (United States)

    Smith, M R; Nichols, S T; Constable, R T; Henkelman, R M

    1991-05-01

    The resolution of magnetic resonance images reconstructed using the discrete Fourier transform (DFT) algorithm is limited by the effective window generated by the finite data length. The transient error reconstruction approach (TERA) is an alternative reconstruction method based on autoregressive moving average (ARMA) modeling techniques. Quantitative measurements comparing the truncation artifacts present in DFT and TERA image reconstruction show that the modeling method substantially reduces these artifacts on "full" (256 X 256), "truncated" (256 X 192), and "severely truncated" (256 X 128) data sets without introducing the global amplitude distortion found in other modeling techniques. Two global measures for determining the success of modeling are suggested. Problem areas for one-dimensional modeling are examined and reasons for considering two-dimensional modeling are discussed. Analysis of both medical and phantom data reconstructions is presented.
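
    The contrast between the two reconstructions can be demonstrated in one dimension: zero-filling truncated k-space produces ringing and amplitude loss, while extrapolating the missing samples with a linear-prediction (AR) model, a simplified stand-in for TERA's ARMA modeling, recovers point-like features exactly in the noiseless case:

```python
import numpy as np

n, kept, order = 256, 128, 8
profile = np.zeros(n)
profile[[60, 100, 101, 180]] = [1.0, 2.0, 2.0, 1.5]   # point-like features
kspace = np.fft.fft(profile)

# DFT reconstruction of "severely truncated" data: zero-fill missing samples
trunc = np.zeros(n, complex)
trunc[:kept] = kspace[:kept]
zero_fill = np.fft.ifft(trunc).real

# linear-prediction (AR) extrapolation of the missing high-k samples
rows = np.array([kspace[i:i + order] for i in range(kept - order)])
coeffs, *_ = np.linalg.lstsq(rows, kspace[order:kept], rcond=None)
ext = trunc.copy()
for i in range(kept, n):
    ext[i] = ext[i - order:i] @ coeffs
modeled = np.fft.ifft(ext).real

print(np.abs(zero_fill - profile).max())   # large truncation artifact
print(np.abs(modeled - profile).max())     # near machine precision
```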

  20. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    Science.gov (United States)

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Aimed at the characteristics of spontaneous combustion gases, such as the variety of gases, the low limits of detection, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is first introduced by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. Parameter setting, sample preparation, feature variable abstraction and analysis model building are taken into consideration, and the methods of sample preparation, feature abstraction and analysis modeling are introduced in detail. Then, eleven kinds of gases were tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm(-1). The testing results show that the detection limit for all analytes is less than 2 x 10(-6). All the detection limits meet the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument, and the analysis method used in this paper is suitable for online measurement of spontaneous combustion gases.
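
    Quantification from FTIR absorbance typically rests on the Beer-Lambert law, with mixture concentrations recovered by non-negative least squares against reference spectra; a minimal sketch with random stand-in spectra, not the paper's calibration:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)
K = rng.random((300, 11))        # reference absorptivities of the 11 gases
c_true = np.zeros(11)
c_true[[0, 9, 10]] = [1.5e-6, 4.0e-6, 2.0e-6]   # CH4, CO, CO2 fractions
a = K @ c_true + rng.normal(0, 1e-9, 300)       # measured mixture spectrum

c_est, _ = nnls(K, a)            # non-negative concentration estimates
print(np.round(c_est[[0, 9, 10]] * 1e6, 2))     # ~[1.5, 4.0, 2.0] ppm
```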

  1. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature, and because of their complexity it is usually difficult to judge the resistance of a metagenomic software package to this genome length bias. Therefore, we have made a simple benchmark for the evaluation of "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on that simple task, it will surely fail on most real metagenomes. We applied the three software packages to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
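
    The genome length bias and its correction can be stated in a few lines: read counts scale with copies times genome length, so dividing counts by genome length recovers taxon proportions (hypothetical lengths):

```python
import numpy as np

genome_len = np.array([1.8e6, 4.6e6, 9.1e6])    # bp, hypothetical lengths
copies = np.array([100, 100, 100])              # equal copy numbers
reads = copies * genome_len / 150               # reads of ~150 bp each

raw = reads / reads.sum()                       # biased toward long genomes
weights = reads / genome_len
corrected = weights / weights.sum()             # recovers 1/3 : 1/3 : 1/3
print(raw.round(3), corrected.round(3))
```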

  2. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  3. Quantitative property-structural relation modeling on polymeric dielectric materials

    Science.gov (United States)

    Wu, Ke

    Nowadays, polymeric materials have attracted more and more attention in dielectric applications. But searching for a material with desired properties is still largely based on trial and error. To facilitate the development of new polymeric materials, heuristic models built using the Quantitative Structure Property Relationships (QSPR) techniques can provide reliable "working solutions". In this thesis, the application of QSPR on polymeric materials is studied from two angles: descriptors and algorithms. A novel set of descriptors, called infinite chain descriptors (ICD), are developed to encode the chemical features of pure polymers. ICD is designed to eliminate the uncertainty of polymer conformations and inconsistency of molecular representation of polymers. Models for the dielectric constant, band gap, dielectric loss tangent and glass transition temperatures of organic polymers are built with high prediction accuracy. Two new algorithms, the physics-enlightened learning method (PELM) and multi-mechanism detection, are designed to deal with two typical challenges in material QSPR. PELM is a meta-algorithm that utilizes the classic physical theory as guidance to construct the candidate learning function. It shows better out-of-domain prediction accuracy compared to the classic machine learning algorithm (support vector machine). Multi-mechanism detection is built based on a cluster-weighted mixing model similar to a Gaussian mixture model. The idea is to separate the data into subsets where each subset can be modeled by a much simpler model. The case study on glass transition temperature shows that this method can provide better overall prediction accuracy even though less data is available for each subset model. In addition, the techniques developed in this work are also applied to polymer nanocomposites (PNC). PNC are new materials with outstanding dielectric properties. As a key factor in determining the dispersion state of nanoparticles in the polymer matrix

  4. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    Science.gov (United States)

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential of digital image texture analysis for evaluating the severity of hip osteoarthritis (OA) and monitoring its progression. 19 textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom-developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS narrowing was computed for patients from the unilateral OA group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA severity. Texture analysis may contribute to the quantitative assessment of OA severity, the monitoring of OA progression and the evaluation of a chondroprotective therapy.
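
    Grey level non-uniformity is a run-length statistic (after Galloway); a minimal, simplified sketch computing it from horizontal runs of a quantized region:

```python
import numpy as np
from itertools import groupby

def glnu(region, levels=16):
    """Grey level non-uniformity from horizontal run lengths:
    sum over grey levels of (runs at that level)^2, divided by
    the total number of runs."""
    q = np.floor(region / region.max() * (levels - 1)).astype(int)
    runs_per_level = np.zeros(levels)
    for row in q:
        for g, _run in groupby(row):
            runs_per_level[g] += 1
    return (runs_per_level ** 2).sum() / runs_per_level.sum()

rng = np.random.default_rng(6)
print(glnu(rng.integers(0, 255, (64, 64)).astype(float)))
```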

  5. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searches. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents in the various classifications were analysed. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solutions they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements for, and statutory bars to, the patentability of nanotechnology inventions. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, the US patent with publication number US2008-019426A1 discusses an invention related to a lubricant composition. After studying this patent, it was classified under automobile parts, and it was deduced that it solves the problem of friction in the engine. One classification is thus based on the automobile part, while the other is based on the problem being solved; hence two classes, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a classification matrix was created.

  6. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study repor

  7. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    Full Text Available

    In this study, five prototypes of catchup were developed by replacing the sucrose in the formulation partially or totally with the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes were in color, consistency, mouthfeel, sweet taste and tomato taste, for which mean scores decreased as the sugar level was decreased, and in salty taste, whose mean scores increased with the decrease of sugar. In terms of sweetener aftertaste, the prototype 100% sweetened with NTM presented the highest mean score, but with no significant difference compared to the other prototypes containing sucrose; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. In terms of physicochemical characteristics, the differences were mainly in consistency, solids and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for use in reduced-calorie and no-sugar versions.

  8. Quantitative Analysis of AGV System in FMS Cell Layout

    Directory of Open Access Journals (Sweden)

    B. Ramana

    1997-01-01

    Full Text Available Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing systems (FMSs) due to their flexibility. The quantitative analysis of an AGV system is useful for determining the material flow rates, operation times, length of delivery, length of empty moves of the AGVs and the number of AGVs required for a typical FMS cell layout. The efficiency of a material handling system such as an AGV system can be improved by reducing the length of empty moves, which depends upon the despatching and scheduling methods. If these methods are not properly planned, the length of the empty move of an AGV is greater than the length of delivery. This results in an increase in material handling time, which in turn increases the number of AGVs required in the FMS cell. This paper presents a method for optimising the length of empty travel of AGVs in a typical FMS cell layout.

  9. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    Science.gov (United States)

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  10. Quantitative produced water analysis using mobile 1H NMR

    Science.gov (United States)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, rendering the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
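
    With an internal reference of known effective concentration, quantification reduces to a ratio of integrated signals scaled by protons per unit mass; a minimal sketch with illustrative numbers, not the published calibration:

```python
def oil_ppm(I_oil, I_ref, ppm_equiv_ref, h_oil=2.0, h_ref=1.0):
    """Oil content from integrated 1H signals, using the chloroform peak
    as internal reference.  h_* are effective protons per unit mass for
    each signal (illustrative: CHCl3 has 1 H; an 'average' oil CH2 ~2)."""
    return (I_oil / h_oil) / (I_ref / h_ref) * ppm_equiv_ref

# chloroform spike chosen to correspond to a 10 ppm-equivalent signal
print(oil_ppm(I_oil=3.2, I_ref=1.6, ppm_equiv_ref=10.0))   # -> 10.0 ppm
```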

  11. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    Science.gov (United States)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around the ventricles have recently been observed in T1-weighted brain magnetic resonance images of patients suffering from hepatic encephalopathy. The exact etiology of the magnetic resonance imaging (MRI) gray-scale changes is not fully understood. The objective of our study was to investigate, through quantitative means, (1) the amount of change to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current data base. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray-scale ranges coded as lesion were then mapped back to the original images to identify the distribution of the abnormality. Our results indicated that the disease process involved the pallidus, mesencephalon, and subthalamic regions.
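
    The histogram decomposition into regular-tissue and lesion classes can be sketched with a two-component Gaussian mixture (a stand-in for the paper's procedure; all intensities hypothetical):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def tissue_lesion_means(pixels, n_components=2):
    """Fit a Gaussian mixture to the intensity histogram and return the
    sorted component means; the highest mean marks the lesion class."""
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(pixels.reshape(-1, 1))
    return np.sort(gmm.means_.ravel())

rng = np.random.default_rng(7)
pixels = np.concatenate([rng.normal(90, 10, 5000),    # regular white matter
                         rng.normal(160, 12, 300)])   # high-intensity lesions
print(tissue_lesion_means(pixels))    # ~[90, 160]
```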

  12. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. O. Domanov

    2014-01-01

    Full Text Available The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St. Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial characteristics and citizens' identities. As the geographical position at the border of Russia provides the citizens with geopolitical alternatives to identify their location as a fortress defending the nation (as in the case of Kronstadt) or a bridge between cultures, the given study allows us to compare the reasons for these geopolitical choices of the inhabitants. Furthermore, the research aims at bridging the gap in the studies of European and multiple identity in Russian regions and provides a Northwest Russian perspective on the perpetual discussion about the subjective eastern border of Europe.

  13. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    Energy Technology Data Exchange (ETDEWEB)

    Haase, A.T.; Zupancic, M.; Cavert, W. [Univ. of Minnesota Medical School, Minneapolis, MN (United States)] [and others

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  14. Quantitative Analysis and Comparisons of EPON Protection Schemes

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; JIN Depeng; ZENG Lieguang; SU Li

    2005-01-01

    This paper presents the relationship between the intensity of network damage and network survivability, and then studies a method for quantitatively analyzing the survivability of tree networks. Based on this analysis, the survivability of Ethernet passive optical networks (EPONs) with three kinds of protection schemes (i.e., the trunk-fiber protection scheme, the node-fiber protection scheme, and the bus-fiber protection scheme) is discussed. Following this, comparisons of the survivability among these three kinds of protection schemes of EPON are put forward. The simulation results show that, when the coverage area is the same, the survivability of an EPON with the node-fiber protection scheme is better than that of an EPON with the trunk-fiber protection scheme, and when the number and distribution of optical network units (ONUs) are the same, the survivability of an EPON with the bus-fiber protection scheme is better than that of an EPON with the node-fiber protection scheme. Under the same constraints, an EPON with the bus-fiber protection scheme needs the least fiber when there are more than 12 ONU nodes. These results are useful not only for forecasting and evaluating the survivability of EPON access networks, but also for their topology design.

  15. Quantitative analysis of piperine in ayurvedic formulation by UV Spectrophotometry

    Directory of Open Access Journals (Sweden)

    Gupta Vishvnath

    2011-02-01

    Full Text Available A simple and reproducible UV-spectrophotometric method for the quantitative determination of piperine in Sitopaladi churna (STPLC) was developed and validated in the present work. The parameters linearity, precision, accuracy, and standard error were studied according to the Indian Herbal Pharmacopoeia. In this study a new, simple, rapid, sensitive, precise, and economic spectrophotometric method in the ultraviolet region was developed for the determination of piperine in market and laboratory herbal formulations of Sitopaladi churna, which were purchased from the local market and prepared in the laboratory, respectively, and evaluated as per the Indian Herbal Pharmacopoeia and WHO guidelines. The concentration of piperine present in the raw material of STPLC was found to be 1.45 ± 0.014 w/w in Piper longum fruits. Piperine shows an absorbance maximum at 342.5 nm, and hence the UV-spectrophotometric measurements were performed at 342.5 nm. The samples were prepared in methanol, and the method obeys Beer's law in the concentration range employed for evaluation. The content of piperine in the ayurvedic formulation was determined. The results of the analysis were validated statistically, and recovery studies confirmed the accuracy of the proposed method. Hence the proposed method can be used for the reliable quantification of piperine in the crude drug and its herbal formulation.
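
    For illustration, the calibration step of such a method amounts to fitting a straight line to absorbance readings of standards at 342.5 nm, per Beer's law. The sketch below is a minimal Python example with invented absorbance values, not the study's data.

    ```python
    import numpy as np

    # Hypothetical piperine standards in methanol, absorbance at 342.5 nm
    # (all values invented for illustration).
    conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])          # ug/mL
    absorbance = np.array([0.141, 0.279, 0.423, 0.558, 0.702])

    # Beer's law predicts A = m*c + b; fit the calibration line.
    m, b = np.polyfit(conc, absorbance, 1)

    # Coefficient of determination as a linearity check.
    pred = m * conc + b
    r2 = 1 - np.sum((absorbance - pred)**2) / np.sum((absorbance - absorbance.mean())**2)

    # Quantify an unknown formulation extract from its absorbance.
    a_sample = 0.350
    c_sample = (a_sample - b) / m
    print(f"slope={m:.4f}, intercept={b:.4f}, R^2={r2:.4f}, sample={c_sample:.2f} ug/mL")
    ```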

  16. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules, and find that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. After that, the resistive loss of large-size modules with different interconnection schemes is analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
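
    The ribbon-loss argument can be made concrete with one line of arithmetic: halving a cell halves the string current, and since ohmic loss scales as I^2*R, each of the (now twice as many) ribbons dissipates a quarter of the power, for a net 50% reduction. A sketch with assumed round numbers, not the paper's measured values:

    ```python
    # Why halved cells reduce ribbon resistive loss (illustrative numbers).
    I_full = 8.0      # A, operating current of a full-size cell
    R_ribbon = 0.005  # ohm, effective resistance of one interconnect ribbon

    # Full cell: current I through one ribbon -> loss = I^2 * R.
    loss_full = I_full**2 * R_ribbon

    # Halved cells: each half carries I/2 and twice as many ribbons are
    # needed, so total loss = 2 * (I/2)^2 * R = I^2 * R / 2.
    loss_half = 2 * (I_full / 2)**2 * R_ribbon

    print(f"full-cell ribbon loss:   {loss_full*1000:.0f} mW")
    print(f"halved-cell ribbon loss: {loss_half*1000:.0f} mW (50% lower)")
    ```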

  17. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    Science.gov (United States)

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.

  18. Power analysis of artificial selection experiments using efficient whole genome simulation of quantitative traits.

    Science.gov (United States)

    Kessner, Darren; Novembre, John

    2015-04-01

    Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50-100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates.
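
    A heavily simplified forward simulation conveys the flavor of such experiments: individuals carry a handful of additive QTL, the top fraction by trait value is selected each generation, and allele frequencies shift in response. Unlike the authors' whole-genome framework, this sketch ignores linkage and recombination (which the paper shows are crucial for power), and every parameter is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_ind, n_qtl, n_gen = 500, 10, 20
    effects = rng.normal(0, 1, n_qtl)    # additive QTL effect sizes
    freqs = np.full(n_qtl, 0.5)          # starting allele frequencies

    for gen in range(n_gen):
        # Diploid genotypes: 0/1/2 copies of each trait-increasing allele.
        geno = rng.binomial(2, freqs, size=(n_ind, n_qtl))
        trait = geno @ effects + rng.normal(0, 1, n_ind)   # plus noise
        # Upward truncation selection: breed from the top 20%.
        selected = geno[np.argsort(trait)[-n_ind // 5:]]
        freqs = selected.mean(axis=0) / 2

    print("allele frequencies after selection:", np.round(freqs, 2))
    ```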

  19. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

    Full Text Available Currently, the intensity of enterprise competition has increased as a result of a greater diversity of customer needs as well as the persistence of a long-term recession. The results of competition are becoming severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives for achieving operational excellence, and its importance has become ever more apparent. However, few companies effectively manage their supply chains, and the greatest difficulty lies in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods to increase supply chain visibility are still ambiguous. Building on the extant research on supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score from the Six Sigma methodology to evaluate and improve the level of supply chain visibility.
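
    The core computation of such a process-capability view is compact: treat a visibility metric as a process output and express its capability against a specification limit in sigma units. The metric, limit, and statistics below are assumptions for illustration, not values from the study.

    ```python
    from statistics import NormalDist

    # Hypothetical visibility metric: order-status update lead time (hours).
    mean, sigma = 6.0, 1.5   # observed process mean and standard deviation
    usl = 12.0               # upper specification limit promised to partners

    z_score = (usl - mean) / sigma               # capability in sigma units
    defect_rate = 1 - NormalDist().cdf(z_score)  # P(update later than spec)

    print(f"Z = {z_score:.2f} sigma, expected defect rate = {defect_rate:.4%}")
    ```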

  20. Communication about vaccinations in Italian websites: a quantitative analysis.

    Science.gov (United States)

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movements' websites and blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health institutions, private and scientific associations, and social movements.

  1. A quantitative model of human DNA base excision repair. I. Mechanistic insights.

    Science.gov (United States)

    Sokhansanj, Bahrad A; Rodrigue, Garry R; Fitch, J Patrick; Wilson, David M

    2002-04-15

    Base excision repair (BER) is a multistep process involving the sequential activity of several proteins that cope with spontaneous and environmentally induced mutagenic and cytotoxic DNA damage. Quantitative kinetic data on single proteins of BER have been used here to develop a mathematical model of the BER pathway. This model was then employed to evaluate mechanistic issues and to determine the sensitivity of pathway throughput to altered enzyme kinetics. Notably, the model predicts considerably less pathway throughput than observed in experimental in vitro assays. This finding, in combination with the effects of pathway cooperativity on model throughput, supports the hypothesis of cooperation during abasic site repair and between the apurinic/apyrimidinic (AP) endonuclease, Ape1, and the 8-oxoguanine DNA glycosylase, Ogg1. The quantitative model also predicts that for 8-oxoguanine and hydrolytic AP site damage, short-patch Polbeta-mediated BER dominates, with minimal switching to the long-patch subpathway. Sensitivity analysis of the model indicates that the Polbeta-catalyzed reactions have the most control over pathway throughput, although other BER reactions contribute to pathway efficiency as well. The studies within represent a first step in a developing effort to create a predictive model for BER cellular capacity.
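
    To illustrate how per-step kinetics determine pathway throughput, consider a toy mass-action model of a three-step sequential pathway; the slowest step dominates throughput, mirroring the paper's finding that the Polbeta-catalyzed reactions exert the most control. The rate constants are invented, not the measured BER kinetics the authors used.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy sequential pathway S0 -> S1 -> S2 -> repaired (rates per minute).
    k = [0.5, 0.05, 0.2]   # k[1] is the assumed rate-limiting step

    def rhs(t, y):
        s0, s1, s2, done = y
        return [-k[0]*s0,
                k[0]*s0 - k[1]*s1,
                k[1]*s1 - k[2]*s2,
                k[2]*s2]

    sol = solve_ivp(rhs, (0, 120), [1.0, 0.0, 0.0, 0.0])
    print(f"fraction repaired after 2 h: {sol.y[3, -1]:.2f}")
    ```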

  2. [Multiple dependent variables LS-SVM regression algorithm and its application in NIR spectral quantitative analysis].

    Science.gov (United States)

    An, Xin; Xu, Shuo; Zhang, Lu-Da; Su, Shi-Guang

    2009-01-01

    In the present paper, on the basis of the LS-SVM algorithm, we built a multiple-dependent-variable LS-SVM (MLS-SVM) regression model whose weights can be optimized, and gave the corresponding algorithm. Furthermore, we theoretically explained the relationship between MLS-SVM and LS-SVM. Sixty-four broomcorn samples were taken as experimental material, with a modeling-set to predicting-set ratio of 51:13. We first selected five weight groups randomly and uniformly in the interval [0, 1], and then, using the leave-one-out (LOO) rule, determined an appropriate weight group and the model parameters, including penalty parameters and kernel parameters, according to the criterion of minimum average relative error. A multiple-dependent-variable quantitative analysis model was then built from the NIR spectra to analyze three chemical constituents, protein, lysine, and starch, simultaneously. Finally, the average relative errors between actual values and those predicted by the model for the three components in the predicting set were 1.65%, 6.47% and 1.37%, respectively, and the correlation coefficients were 0.9940, 0.8392 and 0.8825, respectively. For comparison, LS-SVM was also utilized, for which the average relative errors were 1.68%, 6.25% and 1.47%, respectively, and the correlation coefficients were 0.9941, 0.8310 and 0.8800, respectively. MLS-SVM is thus comparable to LS-SVM in modeling and analysis performance, and both give satisfying results. The results show that the MLS-SVM model is capable of multi-component NIR quantitative analysis performed simultaneously, so the MLS-SVM algorithm offers a new multiple-dependent-variable quantitative analysis approach for chemometrics. In addition, the weights have a certain effect on the prediction performance of the MLS-SVM model, which is consistent with intuition and is validated in this study. Therefore, it is necessary to optimize the weights.
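
    For orientation, the single-output LS-SVM that MLS-SVM generalizes reduces to one linear solve in the dual variables: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]. The sketch below implements that solve with an RBF kernel on toy data; gamma and the kernel width are assumed values, not the paper's tuned parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, (40, 1))
    y = np.sin(X).ravel() + rng.normal(0, 0.1, 40)

    def rbf(A, B, width=1.0):
        d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
        return np.exp(-d2 / (2 * width**2))

    gamma, n = 100.0, len(y)          # penalty parameter (assumed)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]

    X_test = np.linspace(-3, 3, 5)[:, None]
    y_pred = rbf(X_test, X) @ alpha + b   # f(x) = sum_i alpha_i K(x, x_i) + b
    print(np.round(y_pred, 3))
    ```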

  3. Incorporation of caffeine into a quantitative model of fatigue and sleep.

    Science.gov (United States)

    Puckeridge, M; Fulcher, B D; Phillips, A J K; Robinson, P A

    2011-03-21

    A recent physiologically based model of human sleep is extended to incorporate the effects of caffeine on sleep-wake timing and fatigue. The model includes the sleep-active neurons of the hypothalamic ventrolateral preoptic area (VLPO), the wake-active monoaminergic brainstem populations (MA), their interactions with cholinergic/orexinergic (ACh/Orx) input to MA, and circadian and homeostatic drives. We model two effects of caffeine on the brain due to competitive antagonism of adenosine (Ad): (i) a reduction in the homeostatic drive and (ii) an increase in cholinergic activity. By comparing the model output to experimental data, constraints are determined on the parameters that describe the action of caffeine on the brain. In accord with experiment, the ranges of these parameters imply significant variability in caffeine sensitivity between individuals, with caffeine's effectiveness in reducing fatigue being highly dependent on an individual's tolerance, and past caffeine and sleep history. Although there are wide individual differences in caffeine sensitivity and thus in parameter values, once the model is calibrated for an individual it can be used to make quantitative predictions for that individual. A number of applications of the model are examined, using exemplar parameter values, including: (i) quantitative estimation of the sleep loss and the delay to sleep onset after taking caffeine for various doses and times; (ii) an analysis of the system's stable states showing that the wake state during sleep deprivation is stabilized after taking caffeine; and (iii) comparing model output successfully to experimental values of subjective fatigue reported in a total sleep deprivation study examining the reduction of fatigue with caffeine. This model provides a framework for quantitatively assessing optimal strategies for using caffeine, on an individual basis, to maintain performance during sleep deprivation.

  4. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Science.gov (United States)

    Hur, Kwang-Ho; Mueller, Joachim D

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.

  5. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.

  6. Quantitative nucleation and growth kinetics of gold nanoparticles via model-assisted dynamic spectroscopic approach.

    Science.gov (United States)

    Zhou, Yao; Wang, Huixuan; Lin, Wenshuang; Lin, Liqin; Gao, Yixian; Yang, Feng; Du, Mingming; Fang, Weiping; Huang, Jiale; Sun, Daohua; Li, Qingbiao

    2013-10-01

    Lacking quantitative experimental data and/or kinetic models that mathematically depict the redox chemistry and the crystallization issue, the bottom-up formation kinetics of gold nanoparticles (GNPs) remains a challenge. We measured the dynamic regime of GNPs synthesized by l-ascorbic acid (representing a chemical approach) and/or foliar aqueous extract (a biogenic approach) via in situ spectroscopic characterization and established a redox-crystallization model which allows quantitative and separate parameterization of the nucleation and growth processes. The main results can be summarized as follows: (I) an efficient approach, i.e., dynamic in situ spectroscopic characterization assisted by the redox-crystallization model, was established for quantitative analysis of the overall formation kinetics of GNPs in solution; (II) formation of GNPs by the chemical and the biogenic approaches experienced a slow nucleation stage followed by a growth stage which behaved as a mixed-order reaction, and, unlike the chemical approach, the biogenic method involved heterogeneous nucleation; (III) biosynthesis of flaky GNPs was a kinetically controlled process favored by relatively slow redox chemistry; and (IV) although GNP formation consists of two aspects, namely the redox chemistry and the crystallization issue, the latter was the rate-determining event that controls the dynamic regime of the whole physicochemical process.
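
    A hedged illustration of separate nucleation/growth parameterization is the classic two-step scheme of slow nucleation followed by autocatalytic growth (A -> B slow, A + B -> 2B fast). This is not necessarily the authors' exact rate law, and the rate constants below are invented; it simply shows how two parameters reproduce the sigmoidal kinetics described.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    k_nuc, k_growth = 1e-3, 0.5   # assumed rate constants

    def rhs(t, y):
        a, b = y   # a: dissolved precursor, b: particulate gold
        return [-k_nuc*a - k_growth*a*b,
                 k_nuc*a + k_growth*a*b]

    sol = solve_ivp(rhs, (0, 200), [1.0, 0.0],
                    t_eval=np.linspace(0, 200, 9))
    # Sigmoidal rise: slow nucleation stage, then rapid autocatalytic growth.
    print(np.round(sol.y[1], 3))
    ```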

  7. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
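
    The statistical core of the approach is compact: regress a stream-condition score on land-use stressor gradients, then score proposed future scenarios with the fitted model. The predictors and data below are placeholders, not the study's field measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 60
    mining = rng.uniform(0, 40, n)        # % of watershed area mined (assumed)
    residential = rng.uniform(0, 25, n)   # % residential land cover (assumed)
    condition = 80 - 0.9*mining - 0.6*residential + rng.normal(0, 5, n)

    # Multiple linear regression: condition ~ mining + residential.
    X = np.column_stack([np.ones(n), mining, residential])
    coef, *_ = np.linalg.lstsq(X, condition, rcond=None)

    # Scenario analysis: predicted condition under a proposed land-use mix.
    scenario = np.array([1.0, 30.0, 10.0])   # 30% mined, 10% residential
    print(f"predicted condition score: {scenario @ coef:.1f}")
    ```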

  8. Quantitative analysis of uncertainty from pebble flow in HTR

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Hao, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Fu, Li; Jiong, Guo; Ximing, Sun; Lidong, Wang [Institute of Nuclear and New Energy Technology (INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China)

    2015-12-15

    Highlights: • An uncertainty and sensitivity analysis model for pebble flow has been built. • Experiment and random walk theory are used to identify uncertainty of pebble flow. • Effects of pebble flow on the core parameters are identified by sensitivity analysis. • Uncertainty of core parameters due to pebble flow is quantified for the first time. - Abstract: In a pebble bed HTR, randomness exists in the flow of pebbles along the deterministic average flow lines, which is not possible to simulate with current reactor design codes for HTR, such as VSOP, due to the limitation of current computer capability. In order to study how the randomness of pebble flow affects the key parameters of the HTR, a new pebble flow model was set up and successfully transplanted into the VSOP code. In the new pebble flow model, mixing coefficients were introduced into the fixed flow lines to simulate the randomness of pebble flow. Numerical simulation and pebble flow experiments were used to determine the mixing coefficients. Sensitivity analysis led to the conclusion that the key parameters of the pebble bed HTR are not sensitive to the randomness in pebble flow. The uncertainty of maximum power density and power distribution caused by the randomness in pebble flow is very small, especially for the “multi-pass” scheme of fuel circulation adopted in the pebble bed HTR.

  9. The Quantitative Analysis to Inferior Oil with Electronic Nose Based on Adaptive Multilayer Stochastic Resonance

    Directory of Open Access Journals (Sweden)

    Hong Men

    2011-09-01

    Full Text Available This study takes triacylglycerol polymers, oxidized triacylglycerols, and low-carbon-number fatty acids as feature indices of inferior oil, and uses a bistable stochastic resonance signal-to-noise ratio analysis method for the quantitative analysis of inferior oil. The paper analyzes stochastic resonance, introduces the principle and structure of a detection system based on an adaptive multilayer stochastic resonance algorithm for the quantitative analysis of inferior oil, and, taking the adaptive double stochastic resonance model and inferior oil as an example, gives a simulation and numerical analysis of the system. The results show that the system can obtain more accurate information on the proportion of inferior oil. At the same time, this method can effectively solve the baseline drift problem of semiconductor gas sensors. The stochastic resonance method thus has broad application prospects for improving system performance.
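
    The underlying bistable system is easy to simulate: dx/dt = a*x - b*x^3 + A*sin(2*pi*f*t) + noise, integrated with the Euler-Maruyama scheme, with the output signal-to-noise ratio read off the periodogram at the drive frequency. All parameters below are illustrative, not the tuned values of the e-nose system.

    ```python
    import numpy as np

    a, b_coef, A, f = 1.0, 1.0, 0.3, 0.01   # double-well and drive (assumed)
    dt, n_steps, D = 0.01, 100_000, 0.4     # D: noise intensity (assumed)
    rng = np.random.default_rng(3)

    x = np.empty(n_steps)
    x[0] = 1.0
    for i in range(1, n_steps):
        t = (i - 1) * dt
        drift = a*x[i-1] - b_coef*x[i-1]**3 + A*np.sin(2*np.pi*f*t)
        x[i] = x[i-1] + drift*dt + np.sqrt(2*D*dt)*rng.normal()

    # Output SNR: periodogram peak at the drive frequency vs. local noise floor.
    spec = np.abs(np.fft.rfft(x - x.mean()))**2
    freqs = np.fft.rfftfreq(n_steps, dt)
    k = np.argmin(np.abs(freqs - f))
    noise_floor = np.median(spec[max(k - 20, 1):k + 20])
    print(f"SNR at drive frequency: {spec[k] / noise_floor:.1f}")
    ```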

  10. Quantitative Analysis of Micro-CT Imaging and Histopathological Signatures of Experimental Arthritis in Rats

    Directory of Open Access Journals (Sweden)

    Matthew D. Silva

    2004-10-01

    Full Text Available Micro-computed tomographic (micro-CT) imaging provides a unique opportunity to capture 3-D architectural information in bone samples. In this study of pathological joint changes in a rat model of adjuvant-induced arthritis (AA), quantitative analysis of bone volume and roughness were performed by micro-CT imaging and compared with histopathology methods and paw swelling measurement. Micro-CT imaging of excised rat hind paws (n = 10) stored in formalin consisted of approximately 600 30-μm slices acquired on a 512 × 512 image matrix with isotropic resolution. Following imaging, the joints were scored from H&E stained sections for cartilage/bone erosion, pannus development, inflammation, and synovial hyperplasia. From micro-CT images, quantitative analysis of absolute bone volumes and bone roughness was performed. Bone erosion in the rat AA model is substantial, leading to a significant decline in tarsal volume (27%). The result of the custom bone roughness measurement indicated a 55% increase in surface roughness. Histological and paw volume analyses also demonstrated severe arthritic disease as compared to controls. Statistical analyses indicate correlations among bone volume, roughness, histology, and paw volume. These data demonstrate that the destructive progression of disease in a rat AA model can be quantified using 3-D micro-CT image analysis, which allows assessment of arthritic disease status and efficacy of experimental therapeutic agents.

  11. Modeling the Effect of Polychromatic Light in Quantitative Absorbance Spectroscopy

    Science.gov (United States)

    Smith, Rachel; Cantrell, Kevin

    2007-01-01

    A laboratory experiment is conducted to give students practical experience with the principles of electronic absorbance spectroscopy. This straightforward approach creates a powerful tool for exploring many aspects of quantitative absorbance spectroscopy.
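
    The modeled effect is easy to reproduce numerically: if the instrument passes a band of wavelengths with unequal molar absorptivities, the detector averages transmittance, so the measured absorbance falls below the Beer's law line at higher concentrations. The absorptivities and concentrations below are assumed for illustration.

    ```python
    import numpy as np

    eps = np.array([9000.0, 12000.0, 15000.0])  # L/(mol*cm) across the band
    path = 1.0                                  # cm

    for c in (1e-5, 5e-5, 1e-4, 2e-4):          # mol/L
        T = 10.0 ** (-eps * path * c)           # per-wavelength transmittance
        A_measured = -np.log10(T.mean())        # detector averages intensity
        A_ideal = eps.mean() * path * c         # strictly monochromatic case
        print(f"c={c:.0e}: measured A={A_measured:.3f} vs Beer's law {A_ideal:.3f}")
    ```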

  12. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    Science.gov (United States)

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material which functions as a skin moisturizer and softener. It is therefore important to develop a quantitative analytical method offering a fast and reliable technique. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sampling technique can be successfully used to analyze VCO quantitatively in cream cosmetic preparations. Multivariate calibration using a partial least squares (PLS) model revealed a good relationship between the actual value and the FTIR-predicted value of VCO, with a coefficient of determination (R2) of 0.998.
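
    A minimal sketch of such a PLS calibration, using scikit-learn on synthetic stand-ins for the ATR spectra and reference VCO contents (so the fit statistic below is not the paper's reported R2):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    n_samples, n_points = 30, 200
    y = rng.uniform(0, 100, n_samples)          # % VCO in each cream (assumed)
    vco_band = rng.normal(0, 1, n_points)       # "VCO-like" spectral shape
    X = np.outer(y, vco_band) + rng.normal(0, 2.0, (n_samples, n_points))

    pls = PLSRegression(n_components=3)
    pls.fit(X, y)
    y_hat = pls.predict(X).ravel()

    r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
    print(f"calibration R^2 = {r2:.3f}")
    ```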

  13. A QUANTITATIVE ANALYSIS OF HANDOVER TIME AT MAC LAYER FOR WIRELESS MOBILE NETWORKS

    Directory of Open Access Journals (Sweden)

    Syed S. Rizvi

    2009-11-01

    Full Text Available Extensive studies have been carried out on reducing the handover time of wireless mobile networks at the medium access control (MAC) layer. However, none of them show the impact of reduced handover time on the overall performance of wireless mobile networks. This paper presents a quantitative analysis to show the impact of reduced handover time on the performance of wireless mobile networks. The proposed quantitative model incorporates many critical performance parameters involved in reducing the handover time for wireless mobile networks. In addition, we analyze the use of an active scanning technique with a comparatively shorter beacon interval in the handoff process. Our experiments verify that active scanning can reduce the overall handover time at the MAC layer if comparatively shorter beacon intervals are utilized for packet transmission. The performance measure adopted in this paper for experimental verification is network throughput under different network loads.
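
    A back-of-the-envelope budget shows why the beacon interval dominates active-scanning delay; the channel count and per-phase timings below are assumptions for illustration, not the paper's measured parameters.

    ```python
    # Rough MAC-layer handover budget (all timing values assumed).
    n_channels = 11                 # channels probed during active scanning
    auth_ms, reassoc_ms = 2.0, 3.0  # authentication and reassociation

    def handover_ms(beacon_interval_ms):
        # Per channel, the station waits roughly on the order of the beacon
        # interval to hear responses before moving on.
        return n_channels * beacon_interval_ms + auth_ms + reassoc_ms

    for bi in (100.0, 50.0, 20.0):
        print(f"beacon interval {bi:5.1f} ms -> handover ~ {handover_ms(bi):6.1f} ms")
    ```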

  14. A quantitative analysis of Salmonella Typhimurium metabolism during infection

    OpenAIRE

    Steeb, Benjamin

    2012-01-01

    In this thesis, Salmonella metabolism during infection was investigated. The goal was to gain a quantitative and comprehensive understanding of Salmonella in vivo nutrient supply, utilization and growth. To achieve this goal, we used a combined experimental / in silico approach. First, we generated a reconstruction of Salmonella metabolism ([1], see 2.1). This reconstruction was then combined with in vivo data from experimental mutant phenotypes to build a comprehensive quantitative in vivo...

  15. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft für Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  16. Quantitative analysis of flavanones and chalcones from willow bark.

    Science.gov (United States)

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research revealed that not only the salicylic alcohol derivatives but also the polyphenols contribute significantly to these effects. The quantitative analysis of the European Pharmacopoeia still focuses on the determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for the determination of as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species, as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin and eriodictyol and the chalcone chalconaringenin, a sequential acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness, and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 μM for naringenin and 0.45 μM for eriodictyol; the LOQ was 2.34 μM for naringenin and 1.35 μM for eriodictyol. The method is robust with regard to sample weight but sensitive to enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations.

  17. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    Science.gov (United States)

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

    Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) to determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptable in terms of surgical deformity correction.

  18. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of the sale price of vegetables that is constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (from low to high daily consumption); and (2) vegetable age (how long it lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with product age (a) = a + 1 and shelf life (t) = t + 1. A vegetable age of (a) = 0 means the vegetables last only one day once ordered and must then be discarded, whereas a + 1 means they last longer than a day, as with beet, white radish, and string beans. The shelf life refers to how long a vegetable is placed on a shelf in the supermarket, in line with the vegetable age. Following the cost-plus pricing method with a full costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of the non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is tied to the vegetable characteristics.
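
    The cost-plus rule described above can be written out directly; the cost figures, markup rate, and holding cost in this sketch are placeholders showing how the two categories are priced differently.

    ```python
    # Cost-plus (full costing) sale price; money values are placeholders.
    def sale_price(production, non_production, markup_rate,
                   holding_cost_per_day=0.0, shelf_days=0):
        full_cost = production + non_production + holding_cost_per_day * shelf_days
        return full_cost * (1 + markup_rate)

    spinach = sale_price(1000, 200, 0.30)                     # leafy: t = 0
    beet = sale_price(1000, 200, 0.30,
                      holding_cost_per_day=50, shelf_days=3)  # non-leafy
    print(f"leafy price: {spinach:.0f}, non-leafy price: {beet:.0f}")
    ```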

  19. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; Morea-Dalle Ore, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter) and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  20. Quantitative analysis of the individual dynamics of Psychology theses

    Directory of Open Access Journals (Sweden)

    Robles, Jaime R.

    2009-12-01

    Full Text Available Three cohorts of undergraduate psychology theses (n = 57) performed by final-year undergraduate psychology students from Universidad Católica Andrés Bello were monitored using 5 longitudinal measurements of progression. A generalized additive model for predicting the completion time of the theses is tested against two completion outcomes: early and delayed. Effect size measures favor a multiple-dimension model over a global progress model. The trajectory of the indicators through the 5 measurements allows the differentiation between early and delayed completion. The completion probabilities estimated by the dimensional model allow the identification of differential oscillation levels for the distinct completion times. The initial progression indicators allow the prediction of early completion with a 71% success rate, while the final measurement shows a success rate of 89%. The results support the effectiveness of the supervisory system and the analysis of the progression dynamics of the theses from a task-delay model, focused on the relationship between the amount of task completion and the deadlines.

  1. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the risk analysis model developed here, one decides whether control measures are suitable for implementation. Specifically, the analysis determines whether the benefits of a given control option outweigh its implementation cost.

  2. Herd immunity and pneumococcal conjugate vaccine: a quantitative model.

    Science.gov (United States)

    Haber, Michael; Barskey, Albert; Baughman, Wendy; Barker, Lawrence; Whitney, Cynthia G; Shaw, Kate M; Orenstein, Walter; Stephens, David S

    2007-07-20

    Invasive pneumococcal disease in older children and adults declined markedly after introduction in 2000 of the pneumococcal conjugate vaccine for young children. An empirical quantitative model was developed to estimate the herd (indirect) effects on the incidence of invasive disease among persons ≥5 years of age induced by vaccination of young children with 1, 2, or ≥3 doses of the pneumococcal conjugate vaccine, Prevnar (PCV7), containing serotypes 4, 6B, 9V, 14, 18C, 19F and 23F. From 1994 to 2003, cases of invasive pneumococcal disease were prospectively identified in Georgia Health District-3 (eight metropolitan Atlanta counties) by Active Bacterial Core surveillance (ABCs). From 2000 to 2003, vaccine coverage levels of PCV7 for children aged 19-35 months in Fulton and DeKalb counties (of Atlanta) were estimated from the National Immunization Survey (NIS). Based on incidence data and the estimated average number of doses received by 15 months of age, a Poisson regression model was fit, describing the trend in invasive pneumococcal disease in groups not targeted for vaccination (i.e., adults and older children) before and after the introduction of PCV7. Highly significant declines in all the serotypes contained in PCV7 in all unvaccinated populations (5-19, 20-39, 40-64, and >64 years) from 2000 to 2003 were found under the model. No significant change in incidence was seen from 1994 to 1999, indicating rates were stable prior to vaccine introduction. Among unvaccinated persons 5+ years of age, the modeled incidence of disease caused by PCV7 serotypes as a group dropped 38.4%, 62.0%, and 76.6% for 1, 2, and 3 doses, respectively, received on average by the population of children by the time they are 15 months of age. Incidence of serotypes 14 and 23F had consistent significant declines in all unvaccinated age groups. In contrast, the herd immunity effects on vaccine-related serotype 6A incidence were inconsistent. Increasing trends of non
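
    The trend model at the heart of this analysis can be sketched in a few lines: incidence is modeled as a log-linear function of the average number of doses received, so exp(beta*d) gives the modeled incidence ratio at d doses. The counts and coverage values below are fabricated for illustration only.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Annual case counts in an unvaccinated age group, 1994-2003 (fabricated),
    # and the average number of PCV7 doses received by young children.
    avg_doses = np.array([0, 0, 0, 0, 0, 0, 0.4, 1.0, 1.8, 2.4])
    cases = np.array([210, 205, 215, 198, 207, 211, 190, 150, 110, 75])

    X = sm.add_constant(avg_doses)
    fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()

    beta = fit.params[1]   # log incidence rate ratio per average dose
    for d in (1, 2, 3):
        print(f"{d} average dose(s): modeled decline = {1 - np.exp(beta*d):.1%}")
    ```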

  3. Linking antisocial behavior, substance use, and personality: an integrative quantitative model of the adult externalizing spectrum.

    Science.gov (United States)

    Krueger, Robert F; Markon, Kristian E; Patrick, Christopher J; Benning, Stephen D; Kramer, Mark D

    2007-11-01

    Antisocial behavior, substance use, and impulsive and aggressive personality traits often co-occur, forming a coherent spectrum of personality and psychopathology. In the current research, the authors developed a novel quantitative model of this spectrum. Over 3 waves of iterative data collection, 1,787 adult participants selected to represent a range across the externalizing spectrum provided extensive data about specific externalizing behaviors. Statistical methods such as item response theory and semiparametric factor analysis were used to model these data. The model and assessment instrument that emerged from the research shows how externalizing phenomena are organized hierarchically and cover a wide range of individual differences. The authors discuss the utility of this model for framing research on the correlates and the etiology of externalizing phenomena.

  4. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design stages in an automated NoC design flow.

  5. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    Science.gov (United States)

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T

  6. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino Salminen

    2015-10-01

    Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e. the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio, 3.59 (95% CI 1.94–6.63), was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary

  7. A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model

    Science.gov (United States)

    2007-06-01

    12th ICCRTS, "Adapting C2 to the 21st Century": A Quantitative Model-Driven Comparison of Command Approaches in an Adversarial Process Model. Lenahan identified metrics and techniques for adversarial C2 process modeling. We intend to further that work by developing a set of adversarial process ...

  8. Trophic relationships in an estuarine environment: A quantitative fatty acid analysis signature approach

    Science.gov (United States)

    Magnone, Larisa; Bessonart, Martin; Gadea, Juan; Salhi, María

    2015-12-01

    In order to better understand the functioning of aquatic environments, it is necessary to obtain accurate diet estimations in food webs. Their description should incorporate information about energy flow and the relative importance of trophic pathways. Fatty acids have been extensively used in qualitative studies of trophic relationships in food webs, and recently a new method to quantitatively estimate the diet of a single predator has been developed. In this study, a model of the aquatic food web was generated through quantitative fatty acid signature analysis (QFASA) to identify the trophic interactions among the species in Rocha Lagoon. The biological sampling over two consecutive annual periods was comprehensive enough to identify all functional groups in the aquatic food web (except birds and mammals). Heleobia australis seems to play a central role in this estuarine ecosystem: as both a grazer and a prey of several other species, H. australis probably transfers a large amount of energy to upper trophic levels. Most of the species in Rocha Lagoon have a wide range of prey items in their diet, reflecting a complex food web, which is characteristic of extremely dynamic environments such as estuaries. QFASA is a model for tracing and quantitatively estimating trophic pathways among species in an estuarine food web. The results obtained in the present work are a valuable contribution to the understanding of trophic relationships in Rocha Lagoon.
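
    A toy version of the QFASA idea: find the non-negative mixture of prey fatty acid signatures that best reproduces the predator's signature, then read the normalized weights as diet proportions. Real QFASA minimizes a different distance measure and applies calibration coefficients for predator lipid metabolism; the signatures below are invented.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    prey = np.array([            # columns: prey species, rows: fatty acids
        [0.30, 0.10, 0.05],
        [0.20, 0.40, 0.10],
        [0.35, 0.30, 0.25],
        [0.15, 0.20, 0.60],
    ])
    predator = np.array([0.18, 0.22, 0.31, 0.29])

    weights, _ = nnls(prey, predator)   # non-negative least squares fit
    diet = weights / weights.sum()      # normalize to diet proportions
    print("estimated diet proportions:", np.round(diet, 2))
    ```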

  9. Risco privado em infra-estrutura pública: uma análise quantitativa de risco como ferramenta de modelagem de contratos Private risk in public infrastructure: a quantitative risk analysis as a contract modeling tool

    Directory of Open Access Journals (Sweden)

    Luiz E. T. Brandão

    2007-12-01

    Full Text Available Public-private partnerships (PPP) are contractual arrangements in which the government assumes future obligations through guarantees and options. They are an alternative for increasing the efficiency of the State through a more efficient allocation of incentives and risks. However, the determination of the optimal level of guarantees and the allocation of risks themselves are usually performed subjectively, which may force the government to assume significant liabilities. This article proposes a model for the quantitative valuation of government guarantees in PPP projects by means of the real options methodology, and applies this model to a highway concession project. The authors analyze the impact of several levels of revenue guarantee on the value and risk of the project, as well as the expected value of the future government disbursement in each situation, concluding that the public authority can determine the optimal level of guarantee as a function of the desired degree of risk reduction, and that the design and contractual modeling of PPP projects can benefit from the quantitative tools presented here.
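
    The valuation step can be sketched with Monte Carlo: simulate annual revenue paths and discount the government's payouts under a minimum-revenue floor, max(G - R_t, 0). Every parameter below is an assumption, and a full treatment would use risk-neutral dynamics calibrated to the concession.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_paths, n_years = 20_000, 15
    r0, mu, sigma, rf = 100.0, 0.03, 0.25, 0.06  # revenue (M$), drift, vol, rate
    guarantee = 90.0                             # guaranteed annual revenue (M$)

    # Geometric Brownian motion revenue paths, one step per year.
    z = rng.normal(size=(n_paths, n_years))
    growth = np.exp((mu - 0.5 * sigma**2) + sigma * z)
    revenue = r0 * np.cumprod(growth, axis=1)

    # Government pays the shortfall whenever revenue drops below the floor.
    payouts = np.maximum(guarantee - revenue, 0.0)
    discount = np.exp(-rf * np.arange(1, n_years + 1))
    value = (payouts * discount).sum(axis=1).mean()
    print(f"expected PV of guarantee payments: {value:.1f} M$")
    ```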