WorldWideScience

Sample records for universal quantitative models

  1. Rotating universe models

    International Nuclear Information System (INIS)

    Tozini, A.V.

    1984-01-01

    A review is made of some properties of rotating Universe models. Gödel's model is identified as a generalized tilted model. Some properties of new solutions of Einstein's equations, which are rotating non-stationary Universe models, are presented and analyzed. These models contain Gödel's model as a particular case. Non-stationary cosmological models are found which generalize Gödel's metric in a way analogous to that in which Friedmann's model generalizes Einstein's. (L.C.)

  2. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  3. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere and is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  4. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  5. Models of the universe

    International Nuclear Information System (INIS)

    Dirac, P.A.M.

    1981-01-01

    Most models of the universe depend on the assumption of a uniform distribution of matter, and are thus rather crude, given the nonlinear nature of Einstein's field equations. Here, a model is proposed which avoids this smoothing-out process. A metric is obtained which is consistent with the assumption that the matter of the universe is concentrated mainly in stars, moving with the velocity of recession implied by Hubble's law. The solution obtained gives results comparable to those obtainable with the Schwarzschild metric, suitably adjusted to agree with the Einstein-de Sitter model at large distances

  6. Chaotic universe model.

    Science.gov (United States)

    Aydiner, Ekrem

    2018-01-15

    In this study, we consider nonlinear interactions between components such as dark energy, dark matter, matter and radiation in the framework of the Friedmann-Robertson-Walker space-time and propose a simple interaction model based on the time evolution of the densities of these components. By using this model we show that these interactions can be given by Lotka-Volterra type equations. We numerically solve these coupled equations and show that the interaction dynamics between dark energy-dark matter-matter or dark energy-dark matter-matter-radiation has a strange attractor for 0 > w_de > -1, w_dm ≥ 0, w_m ≥ 0 and w_r ≥ 0. These strange attractors with positive Lyapunov exponents clearly show that chaotic dynamics appears in the time evolution of the densities. These results suggest that the time evolution of the universe is chaotic. The present model may have the potential to solve some cosmological problems such as the singularity, cosmic coincidence, big crunch, big rip, horizon, oscillation, the emergence of the galaxies, matter distribution and large-scale organization of the universe. The model also connects the dynamics of competing species in biological systems with the dynamics of the time evolution of the universe, and offers a new perspective and a different scenario for the evolution of the universe.
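
    A minimal numerical sketch of the kind of Lotka-Volterra density evolution described above. The rate constants below are not the paper's cosmological couplings; they are the well-known chaotic four-species competitive parameters of Vano et al. (2006), used here as illustrative stand-ins:

      # Sketch: competitive Lotka-Volterra dynamics with a strange attractor.
      # Parameters are from Vano et al.'s chaotic four-species system, used
      # as placeholders for the paper's dark energy/dark matter couplings.
      import numpy as np
      from scipy.integrate import solve_ivp

      r = np.array([1.00, 0.72, 1.53, 1.27])
      a = np.array([[1.00, 1.09, 1.52, 0.00],
                    [0.00, 1.00, 0.44, 1.36],
                    [2.33, 0.00, 1.00, 0.47],
                    [1.21, 0.51, 0.35, 1.00]])

      def rhs(t, x):
          # dx_i/dt = r_i * x_i * (1 - sum_j a_ij * x_j)
          return r * x * (1.0 - a @ x)

      sol = solve_ivp(rhs, (0.0, 1000.0), [0.3, 0.3, 0.3, 0.3],
                      rtol=1e-9, atol=1e-12)
      print(sol.y[:, -1])  # long-time state; nearby starts diverge (chaos)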

  7. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  8. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Background: Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results: Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions: The established universal papillation assay platform should be widely applicable to a

  9. Quantitative structure - mesothelioma potency model ...

    Science.gov (United States)

    Cancer potencies of mineral and synthetic elongated particle (EP) mixtures, including asbestos fibers, are influenced by changes in fiber dose composition, bioavailability, and biodurability in combination with relevant cytotoxic dose-response relationships. A unique and comprehensive rat intra-pleural (IP) dose characterization data set with a wide variety of EP size, shape, crystallographic, chemical, and bio-durability properties facilitated extensive statistical analyses of 50 rat IP exposure test results for evaluation of alternative dose pleural mesothelioma response models. Utilizing logistic regression, maximum likelihood evaluations of thousands of alternative dose metrics based on hundreds of individual EP dimensional variations within each test sample, four major findings emerged: (1) data for simulations of short-term EP dose changes in vivo (mild acid leaching) provide superior predictions of tumor incidence compared to non-acid leached data; (2) sum of the EP surface areas (ΣSA) from these mildly acid-leached samples provides the optimum holistic dose response model; (3) progressive removal of dose associated with very short and/or thin EPs significantly degrades resultant ΣEP or ΣSA dose-based predictive model fits, as judged by Akaike’s Information Criterion (AIC); and (4) alternative, biologically plausible model adjustments provide evidence for reduced potency of EPs with length/width (aspect) ratios 80 µm. Regar
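
    A hedged sketch of the model-selection step described above: logistic dose-response fits for two alternative dose metrics, compared by AIC. The data below are synthetic placeholders, not the rat intra-pleural data set:

      # Sketch: compare alternative dose metrics by AIC of logistic fits.
      # Synthetic data; statsmodels assumed available.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 50
      dose_sa = rng.lognormal(0.0, 1.0, n)   # e.g., sum of EP surface areas
      dose_n = rng.lognormal(0.0, 1.0, n)    # e.g., count-based dose metric
      tumor = rng.binomial(1, dose_sa / (1.0 + dose_sa))  # synthetic outcomes

      for name, dose in [("sum SA", dose_sa), ("sum EP", dose_n)]:
          X = sm.add_constant(np.log(dose))
          fit = sm.Logit(tumor, X).fit(disp=0)
          print(name, "AIC =", round(fit.aic, 2))   # lower AIC = better metric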

  10. Quantitative Literacy at Michigan State University, 2: Connection to Financial Literacy

    Directory of Open Access Journals (Sweden)

    Dennis Gilliland

    2011-07-01

    The lack of capability in making financial decisions among the adult United States population has recently been described. A concerted effort to increase awareness of this crisis, to improve education in quantitative and financial literacy, and to simplify financial decision-making processes is critical to the solution. This paper describes a study undertaken to explore the relationship between quantitative literacy and financial literacy for entering college freshmen. In summer 2010, incoming freshmen to Michigan State University were assessed. Well-tested financial literacy items and validated quantitative literacy assessment instruments were administered to 531 subjects. Logistic regression models were used to assess the relationship between level of financial literacy and independent variables including quantitative literacy score, ACT mathematics score, and demographic variables including gender. The study establishes a strong positive association between quantitative literacy and financial literacy on top of the effects of the other independent variables. Adding one percent to the performance on a quantitative literacy assessment changes the odds of being at the highest level of financial literacy by a factor estimated to be 1.05. Gender is found to have a large, statistically significant effect as well, with being female changing the odds by a factor estimated to be 0.49.
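
    A worked illustration of the two effect sizes quoted above (odds multipliers of 1.05 per percentage point of quantitative literacy performance and 0.49 for female gender), applied to a hypothetical baseline:

      # Worked illustration of the reported odds ratios; the baseline odds
      # value is hypothetical, not from the study.
      base_odds = 0.25                   # hypothetical baseline odds of top level
      or_ql, or_female = 1.05, 0.49      # odds ratios reported in the abstract

      print(base_odds * or_ql ** 10)     # +10 points of QL performance: ~0.41
      print(base_odds * or_female)       # female at baseline: ~0.12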

  11. Hopping models and ac universality

    DEFF Research Database (Denmark)

    Dyre, Jeppe; Schrøder, Thomas

    2002-01-01

    Some general relations for hopping models are established. We proceed to discuss the universality of the ac conductivity which arises in the extreme disorder limit of the random barrier model. It is shown that the relevant dimension entering into the diffusion cluster approximation (DCA) is the harmonic (fracton) dimension of the diffusion cluster. The temperature scaling of the dimensionless frequency entering into the DCA is discussed. Finally, some open problems regarding ac universality are listed.

  12. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  13. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  14. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  15. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  16. Cosmic strings in an open universe: Quantitative evolution and observational consequences

    International Nuclear Information System (INIS)

    Avelino, P.P.; Caldwell, R.R.; Martins, C.J.

    1997-01-01

    The cosmic string scenario in an open universe is developed - including the equations of motion, a model of network evolution, the large angular scale cosmic microwave background (CMB) anisotropy, and the power spectrum of density fluctuations produced by cosmic strings with dark matter. We first derive the equations of motion for a cosmic string in an open Friedmann-Robertson-Walker (FRW) space-time. With these equations and the cosmic string stress-energy conservation law, we construct a quantitative model of the evolution of the gross features of a cosmic string network in a dust-dominated, Ω < 1 universe. [...] In a low density universe the string+CDM scenario is a better model for structure formation. We find that for cosmological parameters Γ = Ωh ∼ 0.1 - 0.2 in an open universe the string+CDM power spectrum fits the shape of the linear power spectrum inferred from various galaxy surveys. For Ω ∼ 0.2 - 0.4, the model requires a bias b ≳ 2 in the variance of the mass fluctuation on scales of 8 h⁻¹ Mpc. In the presence of a cosmological constant, the spatially flat string+CDM power spectrum requires a slightly lower bias than for an open universe of the same matter density. copyright 1997 The American Physical Society

  17. Universality of projectile fragmentation model

    International Nuclear Information System (INIS)

    Chaudhuri, G.; Mallik, S.; Das Gupta, S.

    2012-01-01

    Projectile fragmentation reactions are presently an important area of research, as they are used for the production of radioactive ion beams. In this work, the recently developed projectile fragmentation model with a universal temperature profile is used to study the charge distributions of different projectile fragmentation reactions, with different projectile-target combinations, at different incident energies. The model for projectile fragmentation consists of three stages: (i) abrasion, (ii) multifragmentation and (iii) evaporation

  18. Quantitative Analysis of Criteria in University Building Maintenance in Malaysia

    Directory of Open Access Journals (Sweden)

    Olanrewaju Ashola Abdul-Lateef

    2010-10-01

    University buildings are a significant part of university assets, and considerable resources are committed to their design, construction and maintenance. The core of maintenance management is to optimize productivity and user satisfaction with optimum resources. An important segment of the maintenance management system is the analysis of criteria that influence building maintenance. Therefore, this paper aims to identify, quantify, rank and discuss the criteria that influence maintenance costs, maintenance backlogs, productivity and user satisfaction in Malaysian university buildings. The paper reviews the related literature and presents the outcomes of a questionnaire survey. Questionnaires were administered to 50 university maintenance organizations. Thirty-one criteria were put to the university maintenance organizations to evaluate the degree to which each criterion influences building maintenance management. With a 66% response rate, it was concluded that consideration of the criteria is critical to the university building maintenance management system. The quality of components and materials, budget constraints and the age of the building were found to be the most influential criteria, while information on user performance satisfaction, problems associated with the in-house workforce and shortage of materials and components were the least influential criteria. The paper also concludes that maintenance management is a strategic function in university administration.

  19. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  20. Warm anisotropic inflationary universe model

    International Nuclear Information System (INIS)

    Sharif, M.; Saleem, Rabia

    2014-01-01

    This paper is devoted to the study of warm inflation using vector fields in the background of a locally rotationally symmetric Bianchi type I model of the universe. We formulate the field equations, and slow-roll and perturbation parameters (scalar and tensor power spectra as well as their spectral indices) in the slow-roll approximation. We evaluate all these parameters in terms of the directional Hubble parameter during the intermediate and logamediate inflationary regimes by taking the dissipation factor as a function of the scalar field as well as a constant. In each case, we calculate the observational parameter of interest, i.e., the tensor-scalar ratio in terms of the inflaton. The graphical behavior of these parameters shows that the anisotropic model is also compatible with WMAP7 and the Planck observational data. (orig.)

  2. A universal multilingual weightless neural network tagger via quantitative linguistics.

    Science.gov (United States)

    Carneiro, Hugo C C; Pedreira, Carlos E; França, Felipe M G; Lima, Priscila M V

    2017-07-01

    In the last decade, given the availability of corpora in several distinct languages, research on multilingual part-of-speech tagging started to grow. Amongst the novelties there is mWANN-Tagger (multilingual weightless artificial neural network tagger), a weightless neural part-of-speech tagger capable of being used for mostly-suffix-oriented languages. The tagger was subjected to corpora in eight languages of quite distinct natures and had a remarkable accuracy with very low sample deviation in every one of them, indicating the robustness of weightless neural systems for part-of-speech tagging tasks. However, mWANN-Tagger needed to be tuned for every new corpus, since each one required a different parameter configuration. For mWANN-Tagger to be truly multilingual, it should be usable for any new language with no need of parameter tuning. This article proposes a study that aims to find a relation between the lexical diversity of a language and the parameter configuration that would produce the best performing mWANN-Tagger instance. Preliminary analyses suggested that a single parameter configuration may be applied to the eight aforementioned languages. The mWANN-Tagger instance produced by this configuration was as accurate as the language-dependent ones obtained through tuning. Afterwards, the weightless neural tagger was further subjected to new corpora in languages that range from very isolating to polysynthetic ones. The best performing instances of mWANN-Tagger are again the ones produced by the universal parameter configuration. Hence, mWANN-Tagger can be applied to new corpora with no need of parameter tuning, making it a universal multilingual part-of-speech tagger. Further experiments with Universal Dependencies treebanks reveal that mWANN-Tagger may be extended and that it has potential to outperform most state-of-the-art part-of-speech taggers if better word representations are provided. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
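
    A minimal sketch in the spirit of the classical pooling approach described above, assuming each expert supplies a median and an error factor for an unknown event frequency and that judgements are combined on the log scale; all numbers are hypothetical:

      # Sketch: pool log-normal expert judgements of an event frequency.
      # Each expert gives a median and an error factor EF (95% bounds taken
      # as median*EF and median/EF); more dispersed experts are down-weighted.
      import numpy as np

      medians = np.array([1e-4, 3e-4, 5e-5])       # events per year (hypothetical)
      error_factors = np.array([3.0, 10.0, 3.0])
      weights = 1.0 / np.log(error_factors) ** 2   # precision-style weights

      mu = np.average(np.log(medians), weights=weights)  # pooled log-median
      print("pooled median: %.2e per year" % np.exp(mu))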

  5. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution, meaning that not only model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  6. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants; this aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the use of space as a medium of social communication).

  7. Quantifying Zika: Advancing the Epidemiology of Zika With Quantitative Models.

    Science.gov (United States)

    Keegan, Lindsay T; Lessler, Justin; Johansson, Michael A

    2017-12-16

    When Zika virus (ZIKV) emerged in the Americas, little was known about its biology, pathogenesis, and transmission potential, and the scope of the epidemic was largely hidden, owing to generally mild infections and no established surveillance systems. Surges in congenital defects and Guillain-Barré syndrome alerted the world to the danger of ZIKV. In the context of limited data, quantitative models were critical in reducing uncertainties and guiding the global ZIKV response. Here, we review some of the models used to assess the risk of ZIKV-associated severe outcomes, the potential speed and size of ZIKV epidemics, and the geographic distribution of ZIKV risk. These models provide important insights and highlight significant unresolved questions related to ZIKV and other emerging pathogens. Published by Oxford University Press for the Infectious Diseases Society of America 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  8. Sustaining Community-University Collaborations: The Durham University Model

    Directory of Open Access Journals (Sweden)

    Andrew Russell

    2011-11-01

    Durham University has initiated a community outreach and engagement program based on an evolving multifaceted model. This article analyses the components of the model and looks at how our work at Durham has become increasingly embedded in the structures and processes of the university as it has developed. The strengths and weaknesses in what has been achieved are highlighted, as is the future vision for the further development of this innovative community-university program. Keywords: public engagement; community partnerships; employer-supported volunteering; corporate social responsibility

  9. Model of the static universe within GR

    International Nuclear Information System (INIS)

    Karbanovski, V. V.; Tarasova, A. S.; Salimova, A. S.; Bilinskaya, G. V.; Sumbulov, A. N.

    2011-01-01

    Within GR, the problems of the Robertson-Walker universe are discussed. An approach based on a transition to a nondiagonal line element is suggested, and within this approach a static universe model is investigated. The possibility of constructing scenarios without an initial singularity and “exotic” matter is discussed, as is the agreement of the given model with the properties of the observable universe.

  10. University Administration on a Political Model.

    Science.gov (United States)

    Walker, Donald E.

    1979-01-01

    It is suggested that recognizing the university as a political community may lead to better management and organization. The patriarchal role, the president as hero, dispersed power, how the university really functions, and a political model are described. (MLW)

  11. University Satellite Campus Management Models

    Science.gov (United States)

    Fraser, Doug; Stott, Ken

    2015-01-01

    Among the 60 or so university satellite campuses in Australia are many that are probably failing to meet the high expectations of their universities and the communities they were designed to serve. While in some cases this may be due to the demand driven system, it may also be attributable in part to the ways in which they are managed. The…

  12. A Model of Nonsingular Universe

    Directory of Open Access Journals (Sweden)

    Changjun Gao

    2012-07-01

    In the background of a Friedmann–Robertson–Walker universe, there exists Hawking radiation coming from the cosmic apparent horizon due to quantum effects. Although Hawking radiation can be safely neglected in the late-time evolution of the universe, it plays an important role in its very early stage. In view of this, we identify the temperature in the scalar field potential with the Hawking temperature of the cosmic apparent horizon. We then find a nonsingular universe sourced by the temperature-dependent scalar field. We find that the universe could be created from a de Sitter phase which has the Planck energy density. Thus the Big Bang singularity is avoided.

  13. Black Hole Universe Model and Dark Energy

    Science.gov (United States)

    Zhang, Tianxi

    2011-01-01

    Considering a black hole as spacetime and slightly modifying the big bang theory, the author has recently developed a new cosmological model called the black hole universe, which is consistent with the Mach principle and Einsteinian general relativity and self-consistently explains various observations of the universe without difficulties. According to this model, the universe originated from a hot star-like black hole and gradually grew through a supermassive black hole to the present universe by accreting ambient material and merging with other black holes. The entire space is infinitely and hierarchically layered and evolves iteratively. The innermost three layers are the universe in which we live, the outside space called the mother universe, and the inside star-like and supermassive black holes called child universes. The outermost layer has an infinite radius and zero limits for both the mass density and absolute temperature. All layers or universes are governed by the same physics, the Einstein general relativity with the Robertson-Walker metric of spacetime, and tend to expand outward physically. When one universe expands out, a new similar universe grows up from its inside black holes. The origin, structure, evolution, expansion, and cosmic microwave background radiation of the black hole universe have been presented in a recent sequence of American Astronomical Society (AAS) meetings and published in peer-reviewed journals. This study will show how this new model explains the acceleration of the universe and why dark energy is not required. We will also compare the black hole universe model with the big bang cosmology.

  14. Virtual Universities: Current Models and Future Trends.

    Science.gov (United States)

    Guri-Rosenblit, Sarah

    2001-01-01

    Describes current models of distance education (single-mode distance teaching universities, dual- and mixed-mode universities, extension services, consortia-type ventures, and new technology-based universities), including their merits and problems. Discusses future trends in potential student constituencies, faculty roles, forms of knowledge…

  15. Virtual Models of European Universities

    DEFF Research Database (Denmark)

    Pedersen, Sanya Gertsen

    2003-01-01

    The study provides a detailed report on the current and possible future use of ICT by European universities for educational and organisational purposes. The report presents: • A general description of the current situation regarding the use of ICT in EU universities in both the educational...... and the organisational setting. • An in-depth study of selected institutions through case studies. • A future-oriented analysis. • A set of recommendations for future action....

  16. Quantitative histological models suggest endothermy in plesiosaurs

    Directory of Open Access Journals (Sweden)

    Corinna V. Fleischle

    2018-06-01

    Background: Plesiosaurs are marine reptiles that arose in the Late Triassic and survived to the Late Cretaceous. They have a unique and uniform bauplan and are known for their very long neck and hydrofoil-like flippers. Plesiosaurs are among the most successful vertebrate clades in Earth’s history. Based on bone mass decrease and cosmopolitan distribution, both of which affect lifestyle, indications of parental care, and oxygen isotope analyses, evidence for endothermy in plesiosaurs has accumulated. Recent bone histological investigations also provide evidence of fast growth and elevated metabolic rates. However, quantitative estimations of metabolic rates and bone growth rates in plesiosaurs have not been attempted before. Methods: Phylogenetic eigenvector maps is a method for estimating trait values from a predictor variable while taking into account phylogenetic relationships. As predictor variable, this study employs vascular density, measured in bone histological sections of fossil eosauropterygians and extant comparative taxa. We quantified vascular density as primary osteon density, thus the proportion of vascular area (including lamellar infillings of primary osteons) to total bone area. Our response variables are bone growth rate (expressed as local bone apposition rate) and resting metabolic rate (RMR). Results: Our models reveal bone growth rates and RMRs for plesiosaurs that are in the range of birds, suggesting that plesiosaurs were endothermic. Even for basal eosauropterygians we estimate values in the range of mammals or higher. Discussion: Our models are influenced by the availability of comparative data, which are lacking for large marine amniotes, potentially skewing our results. However, our statistically robust inference of fast growth and fast metabolism is in accordance with other evidence for plesiosaurian endothermy. Endothermy may explain the success of plesiosaurs, including their survival of the end-Triassic extinction
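
    A minimal sketch of the estimation idea behind phylogenetic eigenvector maps, assuming a hypothetical distance matrix and trait data: eigenvectors of the double-centered phylogenetic distance matrix enter a linear model alongside the histological predictor:

      # Sketch of the phylogenetic-eigenvector-maps idea; all data below are
      # hypothetical, not the study's taxa or measurements.
      import numpy as np

      D = np.array([[0., 2., 4., 4.],      # pairwise phylogenetic distances
                    [2., 0., 4., 4.],
                    [4., 4., 0., 2.],
                    [4., 4., 2., 0.]])
      n = len(D)
      J = np.eye(n) - np.ones((n, n)) / n
      B = -0.5 * J @ (D ** 2) @ J          # double-centering (PCoA step)
      vals, vecs = np.linalg.eigh(B)
      E = vecs[:, [-1]]                    # leading phylogenetic eigenvector

      vasc = np.array([0.10, 0.15, 0.30, 0.35])  # primary osteon densities
      rmr = np.array([1.0, 1.3, 2.8, 3.1])       # known resting metabolic rates
      X = np.column_stack([np.ones(n), vasc, E])
      beta, *_ = np.linalg.lstsq(X, rmr, rcond=None)
      print(beta)  # a fossil's RMR is then predicted from its own row of X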

  17. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    Science.gov (United States)

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  18. Emergent universe model with dissipative effects

    Science.gov (United States)

    Debnath, P. S.; Paul, B. C.

    2017-12-01

    An emergent universe model is presented in the general theory of relativity with an isotropic fluid in addition to viscosity. We obtain cosmological solutions that permit an emergent universe scenario in the presence of bulk viscosity described by either the Eckart theory or the truncated Israel-Stewart (TIS) theory, and the stability of the solutions is studied. The emergent universe (EU) model is then confronted with observational data. In the presence of viscosity one obtains an emergent universe scenario that is not permitted in the absence of viscosity. The EU model is compatible with cosmological observations.

  19. The Offshore Model for Universities

    Science.gov (United States)

    Ross, Andrew

    2008-01-01

    This article discusses the ongoing effort of the World Trade Organization (WTO) to bring higher education services within the purview of the General Agreement on Trade and Services (GATS). One result of the anticipated liberalization of trade in education, the author explains, is the headlong rush of Anglophone universities into the global market…

  20. University Students' Meta-Modelling Knowledge

    Science.gov (United States)

    Krell, Moritz; Krüger, Dirk

    2017-01-01

    Background: As one part of scientific meta-knowledge, students' meta-modelling knowledge should be promoted on different educational levels such as primary school, secondary school and university. This study focuses on the assessment of university students' meta-modelling knowledge using a paper-pencil questionnaire. Purpose: The general purpose…

  1. Modeling conflict : research methods, quantitative modeling, and lessons learned.

    Energy Technology Data Exchange (ETDEWEB)

    Rexroth, Paul E.; Malczynski, Leonard A.; Hendrickson, Gerald A.; Kobos, Peter Holmes; McNamara, Laura A.

    2004-09-01

    This study investigates the factors that lead countries into conflict. Specifically, political, social and economic factors may offer insight as to how prone a country (or set of countries) may be to inter-country or intra-country conflict. Largely methodological in scope, this study examines the literature for quantitative models that address or attempt to model conflict, both in the past and for future insight. The analysis concentrates specifically on the system dynamics paradigm, not the political science mainstream approaches of econometrics and game theory. The application of this paradigm builds upon the most sophisticated attempts at modeling conflict as a result of system-level interactions. This study presents the modeling efforts built on limited data and working literature paradigms, and recommendations for future attempts at modeling conflict.

  2. The Toy model: Understanding the early universe

    Science.gov (United States)

    Fisher, Peter H.; Price, Richard H.

    2018-04-01

    In many branches of science, progress is being made by taking advantage of insights from other branches of science. Cosmology, the structure and evolution of the universe, is certainly an area that is currently beset by problems in understanding. We show here that the scientific insights from the studies of early childhood development, in particular, those of Piaget, give a new way of looking at the early universe. This new approach can not only be invaluable in undergraduate teaching, but can even be the basis of semi-quantitative predictions.

  3. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  4. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    The simulation and optimization of an actual physical system are usually constructed based on stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification is proposed, based on a hierarchical model structure framework. The new modeling approach is based on a hierarchical model structure which includes the meta-meta model, the meta-model and the high-level model. A description logic system is defined for formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe the complex system more comprehensively, and that the survival probability of the target is higher when qualitative models are introduced into the quantitative simulation.

  5. A New Cosmological Model: Black Hole Universe

    Directory of Open Access Journals (Sweden)

    Zhang T. X.

    2009-07-01

    A new cosmological model called the black hole universe is proposed. According to this model, the universe originated from a hot star-like black hole with several solar masses, and gradually grew up through a supermassive black hole with billion solar masses to the present state with hundred billion-trillion solar masses by accreting ambient materials and merging with other black holes. The entire space is structured with infinite layers hierarchically. The innermost three layers are the universe in which we are living, the outside called the mother universe, and the inside star-like and supermassive black holes called child universes. The outermost layer is infinite in radius and limits to zero for both the mass density and absolute temperature. The relationships among all layers or universes can be connected by the universe family tree. Mathematically, the entire space can be represented as a set of all universes. A black hole universe is a subset of the entire space or a subspace. The child universes are null sets or empty spaces. All layers or universes are governed by the same physics - the Einstein general theory of relativity with the Robertson-Walker metric of spacetime - and tend to expand outward physically. The evolution of the space structure is iterative. When one universe expands out, a new similar universe grows up from its inside. The entire life of a universe begins from the birth as a hot star-like or supermassive black hole, passes through the growth and cools down, and expands to the death with infinitely large and zero mass density and absolute temperature. The black hole universe model is consistent with the Mach principle, the observations of the universe, and the Einstein general theory of relativity. Its various aspects can be understood with the well-developed physics without any difficulty. The dark energy is not required for the universe to accelerate its expansion. The inflation is not necessary because the black hole universe

  6. The thermal evolution of universe: standard model

    International Nuclear Information System (INIS)

    Nascimento, L.C.S. do.

    1975-08-01

    A description is given of the dynamical evolution of the Universe, following a model based on the theory of General Relativity. The model adopts the Cosmological Principle, the Principle of Equivalence and the Robertson-Walker metric (of which an original derivation is presented). In this model the universe is treated as a perfect fluid, ideal and symmetric with respect to the number of particles and antiparticles. The thermodynamic relations following from these hypotheses are derived, and from them the several eras of the thermal evolution of the universe are established. Finally, the problems arising from certain specific predictions of the model are studied, and the predicted abundances of the elements from nucleosynthesis and the present behavior of the universe are analysed in detail. (author)

  7. Reuleaux models at St. Petersburg State University

    Science.gov (United States)

    Kuteeva, G. A.; Sinilshchikova, G. A.; Trifonenko, B. V.

    2018-05-01

    Franz Reuleaux (1829 - 1905) was a famous mechanical engineer and a Professor of the Berlin Royal Technical Academy. He became widely known as an engineer-scientist, a professor and industrial consultant, education reformer and leader of the technical elite of Germany. He directed the design and manufacture of over 300 models of simple mechanisms, which were sold to many famous universities for pedagogical and scientific purposes. Today, the most complete set is at Cornell University, College of Engineering. In this article we discuss the history, the modern state and our use, for educational purposes, of the Reuleaux models that survive at St. Petersburg State University. We present descriptions of certain models and our electronic resource featuring these models, and provide information on similar electronic resources from other universities.

  8. Developing a Model for Assessing Public Culture Indicators at Universities

    Directory of Open Access Journals (Sweden)

    Meisam Latifi

    2015-06-01

    The present study aims to develop a model for assessing public culture at universities and to evaluate its indicators at public universities in Mashhad. The research follows an exploratory mixed approach. The research strategies in the qualitative and quantitative sections are thematic network analysis and the descriptive-survey method, respectively. In the qualitative section, document analysis and semi-structured interviews with cultural experts are used as research tools, with targeted sampling. In the quantitative section, a questionnaire developed from the findings of the qualitative section is used as the research tool. The research population of the quantitative section consists of all students admitted to public universities in Mashhad between 2009 and 2012. The sample size was calculated according to Cochran's formula, and stratified sampling was used to select the sample. The results of the qualitative section led to the identification of 44 basic themes, referred to as micro indicators. These themes were clustered into similar groups, from which 10 organizer themes were identified and recognized as macro indicators. In the next phase, the importance factor of each indicator is determined according to the AHP method. The results of the quantitative assessment of indicators at public universities of Mashhad show that the overall cultural index declines during the years a student attends the university. Additionally, the highest correlation exists between national identity and revolutionary identity. The only negative correlations are observed between family and two indicators, social capital and cultural consumption. The results of the present study can be used to assess the state of public culture among university students and can also serve as a basis for cultural planning.
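
    A minimal sketch of the AHP weighting step mentioned above, assuming a hypothetical 3x3 pairwise comparison matrix in place of the study's full indicator set; weights come from the principal eigenvector:

      # Sketch: AHP indicator weights from the principal eigenvector of a
      # reciprocal pairwise comparison matrix (hypothetical 3x3 example).
      import numpy as np

      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      vals, vecs = np.linalg.eig(A)
      k = int(np.argmax(vals.real))
      w = np.abs(vecs[:, k].real)
      w /= w.sum()                                  # normalized importance weights
      ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
      print(w, "CI =", round(ci, 3))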

  9. A universal simulator for ecological models

    DEFF Research Database (Denmark)

    Holst, Niels

    2013-01-01

    Software design is an often neglected issue in ecological models, even though bad software design often becomes a hindrance for re-using, sharing and even grasping an ecological model. In this paper, the methodology of agile software design was applied to the domain of ecological models. Thus the principles for a universal design of ecological models were arrived at. To exemplify this design, the open-source software Universal Simulator was constructed using C++ and XML and is provided as a resource for inspiration.

  10. The Loyalty Model of Private University Student

    Directory of Open Access Journals (Sweden)

    Leonnard

    2014-04-01

    This study investigates a loyalty model for private university students, using STIKOM London School of Public Relations as a case study. The model is examined from the perspectives of service quality, college image, price, trust and satisfaction. The objective of this study is to examine and analyze the effect of service quality, college image, tuition fee, trust and satisfaction on students' loyalty; the effect of service quality, college image, price and satisfaction on trust; and the effect of service quality, college image and price on satisfaction. The study used survey methodology with a causal design. The sample consists of 320 college students, and data were gathered with a Likert-scale questionnaire. The data were analyzed using a Structural Equation Model (SEM) approach. The implication of this study is a full contextual description of the loyalty model in private universities, giving an integrated and innovative contribution to student loyalty modeling in private universities.

  12. Faster universal modeling for two source classes

    NARCIS (Netherlands)

    Nowbakht, A.; Willems, F.M.J.; Macq, B.; Quisquater, J.-J.

    2002-01-01

    The Universal Modeling algorithms proposed in [2] for two general classes of finite-context sources are reviewed. The above methods were constructed by viewing a model structure as a partition of the context space and realizing that a partition can be reached through successive splits. Here we start

  13. On the universality of the attribution-affect model of helping.

    Science.gov (United States)

    Reisenzein, Rainer

    2015-08-01

    Although Pilati et al.'s (2014) findings question the strong quantitative universality of the attribution-affect model of helping, they are consistent with a weak form of quantitative universality, as well as with the qualitative universality of the theory. However, universality is put into question by previous studies revealing significant and sizeable between-study differences in the strength of the causal paths postulated by the theory. These differences may in part reflect differences in the type of helping situations studied. © 2015 International Union of Psychological Science.

  14. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involves image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which requires manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we propose two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface, respectively. The experimental results show that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.
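
    A hedged sketch of a sphere-distance-deviation style metric, assuming the measure reduces to radial deviations from a least-squares sphere fit (the paper's exact definition may differ); the points below are synthetic:

      # Sketch: fit a sphere to surface points by linear least squares and
      # report radial deviations. Synthetic data, not MRI-derived surfaces.
      import numpy as np

      rng = np.random.default_rng(1)
      pts = rng.normal(size=(500, 3))
      pts = 12.0 * pts / np.linalg.norm(pts, axis=1, keepdims=True)  # ~12 mm globe
      pts += rng.normal(scale=0.1, size=pts.shape)                   # surface noise

      # |p|^2 = 2 c.p + (r^2 - |c|^2) is linear in (c, d); solve for both.
      A = np.column_stack([2 * pts, np.ones(len(pts))])
      b = (pts ** 2).sum(axis=1)
      sol, *_ = np.linalg.lstsq(A, b, rcond=None)
      c, r = sol[:3], np.sqrt(sol[3] + sol[:3] @ sol[:3])

      dev = np.linalg.norm(pts - c, axis=1) - r    # signed radial deviations
      print("RMS deviation (mm):", float(np.sqrt((dev ** 2).mean())))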

  15. Modeling with Young Students--Quantitative and Qualitative.

    Science.gov (United States)

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate the quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  16. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of the occupational risk of a worker exposed to a single hazard is presented. The model connects the working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury and death. Working conditions and the safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: a) the number of accidents observed over a period of time and b) assessment of exposure data for activities and working conditions over the same period of time and the same working population. The effectiveness of risk-reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • An influence diagram connects working conditions, worker behaviour and safety barriers. • Necessary data include the number of accidents and the total exposure of workers. • Effectiveness of risk-reducing measures is quantified through the impact on the risk. • An example illustrates the methodology.
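
    A minimal numerical sketch of the quantification step described above may help; the accident counts, exposure and barrier factor below are invented for illustration, and the influence-diagram structure connecting working conditions, behaviour and barriers is not reproduced:

      # Sketch: single-hazard occupational risk from observed accidents and exposure.
      # All numbers are hypothetical; the model described above additionally
      # conditions these rates on working conditions, worker behaviour and
      # safety barriers via an influence diagram.
      accidents = {"recoverable": 40, "permanent": 8, "death": 2}  # observed counts
      exposure_hours = 2.0e6          # total worker exposure over the same period

      # Baseline accident rate per consequence type (events per exposure hour).
      baseline = {c: n / exposure_hours for c, n in accidents.items()}

      # A risk-reducing measure modelled as a multiplicative factor on the rate
      # (here a barrier assumed to remove 30% of events).
      barrier_factor = 0.7

      for consequence, rate in baseline.items():
          print(f"{consequence:>11}: {rate:.2e}/h baseline, "
                f"{rate * barrier_factor:.2e}/h with barrier")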

  17. A universe model confronted to observations

    International Nuclear Information System (INIS)

    Souriau, J.M.

    1982-09-01

    The present work is a detailed study of a Universe model elaborated in several steps, and of some of its consequences. An absence zone in the quasar spatial distribution is first described, and it is shown to be sufficient to determine a cosmological model. Each following section is concerned with one type of observation, which is confronted with the model: the age and density of the Universe, the redshift-luminosity relation for galaxies and quasars, the diameter-redshift relation for radio sources, the isotropy of the 3 K radiation, and the physics of the matter-antimatter contact zone. A possible stratification of the Universe parallel to this zone is studied in more detail; absorption lines in quasar spectra are interpreted in this way, as are the local supercluster and the local group of galaxies, the orientation of galaxy HI regions, and finally the kinematics of neighbouring galaxies [fr

  18. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the space station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. This report further calls for systematic scientific investigations of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole to an individual enclosure or workstation. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the University of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  19. Quantitative Literacy at Michigan State University, 3: Designing General Education Mathematics Courses

    Directory of Open Access Journals (Sweden)

    Samuel L. Tunstall

    2016-07-01

    In this paper, we describe the process at Michigan State University whereby we have created two courses, Math 101 and 102, designed to foster numeracy and alleviate mathematics anxiety. The courses, which are not sequential, provide a means of satisfying the University's general education requirement without taking college algebra or calculus, among other options. They are context-driven and broken into modules such as "The World and Its People" and "Health and Risk." They have been highly successful thus far, with students providing positive feedback on their interest in the material and the utility they see in it for their daily lives. We include background on the courses' history, their current status, and present and future challenges, ending with suggestions for others as they attempt to implement quantitative literacy courses at their own institutions.

  20. The universal function in color dipole model

    Science.gov (United States)

    Jalilian, Z.; Boroun, G. R.

    2017-10-01

    In this work we review the color dipole model and recall properties of saturation and geometrical scaling in this model. Our primary aim is to determine the exact universal function in terms of the introduced scaling variable at distances different from the saturation radius. Including the quark mass in the calculation, we numerically compute the contribution of heavy-quark production at small x to the total structure function as a fraction of the universal functions, and show that geometrical scaling holds for the scaling variable introduced in this study.
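
    For orientation, geometrical scaling in the color dipole picture is commonly written in the standard Golec-Biernat–Wüsthoff form below; the exact scaling variable used by the authors may differ:

      \sigma^{\gamma^{*}p}(x,Q^{2}) \;=\; \sigma(\tau), \qquad
      \tau \;=\; Q^{2}\,R_{0}^{2}(x), \qquad
      R_{0}^{2}(x) \;=\; \frac{1}{Q_{0}^{2}}\left(\frac{x}{x_{0}}\right)^{\lambda}

    Data taken at different x and Q^2 then collapse onto a single curve when plotted against τ, with the saturation radius R_0(x) separating the dilute and saturated regimes.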

  1. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  2. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  3. Enhanced Tunnelling Models for Child Universe Formation

    CERN Document Server

    Ansoldi, S; Shilon, I

    2015-01-01

    Starting from a recently proposed model that allows for an enhanced rate of child universe production under generic conditions, we elaborate on refinements that may allow for non-singular initial configurations. A possibility to treat both the initial state and the tunnelling beyond the semiclassical level is also introduced.

  4. Modeling Environmental Literacy of University Students

    Science.gov (United States)

    Teksoz, Gaye; Sahin, Elvan; Tekkaya-Oztekin, Ceren

    2012-01-01

    The present study proposed an Environmental Literacy Components Model to explain how environmental attitudes, environmental responsibility, environmental concern, and environmental knowledge as well as outdoor activities related to each other. A total of 1,345 university students responded to an environmental literacy survey (Kaplowitz and Levine…

  5. Hidden Markov Model for quantitative prediction of snowfall

    Indian Academy of Sciences (India)

    A Hidden Markov Model (HMM) has been developed for the prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine daily recorded meteorological variables from the past 20 winters (1992–2012). There are six ...
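
    A minimal sketch of the machinery behind such a model is the forward algorithm for a two-state HMM; the states, observations and all parameters below are invented, whereas the model described above works with nine meteorological variables:

      import numpy as np

      # Hypothetical two-state HMM (0 = "no snow", 1 = "snow") with discrete
      # observations (0 = dry day, 1 = precipitation day). Parameters invented.
      pi = np.array([0.8, 0.2])              # initial state probabilities
      A = np.array([[0.9, 0.1],              # state transition matrix
                    [0.4, 0.6]])
      B = np.array([[0.95, 0.05],            # emission probabilities per state
                    [0.30, 0.70]])

      def forward(obs):
          """Return P(observation sequence) via the forward algorithm."""
          alpha = pi * B[:, obs[0]]
          for o in obs[1:]:
              alpha = (alpha @ A) * B[:, o]
          return alpha.sum()

      print(forward([0, 1, 1, 0]))           # likelihood of a short sequence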

  6. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  7. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  8. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modeled PSF kernels. We focused on quantitation of both SUV_mean and SUV_max, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUV_mean bias in small tumours. Overall, the results indicate that exactly matched PSF…
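
    A toy 1-D stand-in for the study design (not the OS-EM reconstruction framework of the paper) can illustrate the effect of under- and over-estimated PSF kernels; the profile, widths and iteration count below are all assumptions:

      import numpy as np

      def gaussian_kernel(sigma, radius=12):
          x = np.arange(-radius, radius + 1)
          k = np.exp(-0.5 * (x / sigma) ** 2)
          return k / k.sum()

      def richardson_lucy(data, psf, n_iter=30):
          # Simple Richardson-Lucy deconvolution with the modelled PSF.
          est = np.full_like(data, data.mean())
          for _ in range(n_iter):
              conv = np.convolve(est, psf, mode="same")
              est *= np.convolve(data / np.maximum(conv, 1e-12),
                                 psf[::-1], mode="same")
          return est

      profile = np.ones(200)
      profile[90:110] = 4.0                        # "tumour" of uptake 4 on background 1
      data = np.convolve(profile, gaussian_kernel(3.0), mode="same")  # true PSF blur

      for sigma in (2.0, 3.0, 4.0):                # under-, matched, over-estimated PSF
          rec = richardson_lucy(data, gaussian_kernel(sigma))
          crc = (rec[90:110].mean() - 1.0) / 3.0   # contrast recovery coefficient
          print(f"modelled sigma = {sigma}: CRC = {crc:.2f}")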

  9. The Rise and Fall of Modern Greek in Australia's Universities: What Can a Quantitative Analysis Tell Us?

    Science.gov (United States)

    Hajek, John; Nicholas, Nick

    2004-01-01

    In this article, we look at the state of Modern Greek in Australian universities, focusing on quantitative analysis of its rise and fall in the relatively short period of 35 years since it was first taught as a university subject in Australia. We consider the possible reasons behind this trajectory, in particular correlations with changing…

  10. Universality in generalized models of inflation

    Energy Technology Data Exchange (ETDEWEB)

    Binétruy, P.; Pieroni, M. [AstroParticule et Cosmologie, Université Paris Diderot, CNRS, CEA, Observatoire de Paris, Sorbonne Paris Cité, 10, rue Alice Domon et Léonie Duquet, F-75205 Paris Cedex 13 (France); Mabillard, J., E-mail: pierre.binetruy@apc.univ-paris7.fr, E-mail: joel.mabillard@ed.ac.uk, E-mail: mauro.pieroni@apc.in2p3.fr [School of Physics and Astronomy, University of Edinburgh, Edinburgh, EH9 3JZ (United Kingdom)

    2017-03-01

    We discuss the cosmological evolution of a scalar field with a non-standard kinetic term in terms of a Renormalization Group Equation (RGE). In this framework inflation corresponds to the slow evolution in a neighborhood of a fixed point, and universality classes for inflationary models naturally arise. Using some examples we show the application of the formalism. The predicted values for the speed of sound c_s^2 and for the amount of non-Gaussianities produced in these models are discussed. In particular, we show that it is possible to introduce models with c_s^2 ≠ 1 that can be in agreement with present cosmological observations.

  11. Portable University Model of the Atmosphere (PUMA)

    Energy Technology Data Exchange (ETDEWEB)

    Fraedrich, K.; Kirk, E.; Lunkeit, F. [Hamburg Univ. (Germany). Meteorologisches Inst.

    1998-10-01

    The Portable University Model of the Atmosphere (PUMA) is based on the Reading multi-level spectral model SGCM (Simple Global Circulation Model) described by Hoskins and Simmons (1975) and James and Gray (1986). Originally developed as a numerical prediction model, it was changed to perform as a circulation model. For example, James and Gray (1986) studied the influence of surface friction on the circulation of a baroclinic atmosphere, James and James (1992) and James et al. (1994) investigated ultra-low-frequency variability, and Mole and James (1990) analyzed baroclinic adjustment in the context of a zonally varying flow. Frisius et al. (1998) simulated an idealized storm track by embedding a dipole structure in a zonally symmetric forcing field, and Lunkeit et al. (1998) investigated the sensitivity of GCM (General Circulation Model) scenarios with an adaptation technique applicable to SGCMs. (orig.)

  12. Evaluation of recent quantitative magnetospheric magnetic field models

    International Nuclear Information System (INIS)

    Walker, R.J.

    1976-01-01

    Recent quantitative magnetospheric field models contain many features not found in earlier models. Magnetopause models which include the effects of the dipole tilt were presented. More realistic models of the tail field include tail currents which close on the magnetopause, cross-tail currents of finite thickness, and cross-tail current models which model the position of the neutral sheet as a function of tilt. Finally, models have attempted to calculate the field of currents distributed in the inner magnetosphere. As the purpose of a magnetospheric model is to provide a mathematical description of the field that reasonably reproduces the observed magnetospheric field, several recent models were compared with the observed ΔB (B_observed − B_main-field) contours. Models containing only contributions from magnetopause and tail current systems are able to reproduce the observed quiet time field only in an extremely qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. However, the distributed current models are valid only for zero tilt. Even the models which reproduce the average observed field reasonably well may not give physically reasonable field gradients. Three of the models evaluated contain regions in the near tail in which the field gradient reverses direction. One region in which all the models fall short is that around the polar cusp, though most can be used to calculate the position of the last closed field line reasonably well

  13. Creating a Universe, a Conceptual Model

    Directory of Open Access Journals (Sweden)

    James R. Johnson

    2016-10-01

    Space is something. Space inherently contains laws of nature: universal rules (mathematics, space dimensions, types of forces, types of fields, and particle species), laws (relativity, quantum mechanics, thermodynamics, and electromagnetism) and symmetries (Lorentz, Gauge, and symmetry breaking). We have significant knowledge about these laws of nature because all our scientific theories assume their presence. Their existence is critical for developing either a unique theory of our universe or more speculative multiverse theories. Scientists generally ignore the laws of nature because they “are what they are” and because visualizing different laws of nature challenges the imagination. This article defines a conceptual model separating space (laws of nature) from the universe’s energy source (initial conditions) and expansion (big bang). By considering the ramifications of changing the laws of nature, the initial condition parameters, and two variables in the big bang theory, the model demonstrates that traditional fine tuning is not the whole story when creating a universe. Supporting the model, space and “nothing” are related to the laws of nature, mathematics and multiverse possibilities. Speculation on the beginning of time completes the model.

  14. Is the island universe model consistent with observations?

    OpenAIRE

    Piao, Yun-Song

    2005-01-01

    We study the island universe model, in which the universe is initially in a cosmological constant sea; local quantum fluctuations violating the null energy condition then create islands of matter, some of which might correspond to our observable universe. We examine the possibility that the island universe model can be regarded as an alternative scenario for the origin of the observable universe.

  15. Baryogenesis model predicting antimatter in the Universe

    International Nuclear Information System (INIS)

    Kirilova, D.

    2003-01-01

    Cosmic ray and gamma-ray data do not rule out antimatter domains in the Universe, separated from us by distances greater than 10 Mpc. Hence, it is interesting to analyze the possible generation of vast antimatter structures during the early Universe's evolution. We discuss a SUSY-condensate baryogenesis model predicting large separated regions of matter and antimatter. The model provides generation of the small locally observed baryon asymmetry for natural initial conditions; it predicts vast antimatter domains, separated from the matter ones by baryonically empty voids. The characteristic scale of the antimatter regions and their distance from the matter ones are in accordance with observational constraints from cosmic ray, gamma-ray and cosmic microwave background anisotropy data

  16. Variable selection in near infrared spectroscopy for quantitative models of homologous analogs of cephalosporins

    Directory of Open Access Journals (Sweden)

    Yan-Chun Feng

    2014-07-01

    Two universal spectral ranges (4550–4100 cm^-1 and 6190–5510 cm^-1) for the construction of quantitative models of homologous analogs of cephalosporins were proposed by evaluating the performance of five spectral ranges and their combinations, using three data sets of cephalosporins for injection, i.e., cefuroxime sodium, ceftriaxone sodium and cefoperazone sodium. Subsequently, the proposed ranges were validated using eight calibration sets of other homologous analogs of cephalosporins for injection, namely cefmenoxime hydrochloride, ceftezole sodium, cefmetazole, cefoxitin sodium, cefotaxime sodium, cefradine, cephazolin sodium and ceftizoxime sodium. All the quantitative models constructed for the eight kinds of cephalosporins using these universal ranges could fulfill the requirements for quick quantification. After that, the competitive adaptive reweighted sampling (CARS) algorithm and infrared (IR)–near infrared (NIR) two-dimensional (2D) correlation spectral analysis were used to determine the scientific basis of these two spectral ranges as universal regions for the construction of quantitative models of cephalosporins. The CARS algorithm demonstrated that the ranges of 4550–4100 cm^-1 and 6190–5510 cm^-1 include key wavenumbers that can be attributed to content changes of the cephalosporins. The IR–NIR 2D spectral analysis showed that certain wavenumbers in these two regions have strong correlations to the structures of those cephalosporins that are easy to degrade.
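
    A sketch of how such a range-restricted quantitative model can be calibrated, here with partial least squares on synthetic spectra (the wavenumber grid, band positions and content values are all invented):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      wn = np.arange(4000.0, 7000.0, 2.0)      # wavenumber axis in cm^-1, assumed

      # Synthetic spectra: a content-dependent band inside each proposed range.
      def band(center, width):
          return np.exp(-0.5 * ((wn - center) / width) ** 2)

      content = rng.uniform(90, 110, size=60)  # e.g. % of label claim, invented
      X = (np.outer(content, band(4300, 60) + band(5800, 80))
           + rng.normal(0, 0.05, (60, wn.size)))

      # Restrict the calibration to the two universal ranges proposed above.
      mask = ((wn >= 4100) & (wn <= 4550)) | ((wn >= 5510) & (wn <= 6190))
      pls = PLSRegression(n_components=3).fit(X[:, mask], content)
      print("R^2 on the calibration set:", pls.score(X[:, mask], content))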

  17. Performance Theories for Sentence Coding: Some Quantitative Models

    Science.gov (United States)

    Aaronson, Doris; And Others

    1977-01-01

    This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)

  18. Modeling Logistic Performance in Quantitative Microbial Risk Assessment

    NARCIS (Netherlands)

    Rijgersberg, H.; Tromp, S.O.; Jacxsens, L.; Uyttendaele, M.

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage

  19. Universality classes for models of inflation

    CERN Document Server

    Binetruy, P.; Mabillard, J.; Pieroni, M.; Rosset, C.

    2015-01-01

    We show that the cosmological evolution of a scalar field in a potential can be obtained from a renormalisation group equation. The slow-roll regime of inflation models is understood in this context as the slow evolution close to a fixed point, described by the methods of the renormalisation group. This explains in part the universality observed in the predictions of a certain number of inflation models. We illustrate this behavior on several examples and discuss it in the context of the AdS/CFT correspondence.

  20. The Monash University Interactive Simple Climate Model

    Science.gov (United States)

    Dommenget, D.

    2013-12-01

    The Monash University Interactive Simple Climate Model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The web interface gives access to the results of more than 2000 different model experiments in an interactive way, and it offers a number of tutorials on the interactions of physical processes in the climate system along with some puzzles to solve. By switching physical processes off and on, users can deconstruct the climate and learn how the different processes interact to generate the observed climate, and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities of teaching students with it are.
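
    GREB resolves the globe on a grid with many coupled processes; a zero-dimensional energy-balance sketch nonetheless shows the physical core such simple climate models build on (the heat capacity and effective emissivity below are rough textbook values, not GREB parameters):

      # Zero-dimensional energy balance: C dT/dt = S0*(1 - albedo)/4 - eps*sigma*T^4
      S0, albedo = 1361.0, 0.30        # solar constant (W/m^2) and planetary albedo
      eps, sigma = 0.62, 5.67e-8       # effective emissivity, Stefan-Boltzmann constant
      C = 4.0e8                        # mixed-layer heat capacity (J/m^2/K), assumed
      dt = 86400.0                     # one-day time step

      T = 250.0                        # start cold and integrate to equilibrium
      for _ in range(100 * 365):       # ~100 model years
          T += dt / C * (S0 * (1 - albedo) / 4 - eps * sigma * T ** 4)
      print(f"equilibrium surface temperature: {T:.1f} K")   # roughly 287 K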

  1. On the generation of a bubbly universe - A quantitative assessment of the CfA slice

    Science.gov (United States)

    Ostriker, J. P.; Strassler, M. J.

    1989-01-01

    A first attempt is made to calculate the properties of the matter distribution in a universe filled with overlapping bubbles produced by multiple explosions. Each spherical shell follows the cosmological Sedov-Taylor solution until it encounters another shell. Thereafter, mergers are allowed to occur in pairs on the basis of N-body results. At the final epoch, the matrix of overlapping shells is populated with 'galaxies', and the properties of slices through the numerically constructed cube compare well with CfA survey results for specified initial conditions. A statistic measuring the distribution of distances from uniformly distributed points to the nearest galaxies in the projected plane is found to provide a good measure of the bubbly character of the galaxy distribution. In a quantitative analysis of the CfA 'slice of the universe', a very good match is found between simulation and the real data for final average bubble radii of (13.5 ± 1.5)/h Mpc with formal filling factor 1.0-1.5, or an actual filling factor of 65-80 percent.

  2. Generation of a bubbly universe - a quantitative assessment of the CfA slice

    International Nuclear Information System (INIS)

    Ostriker, J.P.; Strassler, M.J.

    1989-01-01

    A first attempt is made to calculate the properties of the matter distribution in a universe filled with overlapping bubbles produced by multiple explosions. Each spherical shell follows the cosmological Sedov-Taylor solution until it encounters another shell. Thereafter, mergers are allowed to occur in pairs on the basis of N-body results. At the final epoch, the matrix of overlapping shells is populated with 'galaxies', and the properties of slices through the numerically constructed cube compare well with CfA survey results for specified initial conditions. A statistic measuring the distribution of distances from uniformly distributed points to the nearest galaxies in the projected plane is found to provide a good measure of the bubbly character of the galaxy distribution. In a quantitative analysis of the CfA 'slice of the universe', a very good match is found between simulation and the real data for final average bubble radii of (13.5 ± 1.5)/h Mpc with formal filling factor 1.0-1.5, or an actual filling factor of 65-80 percent. 25 references
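
    A sketch of an "empty-point" statistic of this kind (not the authors' exact construction): distances from uniformly distributed probe points to the nearest galaxy in a projected slice, compared between a random field and a toy shell-like field:

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(1)

      def nearest_distances(galaxies, n_probes=2000):
          probes = rng.uniform(0, 1, size=(n_probes, 2))
          d, _ = cKDTree(galaxies).query(probes)   # distance to nearest galaxy
          return d

      random_gals = rng.uniform(0, 1, size=(500, 2))

      # Toy "bubbly" field: galaxies on the rims of a few circles, mimicking
      # shell cross-sections in a slice (radius and counts are invented).
      centers = rng.uniform(0, 1, size=(12, 2))
      theta = rng.uniform(0, 2 * np.pi, size=500)
      idx = rng.integers(0, 12, size=500)
      bubbly_gals = centers[idx] + 0.12 * np.c_[np.cos(theta), np.sin(theta)]

      for name, g in [("random", random_gals), ("bubbly", bubbly_gals)]:
          d = nearest_distances(g)
          print(f"{name:>6}: mean = {d.mean():.3f}, "
                f"95th percentile = {np.percentile(d, 95):.3f}")

    The heavier tail of large probe-to-galaxy distances in the shell-like field is what flags the bubbly character of the distribution.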

  3. Polymorphic ethyl alcohol as a model system for the quantitative study of glassy behaviour

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H E; Schober, H; Gonzalez, M A [Institut Max von Laue - Paul Langevin (ILL), 38 - Grenoble (France); Bermejo, F J; Fayos, R; Dawidowski, J [Consejo Superior de Investigaciones Cientificas, Madrid (Spain); Ramos, M A; Vieira, S [Universidad Autonoma de Madrid (Spain)

    1997-04-01

    The nearly universal transport and dynamical properties of amorphous materials, or glasses, are investigated. Reasonably successful phenomenological models have been developed to account for these properties as well as the behaviour near the glass transition, but quantitative microscopic models have had limited success. One hindrance to these investigations has been the lack of a material which exhibits glass-like properties in more than one phase at a given temperature. This report presents results of neutron-scattering experiments on one such material, ordinary ethyl alcohol, which promises to be a model system for future investigations of glassy behaviour. (author). 8 refs.

  4. A universal model of giftedness - adaptation of the Munich Model

    NARCIS (Netherlands)

    Jessurun, J.H.; Shearer, C.B.; Weggeman, M.C.D.P.

    2016-01-01

    The Munich Model of Giftedness (MMG) by Heller and his colleagues, developed for the identification of gifted children, is adapted and expanded, with the aim of making it more universally usable as a model for the pathway from talents to performance. On the side of the talent-factors, the concept of

  5. Baryon asymmetry of the Universe in the standard model

    International Nuclear Information System (INIS)

    Farrar, G.R.; Shaposhnikov, M.E.

    1994-01-01

    We study the interactions of quarks and antiquarks with the changing Higgs field during the electroweak phase transition, including quantum mechanical and some thermal effects, with the only source of CP violation being the known CKM phase. We show that the GIM cancellation, which has been commonly thought to imply a prediction which is at least 10 orders of magnitude too small, can be evaded in certain kinematic regimes, for instance, when the strange quark is totally reflected but the down quark is not. We report on a quantitative calculation of the asymmetry in a one-dimensional approximation based on the present understanding of the physics of the high-temperature environment, but with some aspects of the problem oversimplified. The resulting prediction for the magnitude and sign of the present baryonic asymmetry of the Universe agrees with the observed value, with moderately optimistic assumptions about the dynamics of the phase transition. Both magnitude and sign of the asymmetry have an intricate dependence on quark masses and mixings, so that quantitative agreement between prediction and observation would be highly nontrivial. At present uncertainties related to the dynamics of the EW phase transition and the oversimplifications of our treatment are too great to decide whether or not this is the correct explanation for the presence of remnant matter in our Universe; however, the present work makes it clear that the minimal standard model cannot be discounted as a contender for explaining this phenomenon

  6. EXPENSES FORECASTING MODEL IN UNIVERSITY PROJECTS PLANNING

    Directory of Open Access Journals (Sweden)

    Sergei A. Arustamov

    2016-11-01

    The paper presents a mathematical model of cash flows in project funding. We describe the different types of expenses linked to university project activities and identify the project budgeting problems that contribute the most uncertainty. As an example of the model's implementation, we consider the calculation of vacation allowance expenses for project participants. We define the problems of forecasting the funds to be reserved: calculation based on the methodology established by the Ministry of Education and Science, calculation according to the vacation schedule, and prediction of the most probable amount. A stochastic model for vacation allowance expenses has been developed. We propose methods and solutions that increase the accuracy of forecasting funds reservation, based on 2015 data.

  7. Universality in a Neutral Evolution Model

    Science.gov (United States)

    King, Dawn; Scott, Adam; Maric, Nevena; Bahar, Sonya

    2013-03-01

    Agent-based models are ideal for investigating the complex problems of biodiversity and speciation because they allow for complex interactions between individuals and between individuals and the environment. Presented here is a "null" model that investigates three mating types - assortative, bacterial, and random - in phenotype space, as a function of the percentage of random death δ. Previous work has shown phase transition behavior in an assortative mating model with variable fitness landscapes as the maximum mutation size (μ) was varied (Dees and Bahar, 2010). Similarly, this behavior was recently presented in the work of Scott et al. (submitted), on a completely neutral landscape, for bacterial-like fission as well as for assortative mating. Here, in order to achieve an appropriate "null" hypothesis, the random death process was changed so each individual, in each generation, has the same probability of death. Results show a continuous nonequilibrium phase transition for the order parameters of the population size and the number of clusters (analogue of species) as δ is varied for three different mutation sizes of the system. The system shows increasing robustness as μ increases. Universality classes and percolation properties of this system are also explored. This research was supported by funding from: University of Missouri Research Board and James S. McDonnell Foundation

  8. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  9. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to detect welded joints. However, load levels, the environmental magnetic field, and measurement noise make the MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens are tested along the longitudinal and horizontal lines by a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing is carried out synchronously to verify the MMM results. It is found that MMM testing can detect hidden cracks earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs is investigated, which shows that K_vs obeys a Gaussian distribution. So K_vs is the suitable MMM parameter to establish a reliability model of welded joints. At last, an original quantitative MMM reliability model is first presented based on the improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with the decrease of the residual life ratio T, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. This presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.
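
    The classical Gaussian stress-strength interference computation underlying such reliability models is compact; the means and standard deviations below are invented stand-ins, since the paper's improved interference model and its K_vs-based parameters are not reproduced here:

      from math import sqrt
      from statistics import NormalDist

      # Reliability = P(strength > stress) for independent Gaussian variables.
      mu_stress, sd_stress = 120.0, 15.0        # e.g. a K_vs-derived damage load
      mu_strength, sd_strength = 180.0, 20.0    # e.g. critical value at failure

      z = (mu_strength - mu_stress) / sqrt(sd_stress**2 + sd_strength**2)
      R = NormalDist().cdf(z)
      print(f"reliability R = {R:.4f}")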

  10. A Review on Quantitative Models for Sustainable Food Logistics Management

    Directory of Open Access Journals (Sweden)

    M. Soysal

    2012-12-01

    Over the last two decades, food logistics systems have seen the transition from a focus on traditional supply chain management to food supply chain management and, successively, to sustainable food supply chain management. The main aim of this study is to identify the key logistical aims in these three phases and analyse currently available quantitative models to point out modelling challenges in sustainable food logistics management (SFLM). A literature review on quantitative studies is conducted, and qualitative studies are also consulted to understand the key logistical aims more clearly and to identify relevant system scope issues. Results show that research on SFLM has been progressively developing according to the needs of the food industry. However, the intrinsic characteristics of food products and processes have not yet been handled properly in the identified studies. The majority of the works reviewed have not contemplated sustainability problems, apart from a few recent studies. Therefore, the study concludes that new and advanced quantitative models are needed that take specific SFLM requirements from practice into consideration to support business decisions and capture food supply chain dynamics.

  11. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Background: Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible, but the collective contribution of all loci is usually significant. Genome selection, which uses markers of the entire genome to predict the genomic values of individual plants or animals, can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is contributed by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results: In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions: This study provides an excellent example of the application of genome selection to plant breeding.
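
    A sketch of genomic prediction with and without epistatic terms, in the spirit of the study above; the marker matrix, effect sizes and ridge penalty are invented, while the 126 lines and 80 markers mirror the reported data set:

      import numpy as np
      from itertools import combinations
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(2)
      n_lines, n_markers = 126, 80
      M = rng.choice([-1.0, 1.0], size=(n_lines, n_markers))   # marker codes

      # Phenotype driven by two additive and one epistatic effect, plus noise.
      y = (1.5 * M[:, 3] - 1.0 * M[:, 17]
           + 2.0 * M[:, 5] * M[:, 40] + rng.normal(0, 1.0, n_lines))

      # All pairwise marker products serve as epistatic predictors.
      pairs = np.column_stack([M[:, i] * M[:, j]
                               for i, j in combinations(range(n_markers), 2)])

      for name, X in [("additive only", M),
                      ("additive + epistatic", np.hstack([M, pairs]))]:
          yhat = cross_val_predict(Ridge(alpha=10.0), X, y, cv=5)
          r2 = np.corrcoef(y, yhat)[0, 1] ** 2
          print(f"{name:>22}: squared correlation = {r2:.2f}")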

  12. [The strategic research areas of a University Hospital: proposal of a quali-quantitative method].

    Science.gov (United States)

    Iezzi, Elisa; Ardissino, Diego; Ferrari, Carlo; Vitale, Marco; Caminiti, Caterina

    2018-02-01

    This work aimed to objectively identify the main research areas at the University Hospital of Parma. To this end, a multidisciplinary working group, comprising clinicians, researchers, and hospital management, was formed to develop a shared quali-quantitative method. Easily retrievable performance indicators were selected from the literature (concerning bibliometric data and grant acquisition), and a scoring system was developed to assign weights to each indicator. Subsequently, Research Team Leaders were identified from the hospital's "Research Plan", a document produced every three years which contains information, provided by the health care professionals themselves, on the main research themes carried out in each Department, the staff involved and the available resources. The selected performance indicators were measured for each Team Leader and scores assigned, thus creating a ranking list. Through the analysis of the research themes of the top Team Leaders, the Working Group identified the following five strategic research areas: (a) personalized treatment in oncology and hematology; (b) chronicization mechanisms in immunomediate diseases; (c) old and new risk factors for cardiovascular diseases; (d) nutritional disorders, metabolic and chronic-degenerative diseases; (e) molecular diagnostic and predictive markers. We have developed an objective method to identify a hospital's main research areas. Its application can guide resource allocation and can offer ways to value the work of professionals involved in research.

  13. Reducing Math Anxiety: Findings from Incorporating Service Learning into a Quantitative Reasoning Course at Seattle University

    Directory of Open Access Journals (Sweden)

    Allison Henrich

    2011-07-01

    How might one teach mathematics to math-anxious students and at the same time reduce their math anxiety? This paper describes what we found when we incorporated a service learning component into a quantitative reasoning course at Seattle University in Fall 2010 (20 students) and Spring 2011 (28 students). The course is taken primarily by humanities majors, many of whom would not take a course in math if they did not need to satisfy the university's core requirement. For the service learning component, each student met with and tutored children at local schools for 1-2 hours per week (about 15 service hours in total), kept a weekly journal reflecting on the experience, and wrote a five-page final paper on the importance and reasonable expectations of mathematics literacy. The autobiographies, self-descriptions at the beginning of the class, focus group interviews at the end of the term, journal entries, final essays, and student evaluations indicated that the students gained confidence in their mathematical abilities, a greater interest in mathematics, and a broader sense of the importance of math literacy in modern society. One notable finding was that students discovered that the act of manufacturing enthusiasm about math as a tool for tutoring the children made them more enthusiastic about math in their own courses.

  14. A fractal model of the Universe

    Science.gov (United States)

    Gottlieb, Ioan

    The book represents a revised, extended, completed and translated version of the book "Superposed Universes. A scientific novel and a SF story" (1995). The book contains a hypothesis by the author concerning the complexity of Nature. An introduction to the theories of numbers, manifolds and topology is given. The possible connection with the theory of the evolution of the Universe is discussed. The last chapter of the book also contains a SF story based on the hypothesis presented. A connection with the theory of fractals is given. A part of his earlier studies (1955-1956) was subsequently published without citation by Ali Kyrala (Phys. Rev. vol. 117, No. 5, March 1, 1960). The book contains as an important appendix the early papers (some of which were published in co-authorship with his scientific advisors): 1) T.T. Vescan, A. Weiszmann and I. Gottlieb, Contributii la studiul problemelor geometrice ale teoriei relativitatii restranse. Academia R.P.R. Baza Timisoara. Lucrarile consfatuirii de geometrie diferentiala din 9-12 iunie 1955. In this paper the authors show a new method for the calculation of the metrics. 2) Jean Gottlieb, L'hyphotese d'un modele de la structure de la matiere, Revista Matematica y Fisica Teorica, Serie A, Volumen XV, No. 1 y 2, 1964. 3) I. Gottlieb, Some hypotheses on space, time and gravitation, Studies in Gravitation Theory, CIP Press, Bucharest, 1988, pp. 227-234; as well as some recent papers (published in co-authorship with his disciples): 4) M. Agop, Gottlieb space-time. A fractal axiomatic model of the Universe, in Particles and Fields, Editors: M. Agop and P.D. Ioannou, Athens University Press, 2005, pp. 59-141. 5) I. Gottlieb, M. Agop and V. Enache, Games with Cantor's dust, Chaos, Solitons and Fractals, vol. 40 (2009), pp. 940-945. 6) I. Gottlieb, My picture over the World, Bull. of the Polytechnic Institute of Iasi, Tom LVI(LX), Fasc. 1, 2010, pp. 1-18. The book also contains a dedication to father Vasile Gottlieb and wife Cleopatra.

  15. Universally sloppy parameter sensitivities in systems biology models.

    Directory of Open Access Journals (Sweden)

    Ryan N Gutenkunst

    2007-10-01

    Full Text Available Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.

  16. Universally sloppy parameter sensitivities in systems biology models.

    Science.gov (United States)

    Gutenkunst, Ryan N; Waterfall, Joshua J; Casey, Fergal P; Brown, Kevin S; Myers, Christopher R; Sethna, James P

    2007-10-01

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other experimental data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a "sloppy" spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our growth-factor-signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters.
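
    The sensitivity spectrum at issue can be computed directly as the eigenvalues of the Gauss-Newton Hessian J^T J of a least-squares fit; the toy sum-of-exponentials model below (a classic sloppy system) is an illustration, not one of the models surveyed:

      import numpy as np

      t = np.linspace(0.0, 5.0, 100)
      rates = np.array([0.3, 1.0, 3.0, 9.0])        # invented model parameters

      # Jacobian of y(t) = sum_k exp(-rate_k * t) with respect to the rates.
      J = np.stack([-t * np.exp(-k * t) for k in rates], axis=1)
      eigvals = np.linalg.eigvalsh(J.T @ J)[::-1]   # descending order
      print("eigenvalues:", eigvals)
      print("decades spanned:", np.log10(eigvals[0] / eigvals[-1]))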

  17. Wires in the soup: quantitative models of cell signaling

    Science.gov (United States)

    Cheong, Raymond; Levchenko, Andre

    2014-01-01

    Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, necessitating their unraveling by sophisticated computational modeling coupled with precise experimentation. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655

  18. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. The output power of the generator is measured in a wind tunnel at air velocities of up to 15 m s^-1. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is c_p = 0.15. The v^3 power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
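
    The reported numbers can be checked with the standard wind-power relation P_el = c_p * (1/2) * rho * A * v^3, assuming ordinary air density:

      from math import pi

      rho = 1.2      # air density in kg/m^3, assumed room conditions
      r = 0.06       # rotor radius in m (12 cm diameter)
      v = 15.0       # maximum wind-tunnel speed in m/s
      c_p = 0.15     # reported kinetic-to-electric conversion factor

      A = pi * r ** 2
      P_kin = 0.5 * rho * A * v ** 3
      print(f"kinetic power: {P_kin:.1f} W, electric: {c_p * P_kin:.2f} W")
      # -> about 22.9 W of kinetic power and 3.4 W electric, matching the
      #    reported maximum output.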

  19. Quantitative Systems Pharmacology: A Case for Disease Models.

    Science.gov (United States)

    Musante, C J; Ramanujan, S; Schmidt, B J; Ghobrial, O G; Lu, J; Heatherington, A C

    2017-01-01

    Quantitative systems pharmacology (QSP) has emerged as an innovative approach in model-informed drug discovery and development, supporting program decisions from exploratory research through late-stage clinical trials. In this commentary, we discuss the unique value of disease-scale "platform" QSP models that are amenable to reuse and repurposing to support diverse clinical decisions in ways distinct from other pharmacometrics strategies. © 2016 The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of The American Society for Clinical Pharmacology and Therapeutics.

  20. Quantitative model of New Zealand's energy supply industry

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B. R. [Victoria Univ., Wellington, (New Zealand); Lucas, P. D. [Ministry of Energy Resources (New Zealand)

    1977-10-15

    A mathematical model is presented to assist in an analysis of the energy policy options available. The model is based on an engineering-orientated description of New Zealand's energy supply and distribution system. The system is cast as a linear program in which energy demand is satisfied at least cost. The capacities and operating modes of process plant (such as power stations, oil refinery units, and LP-gas extraction plants) are determined by the model, as well as the optimal mix of fuels supplied to the final consumers. Policy analysis with the model enables a wide-ranging assessment of the alternatives and uncertainties within a consistent quantitative framework. It is intended that the model be used as a tool to investigate the relative effects of various policy options, rather than to present a definitive plan for satisfying the nation's energy requirements.
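
    A toy least-cost supply problem in the spirit of the linear program described above; the fuels, costs, capacities and demand are all invented:

      from scipy.optimize import linprog

      costs = [4.0, 7.0, 9.0]          # $/GJ for coal, gas, oil (hypothetical)
      capacity = [50.0, 30.0, 40.0]    # PJ upper bound per fuel
      demand = 80.0                    # PJ of final energy to supply

      # Minimize total cost subject to meeting demand within capacity limits.
      res = linprog(c=costs,
                    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],
                    bounds=list(zip([0.0] * 3, capacity)))
      print("fuel mix (PJ):", res.x, " total cost:", res.fun)

    Cheaper fuels fill to capacity first; policy options can then be compared by re-solving with altered costs or capacity limits.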

  1. Towards a universal model of reading.

    Science.gov (United States)

    Frost, Ram

    2012-10-01

    In the last decade, reading research has seen a paradigmatic shift. A new wave of computational models of orthographic processing that offer various forms of noisy position or context-sensitive coding have revolutionized the field of visual word recognition. The influx of such models stems mainly from consistent findings, coming mostly from European languages, regarding an apparent insensitivity of skilled readers to letter order. Underlying the current revolution is the theoretical assumption that the insensitivity of readers to letter order reflects the special way in which the human brain encodes the position of letters in printed words. The present article discusses the theoretical shortcomings and misconceptions of this approach to visual word recognition. A systematic review of data obtained from a variety of languages demonstrates that letter-order insensitivity is neither a general property of the cognitive system nor a property of the brain in encoding letters. Rather, it is a variant and idiosyncratic characteristic of some languages, mostly European, reflecting a strategy of optimizing encoding resources, given the specific structure of words. Since the main goal of reading research is to develop theories that describe the fundamental and invariant phenomena of reading across orthographies, an alternative approach to model visual word recognition is offered. The dimensions of a possible universal model of reading, which outlines the common cognitive operations involved in orthographic processing in all writing systems, are discussed.

  2. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process, it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI, to identify and analyse CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature, and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, where the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases; the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  3. Sustainable Education: Analyzing the Determinants of University Student Dropout by Nonlinear Panel Data Models

    Directory of Open Access Journals (Sweden)

    Donggeun Kim

    2018-03-01

    University dropout is a serious problem. It affects not only the individual who drops out but also the university and society. However, most previous studies have focused only on the subjective/individual level. University dropout is a very important issue in South Korea, but it has not received much research attention so far. This study examined the possible causes of university dropout in South Korea at the aggregate level, focusing on four fundamental categories: students, resources, faculty, and university characteristics. Three-year balanced panel data from 2013 to 2015 were constructed and estimated using nonlinear panel data models. The findings show that the cost and burden for students, financial resources, qualitative and quantitative features of the faculty, and the type and size of the university have significant effects on university dropout.

  4. A website evaluation model by integration of previous evaluation models using a quantitative approach

    Directory of Open Access Journals (Sweden)

    Ali Moeini

    2015-01-01

    Given the growth of e-commerce, websites play an essential role in business success. Therefore, many authors have offered website evaluation models since 1995. However, the multiplicity and diversity of evaluation models make it difficult to integrate them into a single comprehensive model. In this paper a quantitative method has been used to integrate previous models into a comprehensive model that is compatible with them. In this approach the researcher's judgment has no role in the integration of models, and the new model takes its validity from 93 previous models and a systematic quantitative approach.

  5. Hybrid Speaker Recognition Using Universal Acoustic Model

    Science.gov (United States)

    Nishimura, Jun; Kuroda, Tadahiro

    We propose a novel speaker recognition approach using a speaker-independent universal acoustic model (UAM) for sensornet applications. In sensornet applications such as “Business Microscope”, interactions among knowledge workers in an organization can be visualized by sensing face-to-face communication using wearable sensor nodes. In conventional studies, speakers are detected by comparing the energy of input speech signals among the nodes. However, there are often synchronization errors among the nodes, which degrade speaker recognition performance. By focusing on properties of the speaker's acoustic channel, the UAM provides robustness against such synchronization errors. The overall speaker recognition accuracy is improved by combining the UAM with the energy-based approach. For 0.1 s speech inputs and 4 subjects, a speaker recognition accuracy of 94% is achieved for synchronization errors of less than 100 ms.
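
    The abstract does not give the fusion rule; the following is a hypothetical sketch of the general idea of combining per-node energy scores with UAM channel likelihoods, with invented scores and an assumed linear weighting.

```python
# Illustrative sketch of the score-fusion idea: combine a per-node energy score
# with a channel-based UAM likelihood so that synchronization errors in the
# energy comparison can be compensated. Scores and the weight `alpha` are
# hypothetical; the paper's actual fusion rule may differ.
import numpy as np

def fuse_scores(energy_scores, uam_loglik, alpha=0.5):
    """Return the index of the detected speaker from two score vectors."""
    e = (energy_scores - energy_scores.mean()) / energy_scores.std()
    u = (uam_loglik - uam_loglik.mean()) / uam_loglik.std()
    return int(np.argmax(alpha * e + (1 - alpha) * u))

energy = np.array([3.1, 2.9, 0.4, 0.2])      # frame energy at each wearable node
uam = np.array([-12.0, -4.5, -20.1, -18.7])  # UAM channel log-likelihoods
print("detected speaker:", fuse_scores(energy, uam))
```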

  6. Universal free school breakfast: a qualitative model for breakfast behaviors

    Directory of Open Access Journals (Sweden)

    Louise Harvey-Golding

    2015-06-01

    Full Text Available In recent years the provision of school breakfast has increased significantly in the UK. However, research examining the effectiveness of school breakfast is still in its relative infancy, and findings to date have been rather mixed. Moreover, previous evaluations of school breakfast schemes have been predominantly quantitative in their methodologies. Presently there are few qualitative studies examining the subjective perceptions and experiences of stakeholders, and thereby an absence of knowledge regarding the sociocultural impacts of school breakfast. The purpose of this study was to investigate the beliefs, views, attitudes and breakfast consumption behaviors among key stakeholders served by a council-wide universal free school breakfast initiative in the North West of England, UK. A sample of children, parents and school staff were recruited from three primary schools participating in the universal free school breakfast scheme to partake in semi-structured interviews and small focus groups. A Grounded Theory analysis of the data collected identified a theoretical model of breakfast behaviors, underpinned by the subjective perceptions and experiences of these key stakeholders. The model comprises three domains relating to breakfast behaviors, and the internal and external factors that are perceived to influence breakfast behaviors, among children, parents and school staff. Findings were validated using triangulation methods, member checks and inter-rater reliability measures. In presenting this theoretically grounded model of breakfast behaviors, this paper provides a unique qualitative insight into breakfast consumption behaviors and barriers to breakfast consumption within a socioeconomically deprived community participating in a universal free school breakfast intervention program.

  7. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on CI, to look at the aims of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases. The output of one phase is the input of the next phase. Conclusion: The CI process is a cycle of interrelated phases. The output of one phase is the input of the next phase. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  8. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY
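
    As a rough sketch of the fixed-effect idea (assuming a simple OLS fit and a Legendre basis standing in for the paper's functional basis), one can expand genotype effects on a smooth basis over marker positions and jointly F-test the basis coefficients:

```python
# Sketch of the fixed-effect idea: expand genotype effects on a smooth basis
# over marker positions and F-test the basis coefficients jointly. A Legendre
# basis stands in for the paper's functional basis; data are simulated.
import numpy as np
from numpy.polynomial import legendre
from scipy import stats

rng = np.random.default_rng(1)
n, m, k = 500, 40, 4                      # subjects, variants, basis functions
pos = np.sort(rng.uniform(-1, 1, m))      # marker positions scaled to [-1, 1]
G = rng.binomial(2, 0.05, (n, m))         # rare-variant genotype matrix
B = legendre.legvander(pos, k - 1)        # m x k basis evaluated at positions
Z = G @ B                                 # n x k functional genotype scores
y = 0.3 * Z[:, 0] + rng.normal(size=n)    # quantitative trait

X1 = np.column_stack([np.ones(n), Z])     # full model (intercept + basis scores)
X0 = np.ones((n, 1))                      # null model (intercept only)
rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
F = ((rss(X0) - rss(X1)) / k) / (rss(X1) / (n - k - 1))
p = stats.f.sf(F, k, n - k - 1)
print(f"F = {F:.2f}, p = {p:.3g}")
```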

  9. Modelling of web-based virtual university administration for Nigerian ...

    African Journals Online (AJOL)

    This research focused on the development of a model of web-based virtual university administration for Nigerian universities. This is necessary because there are still noticeable administrative constraints in our universities, the establishment of many university web portals notwithstanding. More efforts are therefore needed to ...

  10. Quantitative Assessment of Theses at Mazandaran University of Medical Sciences Years-(1995-2014).

    Science.gov (United States)

    Balaghafari, Azita; Siamian, Hasan; Kharamin, Farideh; Rashida, Seyyedeh Shahrbanoo; Ghahrani, Nassim

    2016-07-16

    Review and evaluation of research are essential for taking correct steps towards real progress and are features of a healthy and dynamic system. Considering the importance of theses in scientific production and development, and given the lack of structured information and of qualitative and quantitative assessment at Mazandaran University of Medical Sciences, we decided to conduct a study of the theses prepared there between 1995 and 2014. This study was a descriptive survey of a sample of 325 graduate and PhD theses and dissertations in clinical and basic sciences, drawn from a population of 2,060 theses completed at the university from 1995 to the end of 2014. Stratified sampling was used. The study examined the match between students' degrees, thesis subjects, and the specialties of supervisors and advisers. The data gathering tool was a checklist of information (gender, discipline, degree and department of students, school, year of defence, title of theses and dissertations, specialty and departments of supervisors and advisers, type of research, grade obtained by students). Statistical analysis of the data was performed using SPSS software, version 21. We studied 325 theses: 303 had one researcher, 21 had two researchers and 1 had three researchers, for a total of 348 student researchers (174 females and 174 males). The number of students in the basic science departments was 82 (23.5%) and 266 (76.5%) were in the clinical group; 29 (8.33%) were at the master's level, 260 (74.71%) were general practitioners, 58 (16.67%) were specialty students and 1 (0.29%) was at the PhD level. There was no relationship between type of research and level of education (p = 0.081). However, it was found that the majority of the theses by general practitioners (59.8%) were of type 1 (status/condition studies). By matching

  11. Quantitative Assessment of Theses at Mazandaran University of Medical Sciences Years–(1995-2014)

    Science.gov (United States)

    Balaghafari, Azita; Siamian, Hasan; Kharamin, Farideh; Rashida, Seyyedeh Shahrbanoo; Ghahrani, Nassim

    2016-01-01

    Background: Review and evaluation of research are essential for taking correct steps towards real progress and are features of a healthy and dynamic system. Considering the importance of theses in scientific production and development, and given the lack of structured information and of qualitative and quantitative assessment at Mazandaran University of Medical Sciences, we decided to conduct a study of the theses prepared there between 1995 and 2014. Methods: This study was a descriptive survey of a sample of 325 graduate and PhD theses and dissertations in clinical and basic sciences, drawn from a population of 2,060 theses completed at the university from 1995 to the end of 2014. Stratified sampling was used. The study examined the match between students' degrees, thesis subjects, and the specialties of supervisors and advisers. The data gathering tool was a checklist of information (gender, discipline, degree and department of students, school, year of defence, title of theses and dissertations, specialty and departments of supervisors and advisers, type of research, grade obtained by students). Statistical analysis of the data was performed using SPSS software, version 21. Results: We studied 325 theses: 303 had one researcher, 21 had two researchers and 1 had three researchers, for a total of 348 student researchers (174 females and 174 males). The number of students in the basic science departments was 82 (23.5%) and 266 (76.5%) were in the clinical group; 29 (8.33%) were at the master's level, 260 (74.71%) were general practitioners, 58 (16.67%) were specialty students and 1 (0.29%) was at the PhD level. There was no relationship between type of research and level of education (p = 0.081). However, it was found that the majority of the theses by general practitioners (59.8%) were of type 1

  12. Quantitative Literacy Interventions at University of Cape Town: Effects of Separation from Academic Disciplines

    Directory of Open Access Journals (Sweden)

    Vera Frith

    2012-01-01

    Full Text Available The aim of the Numeracy Centre at the University of Cape Town is to develop students’ quantitative literacy (QL in a manner consistent with their programmes of study and intended roles in the community. Our theoretical perspective on the nature of QL is in line with that of the New Literacies Studies and sees academic QL as practices in different academic disciplinary contexts. This means that for us the ideal curriculum structure for developing QL would fully integrate it into the teaching of the disciplines. This is in practice not achievable in most cases, especially since many students do not have the necessary foundations of mathematical and statistical knowledge and skills. The unavoidable deviation from the ideal curriculum structure presents challenges to the design of QL interventions. Two illustrative examples which display different degrees of separation from the disciplinary teaching are described and discussed. This discussion is based on lecturers’ reflections on the teaching experience and on student evaluations. The ‘stand-alone’ QL course for Humanities and Law students, which uses a context-based approach, is the least integrated with the disciplinary curriculum, and presents challenges in terms of tensions in the classroom between the contexts and the mathematical and statistical content, as well as challenges in terms of student motivation. The QL intervention for medical students is more closely integrated into the medical curriculum and presents fewer challenges. Both interventions are intended to provide ‘foundations’ in terms of QL and suffer from difficulties in providing students with authentic motivation.

  13. Quantitative aspects and dynamic modelling of glucosinolate metabolism

    DEFF Research Database (Denmark)

    Vik, Daniel

    ... This enables comparison of transcript and protein levels across mutants and upon induction. I find that unchallenged plants show good correspondence between protein and transcript, but that treatment with methyljasmonate results in significant differences (chapter 1). Functional genomics is used to study ... The construction of a dynamic quantitative model of GLS hydrolysis is described. Simulations reveal potential effects on auxin signalling that could reflect defensive strategies (chapter 4). The results presented grant insights into not only the dynamics of GLS biosynthesis and hydrolysis, but also the relationship...

  14. Model for Quantitative Evaluation of Enzyme Replacement Treatment

    Directory of Open Access Journals (Sweden)

    Radeva B.

    2009-12-01

    Full Text Available Gaucher disease is the most frequent lysosomal disorder. Its enzyme replacement treatment is an advance of modern biotechnology that has been used successfully in recent years. Evaluating the optimal dose for each patient is important for both health and economic reasons: enzyme replacement is the most expensive treatment, and it must be administered continuously and without interruption. Enzyme replacement therapy with Cerezyme (Genzyme) was formally introduced in Bulgaria in 2001, but it has at times been interrupted for 1-2 months, and patients' doses have not been optimal. The aim of our work is to find a mathematical model for the quantitative evaluation of ERT in Gaucher disease. The model applies the software package "Statistika 6" to the individual data of 5-year-old children with Gaucher disease treated with Cerezyme. The output of the model enables quantitative evaluation of the individual trends in the development of each child's disease and their correlations. On the basis of these results, suitable changes in ERT can be recommended.

  15. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...
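
    As a flavor of the stochastic inventory models covered in the fourth chapter, the classic single-period (newsvendor) solution orders up to the critical fractile of the demand distribution; the numbers below are illustrative, not from the book.

```python
# A classic single-period (newsvendor) inventory example of the kind covered by
# stochastic inventory-control chapters: order up to the critical fractile
# cu / (cu + co) of the demand distribution. Numbers are illustrative.
from scipy.stats import norm

mu, sigma = 100.0, 20.0     # normally distributed demand
price, cost, salvage = 10.0, 6.0, 2.0
cu = price - cost           # underage cost: margin lost per unit short
co = cost - salvage         # overage cost: loss per unit left over
q_star = norm.ppf(cu / (cu + co), loc=mu, scale=sigma)
print(f"critical fractile = {cu / (cu + co):.2f}, optimal order = {q_star:.1f}")
```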

  16. University - industry collaborations: models, drivers and cultures.

    Science.gov (United States)

    Ehrismann, Dominic; Patel, Dhavalkumar

    2015-01-01

    The way academic institutions and pharmaceutical companies approach collaborations has changed significantly in recent years. A multitude of interaction models has been tested, and critical factors that drive successful collaborations have been proposed. Based on this experience, the current consensus in the pharmaceutical industry is to pursue one of two strategies: an open innovation approach to source discoveries wherever they occur, or selective investment in scientific partnerships that churn out inventions that can be translated from bench to bedside internally. While these strategies may be intuitive, forming and building sustainable relationships between academia and large multinational healthcare enterprises is proving challenging. In this article we explore some of the more testing aspects of these collaborations and the approaches that various industrial players have taken, and provide our own views on the matter. We found that understanding and respecting each other's organisational culture and combining the intellectual and technological assets to answer big scientific questions accelerates and improves the quality of every collaboration. Upon discussing the prevailing cooperation models in the university - industry domain, we assert that science-driven collaborations, where risks and rewards are shared equally without a commercial agenda in mind, are the most impactful.

  17. A Model for the Development of University Curricula in Nanoelectronics

    Science.gov (United States)

    Bruun, E.; Nielsen, I.

    2010-01-01

    Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major influencers affecting developments in university programmes related to nanoelectronics are discussed and a model for university programme development is described. The model takes into account that nanotechnology affects not only…

  18. The Triad Research University or a Post 20th Century Research University Model

    Science.gov (United States)

    Tadmor, Zehev

    2006-01-01

    In this paper, a model for the future research university is proposed which answers some of the key challenges facing universities. It consists of three independent yet closely knit entities: a research institute, a university teaching college and a business unit, creating a "triad" structure. The possible inevitability, the advantages and…

  19. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
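
    A hedged sketch of the final fitting step: in the reversible limit, the Laviron surface-bound treatment reduces to a Nernstian sigmoid from which E0' can be extracted by non-linear least squares. The data below are simulated, and the paper's actual procedure may include kinetic parameters.

```python
# Sketch of extracting E0' from a (TERS) voltammetric response by fitting the
# Nernstian sigmoid that the Laviron surface-bound treatment reduces to in the
# reversible limit. Simulated data; the paper's fitting procedure may differ.
import numpy as np
from scipy.optimize import curve_fit

F_CONST, R, T = 96485.0, 8.314, 298.0  # C/mol, J/(mol K), K

def frac_oxidized(E, E0, n):
    """Equilibrium oxidized fraction for a surface-bound couple (Nernst)."""
    return 1.0 / (1.0 + np.exp(-n * F_CONST * (E - E0) / (R * T)))

E = np.linspace(-0.6, 0.0, 60)  # potential sweep (V); E0' value is invented
signal = frac_oxidized(E, -0.35, 1) + np.random.default_rng(2).normal(0, 0.03, E.size)
(E0_fit, n_fit), _ = curve_fit(frac_oxidized, E, signal, p0=[-0.3, 1.0])
print(f"fitted E0' = {E0_fit * 1000:.0f} mV, n = {n_fit:.2f}")
```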

  20. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction

    Directory of Open Access Journals (Sweden)

    Cobbs Gary

    2012-08-01

    Full Text Available Abstract Background Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Results Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the

  1. Stepwise kinetic equilibrium models of quantitative polymerase chain reaction.

    Science.gov (United States)

    Cobbs, Gary

    2012-08-16

    Numerous models for use in interpreting quantitative PCR (qPCR) data are present in recent literature. The most commonly used models assume the amplification in qPCR is exponential and fit an exponential model with a constant rate of increase to a select part of the curve. Kinetic theory may be used to model the annealing phase and does not assume constant efficiency of amplification. Mechanistic models describing the annealing phase with kinetic theory offer the most potential for accurate interpretation of qPCR data. Even so, they have not been thoroughly investigated and are rarely used for interpretation of qPCR data. New results for kinetic modeling of qPCR are presented. Two models are presented in which the efficiency of amplification is based on equilibrium solutions for the annealing phase of the qPCR process. Model 1 assumes annealing of complementary target strands and annealing of target and primers are both reversible reactions that reach a dynamic equilibrium. Model 2 assumes all annealing reactions are nonreversible and equilibrium is static. Both models include the effect of primer concentration during the annealing phase. Analytic formulae are given for the equilibrium values of all single- and double-stranded molecules at the end of the annealing step. The equilibrium values are then used in a stepwise method to describe the whole qPCR process. Rate constants of kinetic models are the same for solutions that are identical except for possibly having different initial target concentrations. qPCR curves from such solutions are thus analyzed by simultaneous non-linear curve fitting, with the same rate constant values applying to all curves and each curve having a unique value for initial target concentration. The models were fit to two data sets for which the true initial target concentrations are known. Both models give better fit to observed qPCR data than other kinetic models present in the literature. They also give better estimates of
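
    The following toy simulation captures the stepwise spirit of these models: per-cycle efficiency is recomputed from the current primer/target balance rather than held constant. The saturation constant and starting amounts are invented; the paper instead derives efficiencies from explicit annealing equilibria.

```python
# Toy stepwise qPCR simulation: efficiency is recomputed each cycle from the
# remaining primer pool rather than assumed constant. K and the starting
# amounts are invented for illustration; the paper's models derive the
# per-cycle efficiency from explicit annealing-equilibrium solutions.
import numpy as np

def stepwise_qpcr(T0, P0, K=5e-9, cycles=40):
    """Return per-cycle target amounts for initial target T0 and primer P0 (mol)."""
    T, P = T0, P0
    history = []
    for _ in range(cycles):
        eff = P / (P + K)        # efficiency falls as primers deplete
        new = eff * T
        T += new
        P = max(P - new, 0.0)
        history.append(T)
    return np.array(history)

curve = stepwise_qpcr(T0=1e-15, P0=1e-8)
print("cycle of half-max signal:", int(np.argmin(abs(curve - curve.max() / 2))))
```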

  2. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
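
    A minimal discrete-event sketch (using the SimPy library) of the logistics idea: shelf residence times emerge from arrival and purchase processes rather than being sampled independently, and each residence time then feeds a growth estimate. All rates and the growth constant are illustrative, not the study's parameters.

```python
# Minimal discrete-event sketch of the logistics idea: shelf residence times
# emerge from delivery and purchase processes (FIFO queue) and feed a growth
# estimate. All rates and the growth constant are illustrative.
import random
import simpy

GROWTH_RATE = 0.02  # log10 CFU increase per hour on the shelf (illustrative)
residence_times = []

def producer(env, shelf):
    while True:
        yield env.timeout(random.expovariate(1 / 2.0))  # deliveries every ~2 h
        yield shelf.put(env.now)                        # store the arrival time

def consumer(env, shelf):
    while True:
        yield env.timeout(random.expovariate(1 / 2.5))  # purchases every ~2.5 h
        arrived = yield shelf.get()                     # FIFO queue discipline
        residence_times.append(env.now - arrived)

env = simpy.Environment()
shelf = simpy.Store(env, capacity=50)
env.process(producer(env, shelf))
env.process(consumer(env, shelf))
env.run(until=500)

growth = sorted(GROWTH_RATE * t for t in residence_times)
print(f"mean shelf time {sum(residence_times) / len(residence_times):.1f} h, "
      f"95th pct growth {growth[int(0.95 * len(growth))]:.2f} log10 CFU")
```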

  3. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  4. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome, as were pathological EEG patterns such as generalized periodic discharges. Quantitative model-based EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
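
    A hedged sketch of the "state space velocity" notion: embed successive EEG spectra as points in a low-dimensional space and average the step length between consecutive points. The embedding and parameters below are assumptions for illustration; the paper's exact construction may differ.

```python
# Sketch of a "state space velocity" measure: embed successive EEG spectra in a
# low-dimensional space and average the distance between consecutive points.
# The embedding (log-spectrogram + PCA) is an assumption for illustration.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA

def state_space_velocity(eeg, fs=250, n_components=2):
    """Mean distance between consecutive spectral states of one EEG channel."""
    _, _, S = spectrogram(eeg, fs=fs, nperseg=fs * 2)   # freq x time power
    states = PCA(n_components=n_components).fit_transform(np.log(S.T + 1e-12))
    return np.linalg.norm(np.diff(states, axis=0), axis=1).mean()

rng = np.random.default_rng(4)
variable_eeg = rng.normal(size=250 * 60)                           # fluctuating
monotone_eeg = np.sin(2 * np.pi * 10 * np.arange(250 * 60) / 250)  # rigid 10 Hz
print("variable background:", state_space_velocity(variable_eeg))
print("monotone background:", state_space_velocity(monotone_eeg))
```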

  5. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard

  6. Quantitative modeling of the ionospheric response to geomagnetic activity

    Directory of Open Access Journals (Sweden)

    T. J. Fuller-Rowell

    2000-07-01

    Full Text Available A physical model of the coupled thermosphere and ionosphere has been used to determine the accuracy of model predictions of the ionospheric response to geomagnetic activity, and assess our understanding of the physical processes. The physical model is driven by empirical descriptions of the high-latitude electric field and auroral precipitation, as measures of the strength of the magnetospheric sources of energy and momentum to the upper atmosphere. Both sources are keyed to the time-dependent TIROS/NOAA auroral power index. The output of the model is the departure of the ionospheric F region from the normal climatological mean. A 50-day interval towards the end of 1997 has been simulated with the model for two cases. The first simulation uses only the electric fields and auroral forcing from the empirical models, and the second has an additional source of random electric field variability. In both cases, output from the physical model is compared with F-region data from ionosonde stations. Quantitative model/data comparisons have been performed to move beyond the conventional "visual" scientific assessment, in order to determine the value of the predictions for operational use. For this study, the ionosphere at two ionosonde stations has been studied in depth, one each from the northern and southern mid-latitudes. The model clearly captures the seasonal dependence in the ionospheric response to geomagnetic activity at mid-latitude, reproducing the tendency for decreased ion density in the summer hemisphere and increased densities in winter. In contrast to the "visual" success of the model, the detailed quantitative comparisons, which are necessary for space weather applications, are less impressive. The accuracy, or value, of the model has been quantified by evaluating the daily standard deviation, the root-mean-square error, and the correlation coefficient between the data and model predictions. The modeled quiet-time variability, or standard
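
    The three skill scores named here are straightforward to compute per station; in the sketch below, obs and model stand in for daily observed and predicted F-region departures from the climatological mean.

```python
# The three skill metrics named in the abstract, as they might be computed per
# ionosonde station; the arrays are placeholders for daily F-region departures.
import numpy as np

def skill_metrics(obs, model):
    """Standard deviation, RMSE, and correlation between data and predictions."""
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(obs, model)[0, 1]
    return {"std_obs": obs.std(), "std_model": model.std(),
            "rmse": rmse, "corr": corr}

rng = np.random.default_rng(3)
obs = rng.normal(0, 1, 50)                   # 50-day observed departures
model = 0.6 * obs + rng.normal(0, 0.8, 50)   # imperfect model predictions
print(skill_metrics(obs, model))
```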

  7. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids

    Directory of Open Access Journals (Sweden)

    Wilson Zoe A

    2008-06-01

    Full Text Available Abstract Background Real-time PCR techniques are being widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair carries an attached universal template (UT), and the FP is identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification with AUDP real-time PCR assays indicate that the technique has been successfully applied to nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.

  8. Quantitative and Functional Requirements for Bioluminescent Cancer Models.

    Science.gov (United States)

    Feys, Lynn; Descamps, Benedicte; Vanhove, Christian; Vermeulen, Stefan; Vandesompele, J O; Vanderheyden, Katrien; Messens, Kathy; Bracke, Marc; De Wever, Olivier

    2016-01-01

    Bioluminescent cancer models are widely used, but detailed quantification of the luciferase signal and functional comparison with a non-transfected control cell line are generally lacking. In the present study, we provide quantitative and functional tests for luciferase-transfected cells. We quantified the luciferase expression in BLM and HCT8/E11 transfected cancer cells, and examined the effect of long-term luciferin exposure. The present study also investigated functional differences between parental and transfected cancer cells. Our results showed that quantification of different single-cell-derived populations is superior with droplet digital polymerase chain reaction. Quantification of luciferase protein level and luciferase bioluminescent activity is only useful when there is a significant difference in copy number. Continuous exposure of cell cultures to luciferin leads to inhibitory effects on mitochondrial activity, cell growth and bioluminescence. These inhibitory effects correlate with luciferase copy number. Cell culture and mouse xenograft assays showed no significant functional differences between luciferase-transfected and parental cells. Luciferase-transfected cells should be validated by quantitative and functional assays before starting large-scale experiments. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios), All rights reserved.

  9. Assessing College Students’ Quantitative and Scientific Reasoning: The James Madison University Story

    Directory of Open Access Journals (Sweden)

    John D. Hathcoat

    2015-01-01

    Full Text Available Quantitative and scientific reasoning is a critical student learning outcome in higher education. Data are presented for large samples of undergraduate students who were assessed as entering freshmen and then again after completing 45-70 credit hours. Results are presented around four key issues that are central to educational assessment. First, entering freshmen with transfer credits for quantitative and scientific reasoning courses that fulfill general education requirements, on average, score similar to entering freshmen without such credit. About 97% of entering freshmen who had transfer credits received their credits through dual enrollment programs. As a sophomore-junior, students who had completed their general education requirements performed similar to students who had started, but not yet finished these requirements. Second, small to moderate correlations were observed between grade-point averages in relevant general education coursework and quantitative and scientific reasoning. Third, students’ quantitative and scientific reasoning, on average, increases from freshmen to sophomore/junior years. Finally, the proportion of students who meet faculty-set standards substantially increases from pre-test to post-test. Taken together, results suggest that changes in quantitative and scientific reasoning are a function of relevant courses. Additional research is needed to examine the role of lower-level versus higher-level courses in student performance. Results also indicate a need to investigate how differences in the quality of dual enrollment courses facilitate quantitative and scientific reasoning.

  10. A model for the development of university curricula in nanoelectronics

    DEFF Research Database (Denmark)

    Bruun, Erik; Nielsen, I

    2010-01-01

    Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major influencers affecting developments in university programmes related to nanoelectronics are discussed and a model for university programme development is described. The model takes...... engineering. Examples of European curricula following this framework are identified and described. These examples may serve as sources of inspiration for future developments and the model...

  11. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    Full Text Available There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
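
    A compact sketch of this pipeline under stated assumptions (synthetic data, two placeholder predictors, gravel as the reference part): regress the additive log-ratios on the predictors with a random forest, then back-transform predictions to compositions.

```python
# Sketch of the mapping pipeline: additive log-ratios of (mud, sand, gravel)
# regressed on environmental predictors with a random forest, then
# back-transformed to compositions. Data and predictor names are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 1000
X = np.column_stack([rng.uniform(0, 100, n),     # e.g. water depth
                     rng.uniform(0, 2, n)])      # e.g. peak bed shear stress
comp = rng.dirichlet([2, 5, 3], n)               # (mud, sand, gravel), rows sum to 1

# Additive log-ratio transform with gravel as the reference part.
alr = np.log(comp[:, :2] / comp[:, 2:3])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, alr)

# Back-transform predictions to mud/sand/gravel fractions.
pred = rf.predict(X[:5])
expd = np.exp(np.column_stack([pred, np.zeros(len(pred))]))
fractions = expd / expd.sum(axis=1, keepdims=True)
print(fractions.round(3))                        # each row sums to 1
```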

  12. A quantitative phase field model for hydride precipitation in zirconium alloys: Part I. Development of quantitative free energy functional

    International Nuclear Information System (INIS)

    Shi, San-Qiang; Xiao, Zhihua

    2015-01-01

    A temperature dependent, quantitative free energy functional was developed for the modeling of hydride precipitation in zirconium alloys within a phase field scheme. The model takes into account crystallographic variants of hydrides, interfacial energy between hydride and matrix, interfacial energy between hydrides, elastoplastic hydride precipitation and interaction with externally applied stress. The model is fully quantitative in real time and real length scale, and simulation results were compared with limited experimental data available in the literature with a reasonable agreement. The work calls for experimental and/or theoretical investigations of some of the key material properties that are not yet available in the literature
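
    The paper's exact parameterization is not reproduced here, but free energy functionals of this type generally combine chemical, gradient and elastic contributions; a generic sketch, with c the hydrogen concentration, η_p the order parameters for the hydride variants and T the temperature:

```latex
% Generic phase-field free energy (a sketch, not the paper's exact form):
% chemical driving force, interface (gradient) penalties for concentration
% and each crystallographic variant, and an elastic contribution.
\begin{equation}
F(c,\eta_p,T) = \int_V \Big[ f_{\mathrm{chem}}(c,\eta_p,T)
  + \frac{\kappa_c}{2}\,\lvert\nabla c\rvert^{2}
  + \sum_{p}\frac{\kappa_\eta}{2}\,\lvert\nabla \eta_p\rvert^{2}
  + f_{\mathrm{el}}(\boldsymbol{\varepsilon},\eta_p) \Big]\,\mathrm{d}V
\end{equation}
```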

  13. A Review of Research on Universal Design Educational Models

    Science.gov (United States)

    Rao, Kavita; Ok, Min Wook; Bryant, Brian R.

    2014-01-01

    Universal design for learning (UDL) has gained considerable attention in the field of special education, acclaimed for its promise to promote inclusion by supporting access to the general curriculum. In addition to UDL, there are two other universal design (UD) educational models referenced in the literature, universal design of instruction (UDI)…

  14. Our universe as an attractor in a superstring model

    International Nuclear Information System (INIS)

    Maeda, Keiichi.

    1986-11-01

    One preferential scenario of the evolution of the universe is discussed in a superstring model. The universe can reach the present state as an attractor in the dynamical system. The kinetic terms of the "axions" play an important role so that our present universe is realized almost uniquely. (author)

  15. Modeling Cancer Metastasis using Global, Quantitative and Integrative Network Biology

    DEFF Research Database (Denmark)

    Schoof, Erwin; Erler, Janine

    understanding of molecular processes which are fundamental to tumorigenesis. In Article 1, we propose a novel framework for how cancer mutations can be studied by taking into account their effect at the protein network level. In Article 2, we demonstrate how global, quantitative data on phosphorylation dynamics...... can be generated using MS, and how this can be modeled using a computational framework for deciphering kinase-substrate dynamics. This framework is described in depth in Article 3, and covers the design of KinomeXplorer, which allows the prediction of kinases responsible for modulating observed...... phosphorylation dynamics in a given biological sample. In Chapter III, we move into Integrative Network Biology, where, by combining two fundamental technologies (MS & NGS), we can obtain more in-depth insights into the links between cellular phenotype and genotype. Article 4 describes the proof...

  16. Quantitative genetic models of sexual selection by male choice.

    Science.gov (United States)

    Nakahashi, Wataru

    2008-09-01

    There are many examples of male mate choice for female traits that tend to be associated with high fertility. I develop quantitative genetic models of a female trait and a male preference to show when such a male preference can evolve. I find that a disagreement between the fertility maximum and the viability maximum of the female trait is necessary for directional male preference (preference for extreme female trait values) to evolve. Moreover, when there is a shortage of available male partners or variance in male nongenetic quality, strong male preference can evolve. Furthermore, I also show that males evolve to exhibit a stronger preference for females that are more feminine (less resemblance to males) than the average female when there is a sexual dimorphism caused by fertility selection which acts only on females.
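
    The standard quantitative-genetic machinery behind such models (in the Lande-Kirkpatrick tradition; a sketch, not the paper's exact equations) couples the per-generation change in the mean female trait and mean male preference through the genetic covariance built up by non-random mating:

```latex
% Coupled trait-preference dynamics (Lande-style sketch): \bar{z} is the mean
% female trait, \bar{y} the mean male preference, G_z and G_y additive genetic
% variances, B their genetic covariance, and \beta the selection gradients
% (\beta_z combining viability and fertility selection on females). The factor
% 1/2 reflects sex-limited expression of each character.
\begin{equation}
\begin{pmatrix} \Delta\bar{z} \\[2pt] \Delta\bar{y} \end{pmatrix}
= \tfrac{1}{2}
\begin{pmatrix} G_{z} & B \\ B & G_{y} \end{pmatrix}
\begin{pmatrix} \beta_{z} \\[2pt] \beta_{y} \end{pmatrix}
\end{equation}
```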

  17. Quantitative Modelling of Trace Elements in Hard Coal.

    Science.gov (United States)

    Smoliński, Adam; Howaniec, Natalia

    2016-01-01

    The significance of coal in the world economy has remained unquestionable for decades, and coal is expected to be the dominant fossil fuel in the foreseeable future. The increased awareness of sustainable development reflected in the relevant regulations implies, however, the need for the development and implementation of clean coal technologies on the one hand, and of adequate analytical tools on the other. The paper presents the application of the quantitative Partial Least Squares method to modeling the concentrations of trace elements (As, Ba, Cd, Co, Cr, Cu, Mn, Ni, Pb, Rb, Sr, V and Zn) in hard coal based on the physical and chemical parameters of coal and coal ash components. The study focused on trace elements potentially hazardous to the environment when emitted from coal processing systems. The studied data included 24 parameters determined for 132 coal samples provided by 17 coal mines of the Upper Silesian Coal Basin, Poland. Since the data set contained outliers, the construction of robust Partial Least Squares models for the contaminated data set and the correct identification of outlying objects based on robust scales were required. These enabled the development of correct Partial Least Squares models, characterized by good fit and prediction abilities. The root mean square error was below 10% for all but one of the final Partial Least Squares models constructed, and the prediction error (root mean square error of cross-validation) exceeded 10% for only three of the models. The study is of both cognitive and applicative importance: it presents a unique application of chemometric methods of data exploration to modeling the content of trace elements in coal, and in this way contributes to the development of useful tools for coal quality assessment.
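
    A sketch of such a calibration using scikit-learn, with synthetic stand-ins for the 132 samples and 24 parameters; the fit and cross-validated errors are reported relative to the mean, mirroring the percentage criteria above.

```python
# Sketch of a PLS calibration of trace-element content from coal parameters,
# with RMSE and cross-validated RMSE (RMSECV) as the quality criteria. The
# data are synthetic stand-ins for the 132 samples and 24 parameters.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(6)
X = rng.normal(size=(132, 24))                  # coal and coal-ash parameters
y = 50 + X[:, :3] @ np.array([5.0, 3.0, 1.0]) + rng.normal(0, 1, 132)  # e.g. Pb (ppm)

pls = PLSRegression(n_components=5)
y_fit = pls.fit(X, y).predict(X).ravel()
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

rel_rmse = lambda pred: 100 * np.sqrt(np.mean((pred - y) ** 2)) / y.mean()
print(f"RMSE {rel_rmse(y_fit):.1f}% of mean, RMSECV {rel_rmse(y_cv):.1f}% of mean")
```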

  18. Transition from AdS universe to DS universe in the BPP model

    International Nuclear Information System (INIS)

    Kim, Wontae; Yoon, Myungseok

    2007-01-01

    It can be shown that in the BPP model a smooth phase transition from the asymptotically decelerated AdS universe to the asymptotically accelerated DS universe is possible by solving the modified semiclassical equations of motion. This transition comes from a noncommutative Poisson algebra, which gives asymptotically constant curvature scalars. The decelerated expansion of the early universe is due to the negative energy density with negative pressure induced by quantum back reaction, and the accelerated late-time universe comes from the positive energy and negative pressure, which behave like a dark energy source in recent cosmological models.

  19. A robust quantitative near infrared modeling approach for blend monitoring.

    Science.gov (United States)

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing near-infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development approach (blender-scale method) were compared by simultaneously monitoring a 1 kg batch size blend run. Both models demonstrated similar performance. The small-scale strategy significantly reduces the total resources expended to develop near-infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  20. Introducing a model of organizational envy management among university faculty members: A mixed research approach

    Directory of Open Access Journals (Sweden)

    Maris Zarin Daneshvar

    2016-01-01

    Full Text Available The present study aimed to offer a model of organizational envy management among faculty members of the Islamic Azad Universities of East Azerbaijan Province. A mixed-methods design was used, involving qualitative data followed by quantitative data, with an emphasis on the quantitative analysis. The study population comprised all faculty members with an associate or higher degree in the 2014-2015 academic year. In the qualitative stage, 20 individuals (experts) were selected to design the primary model and questionnaire, and 316 faculty members were selected to fit the model. The qualitative section established that the variables influencing envy management among faculty members are a healthy organizational climate, spiritual leadership, effective communication, job satisfaction and the professional development of professors. The quantitative findings showed significant relationships among these variables: in the analysis of the indirect effects of organizational climate on envy management, spiritual leadership acting via effective communication had a smaller effect on envy management than professional development and job satisfaction. It is concluded that university managers should provide the conditions and background for envy management in their universities and enable professors to play more effective roles, free of envy, in the scientific climate of the university, so as to achieve educational, research and service efficiency.

  1. Melanoma screening: Informing public health policy with quantitative modelling.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    Full Text Available Australia and New Zealand share the highest incidence rates of melanoma worldwide. Despite the substantial increase in public and physician awareness of melanoma in Australia over the last 30 years (a result of the publicly funded mass media campaigns that began in the early 1980s), mortality has steadily increased during this period. This increased mortality has led investigators to question the relative merits of primary versus secondary prevention; that is, sensible sun exposure practices versus early detection. Increased melanoma vigilance on the part of the public and among physicians has resulted in large increases in public health expenditure, primarily from screening costs and increased rates of office surgery. Has this attempt at secondary prevention been effective? Unfortunately, epidemiologic studies addressing the causal relationship between the level of secondary prevention and mortality are prohibitively difficult to implement; it is currently unknown whether increased melanoma surveillance reduces mortality, and if so, whether such an approach is cost-effective. Here I address the issue of secondary prevention of melanoma with respect to incidence and mortality (and cost per life saved) by developing a Markov model of melanoma epidemiology based on Australian incidence and mortality data. The advantages of developing a methodology that can determine constraint-based surveillance outcomes are twofold: first, it can address the issue of effectiveness; and second, it can quantify the trade-off between cost and utilisation of medical resources on one hand, and reduced morbidity and lives saved on the other. With respect to melanoma, implementing the model facilitates the quantitative determination of the relative effectiveness and trade-offs associated with different levels of secondary and tertiary prevention, both retrospectively and prospectively. For example, I show that the surveillance enhancement that began in
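
    A toy Markov cohort model of the kind described, with invented yearly transition probabilities and a single parameter standing in for enhanced surveillance; it illustrates how mortality (and, with costs attached to states, cost per life saved) can be read off the chain.

```python
# Toy Markov cohort model of the kind described: yearly transitions between
# melanoma states, with a screening parameter that boosts early detection.
# All transition probabilities are invented for illustration only.
import numpy as np

def run_cohort(detect_boost=0.0, years=30):
    # States: healthy, undetected melanoma, treated (detected), dead.
    p_inc, p_detect, p_die = 0.002, 0.3 + detect_boost, 0.15
    P = np.array([
        [1 - p_inc, p_inc,                         0.0,      0.0],
        [0.0,       (1 - p_detect) * (1 - p_die),  p_detect, (1 - p_detect) * p_die],
        [0.0,       0.0,                           0.99,     0.01],
        [0.0,       0.0,                           0.0,      1.0],
    ])
    state = np.array([1.0, 0.0, 0.0, 0.0])  # whole cohort starts healthy
    for _ in range(years):
        state = state @ P
    return state[3]  # cumulative melanoma-attributable death fraction

base, screened = run_cohort(0.0), run_cohort(0.3)
print(f"mortality without/with enhanced surveillance: {base:.4f} / {screened:.4f}")
```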

  2. A Global Change in Higher Education: Entrepreneurial University Model

    Directory of Open Access Journals (Sweden)

    Süreyya SAKINÇ

    2012-01-01

    Full Text Available Universities are affected by the social and economic diversity stemming from globalization and internationalization, and their functions, areas of responsibility, organizational structures and funding capabilities respond to this diversity. In today's knowledge society, new concepts regarding the university education system, such as the entrepreneurial university, the corporate university and the virtual university, have emerged with the wave of globalization. Rising competition in academic education and mass demand for education prompt universities to seek new funds to shore up their financial situation and push them towards an entrepreneurial identity. Reflections of the neoliberal approach in education have transformed universities into corporations that are more entrepreneurial and student-oriented, aiming to provide appropriate education and to produce creative human resources for global development. In this study, a comprehensive evaluation of the entrepreneurial university model is carried out through a review of the literature, investigating its causes and the factors that shape and improve it. The aim of the paper is to generate, from syntheses of the literature, a framework that identifies the dynamic processes of the entrepreneurial university model. The contribution of the paper rests on its argument that the entrepreneurial university model is viable for Turkey. In this paper, the entrepreneurial university model is analyzed through the Triple Helix framework using a comparative approach.

  3. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting pa...
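
    Statistical model checking estimates such probabilities by simulation rather than exhaustive state exploration. The sketch below applies the idea to an invented two-feature model: simulate many runs and report the estimated probability with a normal-approximation confidence interval.

```python
# Monte Carlo flavor of statistical model checking: estimate the probability
# that a property holds by simulating many runs of a rate-annotated model.
# The tiny two-feature product model and its rates are invented.
import math
import random

def simulate_run(install_rate=0.8, crash_rate=0.1, steps=20):
    """One run of a toy product: True if it installs and never crashes."""
    installed = False
    for _ in range(steps):
        if not installed and random.random() < install_rate:
            installed = True
        elif installed and random.random() < crash_rate:
            return False
    return installed

def estimate(n=10_000, conf_z=1.96):
    hits = sum(simulate_run() for _ in range(n))
    p = hits / n
    half = conf_z * math.sqrt(p * (1 - p) / n)  # normal-approximation interval
    return p, half

p, half = estimate()
print(f"P(property) = {p:.3f} +/- {half:.3f}")
```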

  4. Quantitative Analysis and Comparison of BMI among Han, Tibetan, and Uygur University Students in Northwest China

    OpenAIRE

    Jingya, Bai; Ye, He; Jing, Wang; Xi, Huanjiu; Tao, Hai

    2013-01-01

    Objectives. To fully analyze and compare BMI among Han, Tibetan, and Uygur university students, to discuss the differences in their physical properties and physical health, and thus to provide some theoretical suggestions for the improvement of students’ physical health. Methods. The cross-sectional random cluster sampling was used to investigate 10103 Han, Tibetan, and Uygur university students, aged 20–24 in Northwest China, and their height and weight were measured to calculate BMI. The BM...

  5. A Model for Mentoring University Faculty

    Science.gov (United States)

    Lumpkin, Angela

    2011-01-01

    Operational characteristics for successful mentoring programs of new university faculty include clarity of purpose of the program, methods for matching mentors and proteges, mentor training, mentor-protege relationship building, and program effectiveness assessment. Strengths of formal, informal, peer, group or consortia, intra-departmental,…

  6. Southwest University's No-Fee Teacher-Training Model

    Science.gov (United States)

    Chen, Shijian; Yang, Shuhan; Li, Linyuan

    2013-01-01

    The training model for Southwest University's no-fee teacher education program has taken shape over several years. Based on a review of the documentation and interviews with administrators and no-fee preservice students from different specialties, this article analyzes Southwest University's no-fee teacher-training model in terms of three main…

  7. Quantitative Analysis and Comparison of BMI among Han, Tibetan, and Uygur University Students in Northwest China

    Directory of Open Access Journals (Sweden)

    Bai Jingya

    2013-01-01

    Full Text Available Objectives. To fully analyze and compare BMI among Han, Tibetan, and Uygur university students, to discuss the differences in their physical properties and physical health, and thus to provide some theoretical suggestions for the improvement of students’ physical health. Methods. The cross-sectional random cluster sampling was used to investigate 10103 Han, Tibetan, and Uygur university students, aged 20–24 in Northwest China, and their height and weight were measured to calculate BMI. The BMI classification criteria for Chinese established by the Work Group on Obesity in China (WGOC) were used for screening. Results. Han, Tibetan, and Uygur university students show low obesity rates but high overweight rates. Han, Tibetan, and Uygur university students present a high rate of underweight, normal weight, and overweight, respectively. Female Han students show higher underweight and normal weight rates, but lower overweight and obesity rates, than male Han students. Female Tibetan students show higher normal weight rate, but lower overweight and obesity rates, than male Tibetan students. BMI increases with age for male students but decreases with age for female students. Male Uygur students show higher obesity rate than female Uygur students. Tibetan and Uygur university students have higher BMI than other minorities in South China.
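
    For reference, the underlying calculation is simple; the sketch below computes BMI and applies WGOC-style cut-offs. The cut-off values (18.5 / 24.0 / 28.0 kg/m²) are quoted as commonly reported for the WGOC criteria and should be checked against the original source.

      def bmi(weight_kg: float, height_m: float) -> float:
          """Body mass index: weight divided by height squared (kg/m^2)."""
          return weight_kg / height_m ** 2

      def wgoc_category(b: float) -> str:
          """Classify BMI using WGOC-style cut-offs for Chinese adults
          (assumed here as 18.5 / 24.0 / 28.0; check the original criteria)."""
          if b < 18.5:
              return "underweight"
          if b < 24.0:
              return "normal"
          if b < 28.0:
              return "overweight"
          return "obese"

      print(wgoc_category(bmi(70.0, 1.75)))   # 22.9 kg/m^2 -> "normal"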

  8. Quantitative Analysis and Comparison of BMI among Han, Tibetan, and Uygur University Students in Northwest China

    Science.gov (United States)

    Jingya, Bai; Ye, He; Jing, Wang; Xi, Huanjiu; Tao, Hai

    2013-01-01

    Objectives. To fully analyze and compare BMI among Han, Tibetan, and Uygur university students, to discuss the differences in their physical properties and physical health, and thus to provide some theoretical suggestions for the improvement of students' physical health. Methods. The cross-sectional random cluster sampling was used to investigate 10103 Han, Tibetan, and Uygur university students, aged 20–24 in Northwest China, and their height and weight were measured to calculate BMI. The BMI classification criteria for Chinese established by Work Group on Obesity in China (WGOC) were used for screening. Results. Han, Tibetan, and Uygur university students show low obesity rates but high overweight rates. Han, Tibetan, and Uygur university students present a high rate of underweight, normal weight, and overweight, respectively. Female Han students show higher underweight and normal weight rates, but lower overweight and obesity rates, than male Han students. Female Tibetan students show higher normal weight rate, but lower overweight and obesity rates, than male Tibetan students. BMI increases with age for male students but decreases with age for female students. Male Uygur students show higher obesity rate than female Uygur students. Tibetan and Uygur university students have higher BMI than other minorities in South China. PMID:24453807

  9. A tepid model for the early universe

    International Nuclear Information System (INIS)

    Carr, B.J.; Rees, M.J.

    1977-01-01

    If the Universe started off with a photon-to-baryon ratio much less than presently observed, massive black holes would have formed at early times even if the initial density fluctuations were very small. These holes could have generated the rest of the background radiation through accretion; in this way, such a Universe might automatically evolve to have the photon-to-baryon ratio observed today. This scenario could explain why the times of decoupling and matter-radiation equilibrium are comparable and might provide a critical density of primordial black holes; it could also produce galaxies with black hole 'halos'. If the initial photon-to-baryon ratio was large enough, black hole formation would not occur: one would then have to invoke an alternative scenario in which the rest of the background radiation was generated by primordial stars at a comparatively recent epoch. (orig.) [de

  10. Interacting agegraphic dark energy models in non-flat universe

    International Nuclear Information System (INIS)

    Sheykhi, Ahmad

    2009-01-01

    A so-called 'agegraphic dark energy' was recently proposed to explain the dark energy-dominated universe. In this Letter, we generalize the agegraphic dark energy models to the universe with spatial curvature in the presence of interaction between dark matter and dark energy. We show that these models can accommodate w_D = -1 crossing for the equation of state of dark energy. In the limiting case of a flat universe, i.e. k = 0, all previous results of agegraphic dark energy in a flat universe are restored.

  11. Quantitative Model for Supply Chain Visibility: Process Capability Perspective

    Directory of Open Access Journals (Sweden)

    Youngsu Lee

    2016-01-01

    Full Text Available Currently, the intensity of enterprise competition has increased as a result of a greater diversity of customer needs as well as the persistence of a long-term recession. The results of competition are becoming severe enough to determine the survival of a company. To survive global competition, each firm must focus on achieving innovation excellence and operational excellence as core competencies for sustainable competitive advantage. Supply chain management is now regarded as one of the most effective innovation initiatives to achieve operational excellence, and its importance has become ever more apparent. However, few companies effectively manage their supply chains, and the greatest difficulty is in achieving supply chain visibility. Many companies still suffer from a lack of visibility, and in spite of extensive research and the availability of modern technologies, the concepts and quantification methods to increase supply chain visibility are still ambiguous. Based on the extant research on supply chain visibility, this study proposes an extended visibility concept focusing on a process capability perspective and suggests a more quantitative model using the Z score from Six Sigma methodology to evaluate and improve the level of supply chain visibility.
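
    The abstract does not reproduce the formula, but a process-capability Z score of the Six Sigma kind it mentions is conventionally (mean − lower specification limit) / standard deviation. A minimal sketch on invented visibility data (the metric, values and specification limit are assumptions, not the paper's):

      import statistics

      # Illustrative data: daily order-tracking coverage (%) across a chain.
      # Values and the 95% lower specification limit are invented.
      coverage = [97.2, 96.8, 98.1, 95.9, 97.5, 96.3, 97.9, 96.7]
      lsl = 95.0   # lower specification limit on the visibility metric

      mean = statistics.mean(coverage)
      sd = statistics.stdev(coverage)
      z = (mean - lsl) / sd   # higher Z = more capable ("more visible") process
      print(f"mean={mean:.2f}%, sd={sd:.2f}, Z={z:.2f}")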

  12. Planning Model for Peruvian University System

    OpenAIRE

    Chiyon, Isabel; Yague, Jose Luis

    2015-01-01

    This paper arises from observing the effect that education policy has had on the European Higher Education Area, an observation that motivates the primary objective of this research: the preparation of a planning model that contributes, based on the European experience, the basic elements for quality in higher education in Peru. To appraise the timeliness and usefulness of the aforementioned model, the scope of the Spanish model is selected and specifically adapted to the Peruvian model, which can be ...

  13. Performance management in universities: Effects of the transition to more quantitative measurement systems

    NARCIS (Netherlands)

    ter Bogt, H.J.; Scapens, R.W.

    2012-01-01

    The measurement of research and teaching performance is increasingly common within universities, driven probably by the rise of New Public Management (NPM). Although changing over time and varying from country to country, NPM involves the use of private sector methods in the public sector.

  14. Quantitative Analysis of Variables Affecting Nursing Program Completion at Arizona State University

    Science.gov (United States)

    Herrera, Cheryl

    2013-01-01

    This study is designed to understand the patterns of selection, preparation, retention and graduation of undergraduate pre-licensure clinical nursing students in the College of Nursing and Health Innovation at Arizona State University enrolled in 2007 and 2008. The resulting patterns may guide policy decision making regarding future cohorts in…

  15. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten Pachur

    2013-09-01

    Full Text Available This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.
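
    For readers unfamiliar with it, the priority heuristic examines reasons in a fixed order (minimum gain, probability of the minimum gain, maximum gain) and stops when a difference exceeds an aspiration level of one tenth of the maximum gain for outcomes, or one tenth of the probability scale for probabilities. A minimal sketch for gain gambles follows; the example gambles are our own illustration, and the heuristic's rounding to "prominent numbers" is ignored for simplicity.

      def priority_heuristic(a, b):
          """Choose between two gambles, each a list of (gain, probability)
          pairs with non-negative outcomes. Reasons are examined in the
          published order: minimum gain, probability of the minimum gain,
          maximum gain."""
          min_a, pmin_a = min(a)                # lowest outcome, its probability
          min_b, pmin_b = min(b)
          max_gain = max(o for g in (a, b) for o, _ in g)

          if abs(min_a - min_b) >= 0.1 * max_gain:   # aspiration: 1/10 of max gain
              return a if min_a > min_b else b
          if abs(pmin_a - pmin_b) >= 0.1:            # aspiration: 1/10 of prob. scale
              return a if pmin_a < pmin_b else b     # lower P(minimum gain) is better
          return a if max(a)[0] > max(b)[0] else b   # last reason: maximum gain

      g1 = [(4000, 0.8), (0, 0.2)]   # risky gamble
      g2 = [(3000, 1.0)]             # sure thing
      print(priority_heuristic(g1, g2))   # minimum gains differ by 3000 >= 400 -> g2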

  16. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  17. Measuring organizational learning. Model testing in two Romanian universities

    OpenAIRE

    Alexandra Luciana Guţă

    2014-01-01

    The scientific literature associates organizational learning with superior organization performance. If we refer to the academic environment, we appreciate that it can develop and reach better levels of performance through changes driven from the inside. Thus, through this paper we elaborate on a conceptual model of organizational learning and we test the model on a sample of employees (university teachers and researchers) from two Romanian universities. The model comprises the process of org...

  18. University Start-ups: A Better Business Model

    Science.gov (United States)

    Dehn, J.; Webley, P. W.

    2015-12-01

    Many universities look to start-up companies as a way to attract faculty, supporting research and students as traditional federal sources become harder to come by. University-affiliated start-up companies can apply for a broader suite of grants, as well as market their services to a broad customer base. Often university administrators see this as a potential panacea, but national statistics show this is not the case. Rarely do universities profit significantly from their start-ups. With a success rate of around 20%, most start-ups end up costing the university money as well as faculty time. For the faculty, assuming they want to continue in academia, a start-up is often unattractive because it commonly leads out of academia. Running a successful business while maintaining a strong teaching and research load is almost impossible. Most business models and business professionals work outside of academia, and the models taught in business schools do not merge well with a university environment. To mitigate this, a new business model is proposed in which university start-ups are aligned with the academic and research missions of the university. A university start-up must work within the university and directly support research and students, and the work done maintaining the business must be recognized as part of the faculty member's university obligations. This requires a complex conflict-of-interest management plan and for the companies to be non-profit in order not to jeopardize the university's status. This approach may not work well for all universities, but would be ideal for many to conserve resources and ensure a harmonious relationship with their start-ups and faculty.

  19. University staff adoption of iPads: An empirical study using an extended TAM model

    Directory of Open Access Journals (Sweden)

    Michael Steven Lane

    2014-11-01

    Full Text Available This research examined key factors influencing the adoption of iPads by university staff. An online survey collected quantitative data to test hypothesised relationships in an extended TAM model. The findings show that university staff consider iPads easy to use and useful, with a high level of compatibility with their work. Social status had no influence on their attitude to using an iPad. However, older university staff, and staff with no previous experience of a similar technology such as an iPhone or smartphone, found iPads less easy to use. Furthermore, a lack of formal end-user ICT support impacted negatively on the use of iPads.

  20. Effect of Religious Beliefs on the Smoking Behaviour of University Students: Quantitative Findings From Malaysia.

    Science.gov (United States)

    Elkalmi, Ramadan M; Alkoudmani, Ramez M; Elsayed, Tarek M; Ahmad, Akram; Khan, Muhammad Umair

    2016-12-01

    The Malaysian official Islamic authorities have issued a "fatwa" (Islamic ruling) regarding smoking which prohibits Muslims from smoking because of its potential harm to health. Since the prevalence of smoking among Malaysian students is high, this study was designed to explore the perceptions and opinions of Malaysian Muslim students towards smoking at the International Islamic University Malaysia. A prospective, cross-sectional study was conducted among School of Science students at the International Islamic University Malaysia. A convenience sampling approach was used to recruit 323 students based on a sample size calculation. A content- and face-validated questionnaire was used to collect the data from the participants. Non-smokers supported the fatwa forbidding smoking more strongly than smokers (94 vs. 64.3%, p = 0.001). A significantly higher proportion of non-smokers believed that Islam prohibits smoking because of its potential harm (94.9 vs. 71.4%, p = 0.001). A majority of smokers agreed that addiction is the main barrier to smoking cessation (78.6 vs. 61.5%, p = 0.019). The results showed positive influences of Islamic beliefs on the non-smokers. Further studies are required to validate these findings by surveying other universities in Malaysia.

  1. Standard Model mass spectrum in inflationary universe

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Wang, Yi [Department of Physics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong (China); Xianyu, Zhong-Zhi [Center of Mathematical Sciences and Applications, Harvard University, 20 Garden Street, Cambridge, MA 02138 (United States)

    2017-04-11

    We work out the Standard Model (SM) mass spectrum during inflation with quantum corrections, and explore its observable consequences in the squeezed limit of non-Gaussianity. Both non-Higgs and Higgs inflation models are studied in detail. We also illustrate how some inflationary loop diagrams can be computed neatly by Wick-rotating the inflation background to Euclidean signature and by dimensional regularization.

  2. Detection and quantitative analysis of ferrocyanide and ferricyanide: FY 93 Florida State University Raman spectroscopy report

    Energy Technology Data Exchange (ETDEWEB)

    Mann, C.K.; Vickers, T.J. [Florida State Univ., Tallahassee, FL (United States). Dept. of Chemistry

    1994-10-11

    This report provides a summary of work to develop and investigate the feasibility of using Raman spectroscopy with tank waste materials. It contains Raman spectra from organics anticipated to be present in tank wastes, such as ethylenediaminetetraacetic acid (EDTA), hydroxyethylenediaminetetraacetic acid (HEDTA), iminodiacetic acid (IDA), kerosene, tributyl phosphate (TBP), acetone and butanol, and spectra from T-107 real and BY-104 simulant materials. The results of investigating Raman for determining moisture content in tank materials are also presented, together with a description of the software algorithms developed to process Raman spectra from a dispersive grating spectrometer system and an initial design for a database to support qualitative and quantitative application of remote Raman sensing with tank wastes.

  3. Detection and quantitative analysis of ferrocyanide and ferricyanide: FY 93 Florida State University Raman spectroscopy report

    International Nuclear Information System (INIS)

    Mann, C.K.; Vickers, T.J.

    1994-01-01

    This report provides a summary of work to develop and investigate the feasibility of using Raman spectroscopy with tank waste materials. It contains Raman spectra from organics anticipated to be present in tank wastes, such as ethylenediaminetetraacetic acid (EDTA), hydroxyethylenediaminetetraacetic acid (HEDTA), iminodiacetic acid (IDA), kerosene, tributyl phosphate (TBP), acetone and butanol, and spectra from T-107 real and BY-104 simulant materials. The results of investigating Raman for determining moisture content in tank materials are also presented, together with a description of the software algorithms developed to process Raman spectra from a dispersive grating spectrometer system and an initial design for a database to support qualitative and quantitative application of remote Raman sensing with tank wastes.

  4. Quantitative assessment of bio-aerosols contamination in indoor air of University dormitory rooms.

    Science.gov (United States)

    Hayleeyesus, Samuel Fekadu; Ejeso, Amanuel; Derseh, Fikirte Aklilu

    2015-07-01

    The purpose of this study is to provide insight into how students are exposed to indoor bio-aerosols in dormitory rooms and to identify the major factors that govern the contamination levels. The bio-aerosol concentration in the indoor air of thirty dormitory rooms of Jimma University was determined by taking 120 samples. A passive air sampling technique, the settle plate method using open Petri dishes containing different culture media, was employed to collect samples twice daily. The bio-aerosol contamination detected in the dormitory rooms ranged from 511 to 9960 CFU/m³ for bacteria and from 531 to 6568 CFU/m³ for fungi. Based on the criteria stated by a WHO expert group, 95 of the 120 samples were above the recommended level. The statistical analysis showed that occupancy significantly affected the bacterial concentrations measured in all dormitory rooms at the 6:00 am sampling time (p-value = 0.000), and that the bacterial concentrations measured in the dormitory rooms differed significantly from each other (p-value = 0.013), reflecting their significant differences in occupancy (p-value = 0.000). Moreover, there was a significant difference in the bacterial contamination level between the 6:00 am and 7:00 pm sampling times (p = 0.015), whereas there was no significant difference in the fungal contamination level between the two sampling times (p = 0.674). There is excessive bio-aerosol contamination in the indoor air of the dormitory rooms of Jimma University; human occupancy produces a marked increase in bacterial contamination levels, and most fungal species present in the room air of the Jimma University dormitories were not human-borne.

  5. Game Based Learning (GBL) adoption model for universities: cesim ...

    African Journals Online (AJOL)

    Game Based Learning (GBL) adoption model for universities: cesim simulation. ... The global market has escalated the need for Game Based Learning (GBL) to offer a wide range of courses since there is a ...

  6. Are Universities Role Models for Communities? A Gender Perspective

    OpenAIRE

    Felicia Cornelia MACARIE; Octavian MOLDOVAN

    2012-01-01

    The present paper explores the degree to which universities could/should serve as role models for communities from the perspective of gender integration. Although the theoretical/ moral answer would be affirmative (universities should be in such a position that would allow local communities to regard them as role models of gender integration), the primary empirical analysis leads to another conclusion. A brief theoretical review (that connects gender discrimination, sustainable development, u...

  7. A universal throw model and its applications

    NARCIS (Netherlands)

    Voort, M.M. van der; Doormaal, J.C.A.M. van; Verolme, E.K.; Weerheijm, J.

    2008-01-01

    A deterministic model has been developed that describes the throw of debris or fragments from a source with an arbitrary geometry and for arbitrary initial conditions. The initial conditions are defined by the distributions of mass, launch velocity and launch direction. The item density in an

  8. University Business Models and Online Practices: A Third Way

    Science.gov (United States)

    Rubin, Beth

    2013-01-01

    Higher Education is in a state of change, and the existing business models do not meet the needs of stakeholders. This article contrasts the current dominant business models of universities, comparing the traditional non-profit against the for-profit online model, examining the structural features and online teaching practices that underlie each.…

  9. Interpretation of protein quantitation using the Bradford assay: comparison with two calculation models.

    Science.gov (United States)

    Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung

    2013-03-01

    The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
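
    A minimal sketch of the residue-counting idea behind the two models; the example sequence is invented, and the paper's actual calibration against protein standards is not reproduced here:

      def dye_binding_sites(seq: str, model: str = "M2") -> int:
          """Count residues assumed to bind Coomassie Brilliant Blue G-250.
          M1 counts Arg (R) and Lys (K); M2 additionally counts His (H)."""
          residues = "RK" if model == "M1" else "RKH"
          return sum(seq.count(r) for r in residues)

      seq = "MKRHHGLKRAKDE"   # illustrative peptide, not from the paper
      print("M1 sites:", dye_binding_sites(seq, "M1"))   # R + K
      print("M2 sites:", dye_binding_sites(seq, "M2"))   # R + K + H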

  10. DEVELOPING A SEVEN METAPHORS MODEL OF MARKETING FOR UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    COITA Dorin-Cristian

    2014-12-01

    Full Text Available The concept of marketing applied in education offers many possibilities for social innovation. It is a tool that helps educational organizations acquire resources and provide value. This article presents a model of seven metaphors that a university can use to acquire resources and provide value to its stakeholders, and applies it to the case of a Romanian university referred to as The University. The aim of the paper is to identify sources of social innovation by using this model in the field of educational marketing.

  11. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  12. DISTANCE AS KEY FACTOR IN MODELLING STUDENTS’ RECRUITMENT BY UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    SIMONA MĂLĂESCU

    2015-10-01

    Full Text Available Distance as Key Factor in Modelling Students’ Recruitment by Universities. In a previous paper analysing the challenge of keeping up with current methodologies in the analysis and modelling of students’ recruitment by universities, in the case of some ECE countries which still do not register or develop the key data needed to take advantage of the state of the art in the domain, we promised to approach the distance factor in future work, owing to the extent of the topic. This paper fulfils that promise, bringing a review of the literature dealing with modelling the geographical recruitment area of a university; combining distance with the proximate key factors previously reviewed completes the meta-analysis of the existing literature we started a year ago. Beyond the theoretical benefit, from a practical perspective the meta-analysis aims at synthesizing elements of good practice that can be applied to the local university system.

  13. Quantitative modelling and analysis of a Chinese smart grid: a stochastic model checking case study

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    Cyber-physical systems integrate information and communication technology with the physical elements of a system, mainly for monitoring and controlling purposes. The conversion of the traditional power grid into a smart grid, a fundamental example of a cyber-physical system, raises a number of issues that require novel methods and applications. One of the important issues in this context is the verification of certain quantitative properties of the system. In this paper, we consider a specific Chinese smart grid implementation as a case study and address the verification problem for performance and energy consumption. We employ a stochastic model checking approach and present our modelling and analysis study using the PRISM model checker.
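
    PRISM has its own modelling language; purely as an illustration of the statistical (simulation-based) side of such verification, the sketch below estimates a quantitative property of a toy continuous-time Markov chain by Monte Carlo simulation. The two-state model and its rates are invented and bear no relation to the case study's model.

      import random

      # Toy CTMC: a grid component alternates between "ok" and "congested".
      # Transition rates (per hour) are invented for illustration.
      RATES = {"ok": ("congested", 0.2), "congested": ("ok", 1.0)}

      def simulate(horizon=24.0):
          """One trajectory; True if total 'congested' time exceeds 4 hours."""
          state, t, congested_time = "ok", 0.0, 0.0
          while t < horizon:
              nxt, rate = RATES[state]
              dwell = random.expovariate(rate)   # exponential holding time
              if state == "congested":
                  congested_time += min(dwell, horizon - t)
              t += dwell
              state = nxt
          return congested_time > 4.0

      # Statistical-model-checking-style estimate of the property probability
      n = 20_000
      p = sum(simulate() for _ in range(n)) / n
      print(f"estimated P(congested > 4h within 24h) = {p:.3f}")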

  14. Modeling Factors with Influence on Sustainable University Management

    Directory of Open Access Journals (Sweden)

    Oana Dumitrascu

    2015-01-01

    Full Text Available The main objective of this paper is to present the factors that influence sustainable university management and the relationships between them. The scientific approach begins from a graphical model, according to which extracurricular activities, together with internal environmental factors, influence students’ involvement in such activities, the university’s attractiveness, students’ academic performance and their integration into the socio-economic and natural environment (components related to sustainable development). The model emphasizes that individual performances related to students’ participation in extracurricular activities have a positive influence on the sustainability of university management. The results of the study show that university sustainability may be influenced by a number of factors, such as students’ performance, students’ involvement in extracurricular activities or the university’s attractiveness, and can in turn implicitly influence the sustainability of university management. The originality of the paper consists in studying these relationships using modeling methods in general and informatics modeling tools in particular, as well as in the graphical visualization of some influences on sustainable university management.

  15. Inflationary universe models and the formation of structure

    International Nuclear Information System (INIS)

    Brandenberger, R.H.

    1987-01-01

    The main features of inflationary universe models are briefly reviewed. Inflation provides a mechanism which produces energy density fluctuations on cosmological scales. In the original models, it was not possible to obtain the correct magnitude of these fluctuations without fine tuning the particle physics models. Two mechanisms, chaotic inflation, and a dynamical relaxation process are discussed by which inflation may be realized in models which give the right magnitude of fluctuations. 22 references

  16. Universal correlators for multi-arc complex matrix models

    International Nuclear Information System (INIS)

    Akemann, G.

    1997-01-01

    The correlation functions of the multi-arc complex matrix model are shown to be universal for any finite number of arcs. The universality classes are characterized by the support of the eigenvalue density and are conjectured to fall into the same classes as the ones recently found for the Hermitian model. This is explicitly shown to be true for the case of two arcs, apart from the known result for one arc. The basic tool is the iterative solution of the loop equation for the complex matrix model with multiple arcs, which provides all multi-loop correlators up to an arbitrary genus. Explicit results for genus one are given for any number of arcs. The two-arc solution is investigated in detail, including the double-scaling limit. In addition universal expressions for the string susceptibility are given for both the complex and Hermitian model. (orig.)

  17. A 3 + 1 Regge calculus model of the Taub universe

    International Nuclear Information System (INIS)

    Tuckey, P.A.

    1988-01-01

    The Piran and Williams [1986 Phys. Rev. D 33,1622] second-order formulation of 3 + 1 Regge calculus is used to calculate the evolution of a model of the Taub universe. The model displays qualitatively the correct behaviour, thereby giving some verification of the 3 + 1 formulation. (author)

  18. Proven collaboration model for impact generating research with universities

    CSIR Research Space (South Africa)

    Bezuidenhout, DF

    2010-09-01

    Full Text Available -optics, image processing and computer vision. This paper presents the research collaboration model with universities that has ensured the PRISM programme's success. It is shown that this collaboration model has resulted in a pipeline of highly-skilled people...

  19. Integrating an Interprofessional Education Model at a Private University

    Science.gov (United States)

    Parker, Ramona Ann; Gottlieb, Helmut; Dominguez, Daniel G.; Sanchez-Diaz, Patricia C.; Jones, Mary Elaine

    2015-01-01

    In 2012, a private University in South Texas sought to prepare eight cohorts of 25 nursing, optometry, pharmacy, physical therapy, and health care administration students with an interprofessional education activity as a model for collaborative learning. The two semester interprofessional activity used a blended model (Blackboard Learn®,…

  20. Are Universities Role Models for Communities? A Gender Perspective

    Directory of Open Access Journals (Sweden)

    Felicia Cornelia MACARIE

    2012-12-01

    Full Text Available The present paper explores the degree to which universities could/should serve as role models for communities from the perspective of gender integration. Although the theoretical/moral answer would be affirmative (universities should be in such a position that would allow local communities to regard them as role models of gender integration), the primary empirical analysis leads to another conclusion. A brief theoretical review (that connects gender discrimination, sustainable development, universities and local communities) is followed by an empirical analysis that compares the management structures of 12 Romanian Universities of Advanced Research and Education (the best Romanian universities according to a national ranking) with those of four local communities where they are located (as geographic proximity would lead to a better diffusion of best practices). Contrary to initial expectations, even in higher education institutions, women are underrepresented both in executive and legislative positions. Since universities are subject to the same major patterns of gender discrimination (such as role theory, the glass ceiling and the glass elevator) as private and public organizations, they lose the moral high ground that theory would suggest. However, medicine and pharmacy universities, which can be connected with the traditional roles attributed to women, provide better gender integration, but glass escalator phenomena remain present even in these limited fields.

  1. A time-symmetric Universe model and its observational implication

    International Nuclear Information System (INIS)

    Futamase, T.; Matsuda, T.

    1987-01-01

    A time-symmetric closed-universe model is discussed in terms of the radiation arrow of time. The time symmetry requires the occurrence of advanced waves in the recontracting phase of the Universe. The observational consequences of such advanced waves are considered, and it is shown that a test observer in the expanding phase can observe a time-reversed image of a source of radiation in the future recontracting phase

  2. Time-symmetric universe model and its observational implication

    Energy Technology Data Exchange (ETDEWEB)

    Futamase, T.; Matsuda, T.

    1987-08-01

    A time-symmetric closed-universe model is discussed in terms of the radiation arrow of time. The time symmetry requires the occurrence of advanced waves in the recontracting phase of the Universe. We consider the observational consequences of such advanced waves, and it is shown that a test observer in the expanding phase can observe a time-reversed image of a source of radiation in the future recontracting phase.

  3. University-Industry Research Collaboration: A Model to Assess University Capability

    Science.gov (United States)

    Abramo, Giovanni; D'Angelo, Ciriaco Andrea; Di Costa, Flavia

    2011-01-01

    Scholars and policy makers recognize that collaboration between industry and the public research institutions is a necessity for innovation and national economic development. This work presents an econometric model which expresses the university capability for collaboration with industry as a function of size, location and research quality. The…

  4. Faculties of Education in Traditional Universities and Universities of the Third Age: A Partnership Model in Gerontagogy

    Science.gov (United States)

    Lemieux, Andre; Boutin, Gerald; Riendeau, Jean

    2007-01-01

    This article discusses "Universities of the Third Age", whose function is quite distinct from established universities' traditional role in teaching, research, and community services. Consequently, there is an urgent need to develop a model of partnership between traditional universities and Universities of the Third Age, ensuring better…

  5. Defect evolution in cosmology and condensed matter quantitative analysis with the velocity-dependent one-scale model

    CERN Document Server

    Martins, C J A P

    2016-01-01

    This book sheds new light on topological defects in widely differing systems, using the Velocity-Dependent One-Scale Model to better understand their evolution. Topological defects (cosmic strings, monopoles, domain walls or others) necessarily form at cosmological (and condensed matter) phase transitions. If they are stable and long-lived they will be fossil relics of higher-energy physics. Understanding their behaviour and consequences is a key part of any serious attempt to understand the universe, and this requires modelling their evolution. The velocity-dependent one-scale model is the only fully quantitative model of defect network evolution, and the canonical model in the field. This book provides a review of the model, explaining its physical content and describing its broad range of applicability.

  6. A Review of Quantitative Situation Assessment Models for Nuclear Power Plant Operators

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Situation assessment is the process of developing situation awareness, and situation awareness is defined as 'the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning and the projection of their status in the near future.' Situation awareness is an important element influencing human actions because human decision making is based on the result of situation assessment or situation awareness. There are many models of situation awareness, and those models can be categorized as qualitative or quantitative. As the effects of some input factors on situation awareness can be investigated through quantitative models, the quantitative models are more useful than the qualitative models for the design of operator interfaces, automation strategies, training programs, and so on. This study presents a review of two quantitative models of situation assessment (SA) for nuclear power plant operators.

  7. Improvement of the ID model for quantitative network data

    DEFF Research Database (Denmark)

    Sørensen, Peter Borgen; Damgaard, Christian Frølund; Dupont, Yoko Luise

    2015-01-01

    Many interactions are often poorly registered or even unobserved in empirical quantitative networks. Hence, the output of the statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks (Sørensen et al., Journal of Pollination Ecology, 6(18), 2011, pp. 129-139). This presentation will illustrate the application of the ID method based on a data set which consists of counts of visits by 152 pollinator species to 16 plant species. The method is based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1), pi ... reproduce the high number of zero-valued cells in the data set and mimic the sampling distribution.

  8. Cloud Computing Adoption Model for Universities to Increase ICT Proficiency

    Directory of Open Access Journals (Sweden)

    Safiya Okai

    2014-08-01

    Full Text Available Universities around the world, especially those in developing countries, are faced with the problem of delivering the level of information and communications technology (ICT) needed to facilitate the teaching, learning, research, and development activities ideal in a typical university, in line with advancement in technology and the growing dependence on IT. This is mainly due to the high cost involved in providing and maintaining the needed hardware and software. A technology such as cloud computing, which delivers on-demand provisioning of IT resources on a pay-per-use basis, can be used to address this problem. Cloud computing promises better delivery of IT services as well as availability whenever and wherever needed, at reduced costs, with users paying only as much as they consume through the services of cloud service providers. The cloud technology reduces complexity while increasing the speed and quality of IT services provided; however, despite these benefits, the challenges that come with its adoption have left many sectors, especially higher education, skeptical about committing to this technology. This article identifies the reasons for the slow rate of adoption of cloud computing at university level, discusses the challenges faced and proposes a cloud computing adoption model that contains strategic guidelines to overcome the major challenges identified and a roadmap for the successful adoption of cloud computing by universities. The model was tested in one of the universities and found to be both useful and appropriate for adopting cloud computing at university level.

  9. A general mixture model for mapping quantitative trait loci by using molecular markers

    NARCIS (Netherlands)

    Jansen, R.C.

    1992-01-01

    In a segregating population a quantitative trait may be considered to follow a mixture of (normal) distributions, the mixing proportions being based on Mendelian segregation rules. A general and flexible mixture model is proposed for mapping quantitative trait loci (QTLs) by using molecular markers.
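
    As a concrete instance of such a mixture, in a backcross the unobserved QTL genotype contributes mixing proportions determined by Mendelian segregation and the recombination fraction with a marker. A minimal sketch of the resulting two-component normal likelihood (all numbers are illustrative, and the paper's general model is far more flexible):

      import math

      def normal_pdf(x, mu, sigma):
          return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

      def mixture_loglik(phenotypes, mix, mus, sigma):
          """Log-likelihood under a two-component normal mixture.
          mix: P(QTL genotype Qq | marker class), assumed from segregation
          rules; mus: means for (Qq, qq); sigma: common residual SD."""
          ll = 0.0
          for y in phenotypes:
              ll += math.log(mix * normal_pdf(y, mus[0], sigma)
                             + (1 - mix) * normal_pdf(y, mus[1], sigma))
          return ll

      # Illustrative: marker class with recombination fraction r = 0.1, so
      # mix = 1 - r = 0.9 for this class in a backcross.
      ys = [10.2, 12.1, 9.8, 11.5, 10.9]
      print(mixture_loglik(ys, mix=0.9, mus=(11.0, 10.0), sigma=1.0))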

  10. Sloppy-model universality class and the Vandermonde matrix.

    Science.gov (United States)

    Waterfall, Joshua J; Casey, Fergal P; Gutenkunst, Ryan N; Brown, Kevin S; Myers, Christopher R; Brouwer, Piet W; Elser, Veit; Sethna, James P

    2006-10-13

    In a variety of contexts, physicists study complex, nonlinear models with many unknown or tunable parameters to explain experimental data. We explain why such systems so often are sloppy: the system behavior depends only on a few "stiff" combinations of the parameters and is unchanged as other "sloppy" parameter combinations vary by orders of magnitude. We observe that the eigenvalue spectra for the sensitivity of sloppy models have a striking, characteristic form with a density of logarithms of eigenvalues which is roughly constant over a large range. We suggest that the common features of sloppy models indicate that they may belong to a common universality class. In particular, we motivate focusing on a Vandermonde ensemble of multiparameter nonlinear models and show in one limit that they exhibit the universal features of sloppy models.
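
    The characteristic spectrum is easy to reproduce with the textbook sloppy problem of fitting sums of decaying exponentials. A minimal sketch (the rates and sample points are invented, not taken from the paper):

      import numpy as np

      # Model y(t) = sum_k exp(-theta_k * t): a classic "sloppy" fit.
      # We form J^T J (the Gauss-Newton Hessian of a least-squares fit)
      # and inspect its eigenvalue spectrum.
      theta = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # illustrative decay rates
      t = np.linspace(0.1, 3.0, 40)

      # d/dtheta_k exp(-theta_k t) = -t exp(-theta_k t)
      J = np.stack([-t * np.exp(-th * t) for th in theta], axis=1)
      H = J.T @ J
      eigs = np.linalg.eigvalsh(H)
      # Logs of the eigenvalues are typically roughly evenly spaced over
      # many decades, the signature of sloppiness described above.
      print(np.log10(eigs))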

  11. Internal Universes in Models of Homotopy Type Theory

    DEFF Research Database (Denmark)

    Licata, Daniel R.; Orton, Ian; Pitts, Andrew M.

    2018-01-01

    We show that universes of fibrations in various models of homotopy type theory have an essentially global character: they cannot be described in the internal language of the presheaf topos from which the model is constructed. We get around this problem by extending the internal language with a mo… that the interval in cubical sets does indeed have. This leads to a completely internal development of models of homotopy type theory within what we call crisp type theory.

  12. Development of a universal dual-bolus injection scheme for the quantitative assessment of myocardial perfusion cardiovascular magnetic resonance

    Directory of Open Access Journals (Sweden)

    Alfakih Khaled

    2011-05-01

    Full Text Available Abstract Background The dual-bolus protocol enables accurate quantification of myocardial blood flow (MBF) by first-pass perfusion cardiovascular magnetic resonance (CMR). However, despite the advantages of and increasing demand for the dual-bolus method for accurate quantification of MBF, thus far it has not been widely used in the field of quantitative perfusion CMR. The main reasons for this are that the setup for the dual-bolus method is complex and requires a state-of-the-art injector, and that there is also a lack of post-processing software. As a solution to one of these problems, we have devised a universal dual-bolus injection scheme for use in a clinical setting. The purpose of this study is to show the setup and feasibility of the universal dual-bolus injection scheme. Methods The universal dual-bolus injection scheme was tested using multiple combinations of different contrast agents, contrast agent doses, power injectors, perfusion sequences, and CMR scanners. This included 3 different contrast agents (Gd-DO3A-butrol, Gd-DTPA and Gd-DOTA), 4 different doses (0.025 mmol/kg, 0.05 mmol/kg, 0.075 mmol/kg and 0.1 mmol/kg), 2 different types of injectors (with and without a "pause" function), 5 different sequences (turbo field echo (TFE), balanced TFE, k-space and time (k-t) accelerated TFE, k-t accelerated balanced TFE, and turbo fast low-angle shot) and 3 different CMR scanners from 2 different manufacturers. The relation between the time width of the dilute contrast agent bolus curve and cardiac output was obtained to determine the optimal predefined pause duration between dilute and neat contrast agent injection. Results 161 dual-bolus perfusion scans were performed. Three non-injector-related technical errors were observed (1.9%). No injector-related errors were observed. The dual-bolus scheme worked well in all combinations of parameters when the optimal predefined pause was used. Linear regression analysis showed that the optimal duration for the predefined…

  13. A Physical – Geometrical Model of an Early Universe

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2014-12-01

    Full Text Available A physical-geometrical model for a possible early universe is proposed. One considers an initial singularity containing the energy of the whole universe. The singularity expands as a spherical wave at the speed of light, generating space and time. The relations of the special theory of relativity, quantum mechanics and gas kinetics are considered applicable. A structuring of the primary wave is adopted on grounds of geometrical simplicity as well as of satisfying the conservation laws. The evolution is able to lead to particles whose mass and radius are very close to those of the neutron. The currently accepted values for the radius and mass of the universe, as well as the temperature of the background radiation (3-5 K), can be obtained using the proposed model.

  14. Universal and blocking primer mismatches limit the use of high-throughput DNA sequencing for the quantitative metabarcoding of arthropods.

    Science.gov (United States)

    Piñol, J; Mir, G; Gomez-Polo, P; Agustí, N

    2015-07-01

    The quantification of the biological diversity in environmental samples using high-throughput DNA sequencing is hindered by the PCR bias caused by variable primer-template mismatches of the individual species. In some dietary studies, there is the added problem that samples are enriched with predator DNA, so often a predator-specific blocking oligonucleotide is used to alleviate the problem. However, specific blocking oligonucleotides could coblock nontarget species to some degree. Here, we accurately estimate the extent of the PCR biases induced by universal and blocking primers on a mock community prepared with DNA of twelve species of terrestrial arthropods. We also compare universal and blocking primer biases with those induced by variable annealing temperature and number of PCR cycles. The results show that reads of all species were recovered after PCR enrichment at our control conditions (no blocking oligonucleotide, 45 °C annealing temperature and 40 cycles) and high-throughput sequencing. They also show that the four factors considered biased the final proportions of the species to some degree. Among these factors, the number of primer-template mismatches of each species had a disproportionate effect (up to five orders of magnitude) on the amplification efficiency. In particular, the number of primer-template mismatches explained most of the variation (~3/4) in the amplification efficiency of the species. The effect of blocking oligonucleotide concentration on nontarget species relative abundance was also significant, but less important (below one order of magnitude). Considering the results reported here, the quantitative potential of the technique is limited, and only qualitative results (the species list) are reliable, at least when targeting the barcoding COI region. © 2014 John Wiley & Sons Ltd.
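
    The disproportionate effect of mismatches follows from amplification being exponential in the number of cycles, so a per-cycle efficiency penalty compounds. A back-of-the-envelope sketch (the efficiency values are invented; the study estimates such effects empirically):

      # Final read proportion of species i after n cycles is roughly
      # proportional to a_i * (1 + e_i)**n, where a_i is the initial share
      # and e_i the per-cycle amplification efficiency (invented below).
      n_cycles = 40
      species = {
          "perfect_match": (0.5, 0.95),
          "two_mismatches": (0.5, 0.70),
      }
      raw = {k: a * (1 + e) ** n_cycles for k, (a, e) in species.items()}
      total = sum(raw.values())
      for k, v in raw.items():
          # equal inputs end up hundreds of times apart in final proportion
          print(f"{k}: {v / total:.4f}")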

  15. Explaining formation of Astronomical Jets using Dynamic Universe Model

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    2016-07-01

    Astronomical jets are observed from the centres of many Galaxies including our own Milkyway. The formation of such jet is explained using SITA simulations of Dynamic Universe Model. For this purpose the path traced by a test neutron is calculated and depicted using a set up of one densemass of the mass equivalent to mass of Galaxy center, 90 stars with similar masses of stars near Galaxy center, mass equivalents of 23 Globular Cluster groups, 16 Milkyway parts, Andromeda and Triangulum Galaxies at appropriate distances. Five different kinds of theoretical simulations gave positive results The path travelled by this test neutron was found to be an astronomical jet emerging from Galaxy center. This is another result from Dynamic Universe Model. It solves new problems like a. Variable Mass Rocket Trajectory Problem b. Explaining Very long baseline interferometry (VLBI) observations c. Astronomical jets observed from Milkyway Center d. Prediction of Blue shifted Galaxies e. Explaining Pioneer Anomaly f. Prediction of New Horizons satellite trajectory etc. Dynamic Universe Model never reduces to General relativity on any condition. It uses a different type of mathematics based on Newtonian physics. This mathematics used here is simple and straightforward. As there are no differential equations present in Dynamic Universe Model, the set of equations give single solution in x y z Cartesian coordinates for every point mass for every time step

  16. On distinguishing different models of a class of emergent Universe ...

    Indian Academy of Sciences (India)

    Souvik Ghose

    2018-02-20

    ... the same class of EU in light of union compilation data (SNIa), which consists of over a hundred data points ... Keywords: dark energy; emergent Universe; observational data ... [Figure: μ vs. z curves for different EU models.]

  17. Modeling Environmental Literacy of Malaysian Pre-University Students

    Science.gov (United States)

    Shamuganathan, Sheila; Karpudewan, Mageswary

    2015-01-01

    In this study attempt was made to model the environmental literacy of Malaysian pre-university students enrolled in a matriculation college. Students enrolled in the matriculation colleges in Malaysia are the top notch students in the country. Environmental literacy of this group is perceived important because in the future these students will be…

  18. Changing the Business Model of a Distance Teaching University

    NARCIS (Netherlands)

    Koper, Rob

    2014-01-01

    Reference: Koper, E.J.R. (2014) Changing the Business Model of a Distance Teaching University. In R. Huang, Kinshuk, Price, J.K. (eds.), ICT in Education in Global Context: emerging trends report 2013-2014, Lecture Notes in Educational Technology, Heidelberg: Springer Verlag, pp. 185-203 ISBN

  19. The University Model and Educational Change. SSEC Publication No. 130.

    Science.gov (United States)

    Ford, Richard B.

    In the sixties the crisis of the credibility and competence of schools resulted in the funding of programs to remedy school problems. The model for curriculum reform came from the university and, more particularly, from liberal arts departments having the capacity to improve curriculum content and teacher expertise. In a few instances attempts…

  20. A Universal Model for the Normative Evaluation of Internet Information.

    NARCIS (Netherlands)

    Spence, E.H.

    2009-01-01

    Beginning with the initial premise that the Internet has a global character, the paper will argue that the normative evaluation of digital information on the Internet necessitates an evaluative model that is itself universal and global in character (I agree, therefore, with Gorniak-Kocikowska's…

  1. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  2. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  3. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  4. University education: From Humboldt's model to the Bologna process

    Directory of Open Access Journals (Sweden)

    Bodroški-Spariosu Biljana

    2015-01-01

    Full Text Available The characteristics of European university education in the context of the Bologna process are the topic of this article. The aim is to analyze the key issues in university education in comparison to the classic, or Humboldt's, model. In periods of extensive reform of higher education it is important to review the place and role of the university from the standpoint of institutional characteristics, the dominant educational orientation and the attitude towards society. The Bologna process initiated three key changes in the European system of university education: (a) a change of institutional framework, from the binary to the so-called uniquely diversified system; (b) a change of dominant orientation, placing the student rather than science at the centre of education; (c) a change in the social role of the university, from the development of science and impartial critique of society towards providing educational services to the market. The pedagogic implications of these changes open up questions about the purpose of education, the relations between professors and students and the identity of the modern university itself.

  5. Linear regression models for quantitative assessment of left ...

    African Journals Online (AJOL)

    Changes in left ventricular structures and function have been reported in cardiomyopathies. No prediction models have been established in this environment. This study established regression models for prediction of left ventricular structures in normal subjects. A sample of normal subjects was drawn from a large urban ...

  6. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  7. Towards the quantitative evaluation of visual attention models.

    Science.gov (United States)

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt at the organization and classification of models, but are not sufficient at quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Universe before Planck time: A quantum gravity model

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1983-01-01

    A model for quantum gravity can be constructed by treating the conformal degree of freedom of spacetime as a quantum variable. An isotropic, homogeneous cosmological solution in this quantum gravity model is presented. The spacetime is nonsingular for all three possible values of the three-space curvature, and agrees with the classical solution for time scales larger than the Planck time scale. A possibility of quantum fluctuations creating the matter in the universe is suggested.

  9. The Development of an Intelligent Leadership Model for State Universities

    OpenAIRE

    Aleme Keikha; Reza Hoveida; Nour Mohammad Yaghoubi

    2017-01-01

    Higher education and intelligent leadership are considered important parts of every country’s education system, which could potentially play a key role in accomplishing the goals of society. In theories of leadership, new patterns attempt to view leadership through the prism of creative and intelligent phenomena. This paper aims to design and develop an intelligent leadership model for public universities. A qualitative-quantitative research method was used to design a basic model of intellige...

  10. Modelling the implications of moving towards universal coverage in Tanzania.

    Science.gov (United States)

    Borghi, Josephine; Mtei, Gemini; Ally, Mariam

    2012-03-01

    A model was developed to assess the impact of possible moves towards universal coverage in Tanzania over a 15-year time frame. Three scenarios were considered: maintaining the current situation ('the status quo'); expanded health insurance coverage (the estimated maximum achievable coverage in the absence of premium subsidies, coverage restricted to those who can pay); universal coverage to all (government revenues used to pay the premiums for the poor). The model estimated the costs of delivering public health services and all health services to the population as a proportion of Gross Domestic Product (GDP), and forecast revenue from user fees and insurance premiums. Under the status quo, financial protection is provided to 10% of the population through health insurance schemes, with the remaining population benefiting from subsidized user charges in public facilities. Seventy-six per cent of the population would benefit from financial protection through health insurance under the expanded coverage scenario, and 100% of the population would receive such protection through a mix of insurance cover and government funding under the universal coverage scenario. The expanded and universal coverage scenarios have a significant effect on utilization levels, especially for public outpatient care. Universal coverage would require an initial doubling in the proportion of GDP going to the public health system. Government health expenditure would increase to 18% of total government expenditure. The results are sensitive to the cost of health system strengthening, the level of real GDP growth, provider reimbursement rates and administrative costs. Promoting greater cross-subsidization between insurance schemes would provide sufficient resources to finance universal coverage. Alternatively, greater tax funding for health could be generated through an increase in the rate of Value-Added Tax (VAT) or expanding the income tax base. The feasibility and sustainability of efforts to
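    Although the abstract reports results only, the underlying accounting is simple: coverage fractions drive utilization, which drives cost as a share of GDP. The Python sketch below is a toy illustration of that structure, assuming a linear utilization uplift; the coverage fractions come from the abstract, while the cost parameters are invented placeholders rather than the paper's inputs.

        def cost_share_of_gdp(coverage, base_share=0.02, utilisation_uplift=1.6):
            """Public health cost as a share of GDP, assuming utilisation
            (and hence cost) rises linearly with insured coverage.
            base_share and utilisation_uplift are hypothetical values."""
            return base_share * (1 + (utilisation_uplift - 1) * coverage)

        # Coverage fractions for the three scenarios described above
        scenarios = {"status quo": 0.10, "expanded": 0.76, "universal": 1.00}
        for name, coverage in scenarios.items():
            print(f"{name:>10}: coverage {coverage:4.0%}, "
                  f"cost {cost_share_of_gdp(coverage):.2%} of GDP")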

  11. Hidden Markov Model for quantitative prediction of snowfall and ...

    Indian Academy of Sciences (India)

    J. Earth Syst. Sci. (2017) 126: 33. …ology, climate change, glaciology and crop models in agriculture. Different… In areas where local topography strongly influences precipitation… (vii) cloud amount, (viii) cloud type and (ix) sunshine hours.

  12. THE MODEL OF UNIVERSAL BANKING SUPERMARKET IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Tatiana Manolievna GORDITSA

    2017-06-01

    Full Text Available The article presents the author's conceptual approach to the various scientific concepts of both traditional and universal banking services, and shows the degree to which the latter has been transformed into the model of the financial supermarket: the culmination of modern retail banking, a structure formed by the globalization of the finance and credit industry. The article analyses the category of "financial supermarket" and draws out a common idea from the main features of this organizational model of banking services: (1) complex banking services satisfying customers' needs; (2) the bundling of banking and financial products (services); (3) product line extension, standardization and large-scale sales; (4) remote banking. The bundling of products (services) in this model allows maximal integration of financial services, operations and products, including banking, consulting, insurance and investment services, at the same office. Analysis of the scientific literature shows that the organizational structure of service in a Ukrainian universal bank mostly resembles the financial supermarket model. However, current restrictions of the Ukrainian legal system, and the existence of a transition stage caused by the gradual adoption of financial and technological innovations (evolutionary-innovative development), are not taken into account. From this angle, the author describes a transition model, from a universal bank to a financial supermarket: the universal banking supermarket. The model's distinctive feature is the application of improved service technology, which has driven the transformation of modern banking operations, services and products in Ukraine from the simplest to the complex.

  13. Models for universal reduction of macroscopic quantum fluctuations

    International Nuclear Information System (INIS)

    Diosi, L.

    1988-10-01

    If quantum mechanics is universal, then macroscopic bodies would, in principle, possess macroscopic quantum fluctuations (MQF) in their positions, orientations, densities etc. Such MQF, however, are not observed in nature. The hypothesis is adopted that the absence of MQF is due to a certain universal mechanism. Gravitational measures were applied for reducing MQF of the mass density. This model leads to classical trajectories in the macroscopic limit of translational motion. For massive objects, unwanted macroscopic superpositions of quantum states will be destroyed within short times. (R.P.) 34 refs

  14. Designs that make a difference: the Cardiac Universal Bed model.

    Science.gov (United States)

    Johnson, Jackie; Brown, Katherine Kay; Neal, Kelly

    2003-01-01

    Information contained in this article includes some of the findings from a joint research project conducted by Corazon Consulting and Ohio State University Medical Center on national trends in Cardiac Universal Bed (CUB) utilization. This article outlines current findings and "best practice" standards related to the benefits of developing care delivery models to differentiate an organization with a competitive advantage in the highly dynamic marketplace of cardiovascular care. (OSUMC, a Corazon client, is incorporating the CUB into their Ross Heart Hospital slated to open this spring.)

  15. Establishing a business process reference model for Universities

    DEFF Research Database (Denmark)

    Svensson, Carsten; Hvolby, Hans-Henrik

    2012-01-01

    Modern universities are by any standard complex organizations that, from an IT perspective, present a number of unique challenges. This paper will propose establishing a business process reference framework. The benefit to the users would be a better understanding of the system landscape, business......) have gained popularity among organizations in both the private and public sectors. We speculate that this success can be replicated in a university setting. Furthermore the paper will outline how the research group suggests moving ahead with the research which will lead to a reference model....

  16. Formation of a ''child'' universe in an inflationary cosmological model

    International Nuclear Information System (INIS)

    Holcomb, K.A.; Park, S.J.; Vishniac, E.T.

    1989-01-01

    The evolution of a flat, spherically symmetric cosmological model, containing radiation and an inhomogeneous scalar field, is simulated numerically to determine whether the inhomogeneity could cause a ''child'' universe, connected by a wormhole to the external universe, to form. The gravitational and field quantities were computed self-consistently by means of the techniques of numerical relativity. Although we were unable to follow the process to its completion, preliminary indications are that the ''budding'' phenomenon could occur under very general initial conditions, as long as the scalar field is sufficiently inhomogeneous that the wormhole forms before the inflation is damped by the expansion of the background spacetime

  17. Probing Models of Dark Matter and the Early Universe

    Science.gov (United States)

    Orlofsky, Nicholas David

    This thesis discusses models for dark matter (DM) and their behavior in the early universe. An important question is how phenomenological probes can directly search for signals of DM today. Another topic of investigation is how the DM and other processes in the early universe must evolve. Then, astrophysical bounds on early universe dynamics can constrain DM. We will consider these questions in the context of three classes of DM models--weakly interacting massive particles (WIMPs), axions, and primordial black holes (PBHs). Starting with WIMPs, we consider models where the DM is charged under the electroweak gauge group of the Standard Model. Such WIMPs, if generated by a thermal cosmological history, are constrained by direct detection experiments. To avoid present or near-future bounds, the WIMP model or cosmological history must be altered in some way. This may be accomplished by the inclusion of new states that coannihilate with the WIMP or a period of non-thermal evolution in the early universe. Future experiments are likely to probe some of these altered scenarios, and a non-observation would require a high degree of tuning in some of the model parameters in these scenarios. Next, axions, as light pseudo-Nambu-Goldstone bosons, are susceptible to quantum fluctuations in the early universe that lead to isocurvature perturbations, which are constrained by observations of the cosmic microwave background (CMB). We ask what it would take to allow axion models in the face of these strong CMB bounds. We revisit models where inflationary dynamics modify the axion potential and discuss how isocurvature bounds can be relaxed, elucidating the difficulties in these constructions. Avoiding disruption of inflationary dynamics provides important limits on the parameter space. Finally, PBHs have received interest in part due to observations by LIGO of merging black hole binaries. We ask how these PBHs could arise through inflationary models and investigate the opportunity

  18. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Science.gov (United States)

    2012-07-17

    …models to generate quantitative estimates of the benefits and risks of influenza vaccination. The public workshop, "Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A Technical Workshop"…

  19. Exploiting linkage disequilibrium in statistical modelling in quantitative genomics

    DEFF Research Database (Denmark)

    Wang, Lei

    Alleles at two loci are said to be in linkage disequilibrium (LD) when they are correlated or statistically dependent. Genomic prediction and gene mapping rely on the existence of LD between genetic markers and causal variants of complex traits. In the first part of the thesis, a novel method...... to quantify and visualize local variation in LD along chromosomes is described, and applied to characterize LD patterns at the local and genome-wide scale in three Danish pig breeds. In the second part, different ways of taking LD into account in genomic prediction models are studied. One approach is to use...... the recently proposed antedependence models, which treat neighbouring marker effects as correlated; another approach involves use of haplotype block information derived using the program Beagle. The overall conclusion is that taking LD information into account in genomic prediction models potentially improves...
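    For readers unfamiliar with the notation, the standard pairwise LD measure that such analyses build on is the squared allelic correlation r². A minimal computation (illustrative only, not code from the thesis):

        def ld_r2(p_a, p_b, p_ab):
            """Squared allelic correlation r^2 between two biallelic loci,
            from allele frequencies p_a, p_b and haplotype frequency p_ab."""
            d = p_ab - p_a * p_b  # linkage disequilibrium coefficient D
            return d ** 2 / (p_a * (1 - p_a) * p_b * (1 - p_b))

        # Example: p(A) = 0.4, p(B) = 0.3, p(AB) = 0.2
        print(ld_r2(0.4, 0.3, 0.2))  # ~0.127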

  20. The place of quantitative energy models in a prospective approach

    International Nuclear Information System (INIS)

    Taverdet-Popiolek, N.

    2009-01-01

    Futurology above all depends on having the right mindset. Gaston Berger summarizes the prospective approach in five main thrusts: prepare for the distant future, be open-minded (have a systems and multidisciplinary approach), carry out in-depth analyses (draw out the actors that really determine the future, as well as established trends), take risks (imagine risky but flexible projects) and finally think about humanity, futurology being a technique at the service of man to help him build a desirable future. Forecasting, on the other hand, is based on quantified models from which 'conclusions' about the future are deduced. In the field of energy, models are used to draw up scenarios which allow, for instance, measuring the medium- or long-term effects of energy policies on greenhouse gas emissions or global welfare. Scenarios are shaped by the model's inputs (parameters, sets of assumptions) and outputs. Resorting to a model or projecting by scenario is useful in a prospective approach, as it ensures coherence for most of the variables that have been identified through systems analysis and that the mind on its own has difficulty grasping. Interpretation of each scenario must be carried out in the light of the underlying framework of assumptions (the backdrop) developed during the prospective stage. When the horizon is far away (very long term), the worlds imagined by the futurologist contain breaks (technological, behavioural and organizational) which are hard to integrate into the models. Herein lies the main limit on the use of models in futurology. (author)

  1. Improved Mental Acuity Forecasting with an Individualized Quantitative Sleep Model

    Directory of Open Access Journals (Sweden)

    Brent D. Winslow

    2017-04-01

    Full Text Available Sleep impairment significantly alters human brain structure and cognitive function, but available evidence suggests that adults in developed nations are sleeping less. A growing body of research has sought to use sleep to forecast cognitive performance by modeling the relationship between the two, but has generally focused on vigilance rather than other cognitive constructs affected by sleep, such as reaction time, executive function, and working memory. Previous modeling efforts have also utilized subjective, self-reported sleep durations and were restricted to laboratory environments. In the current effort, we addressed these limitations by employing wearable systems and mobile applications to gather objective sleep information, assess multi-construct cognitive performance, and model/predict changes to mental acuity. Thirty participants were recruited for participation in the study, which lasted 1 week. Using the Fitbit Charge HR and a mobile version of the automated neuropsychological assessment metric called CogGauge, we gathered a series of features and utilized the unified model of performance to predict mental acuity based on sleep records. Our results suggest that individuals poorly rate their sleep duration, supporting the need for objective sleep metrics to model circadian changes to mental acuity. Participant compliance in using the wearable throughout the week and responding to the CogGauge assessments was 80%. Specific biases were identified in temporal metrics across mobile devices and operating systems and were excluded from the mental acuity metric development. Individualized prediction of mental acuity consistently outperformed group modeling. This effort indicates the feasibility of creating an individualized, mobile assessment and prediction of mental acuity, compatible with the majority of current mobile devices.
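    The unified model of performance used in the study is a published biomathematical fatigue model whose parameterisation the abstract does not reproduce. The sketch below therefore shows only the generic two-process structure (rising homeostatic sleep pressure plus a circadian oscillation) that this family of models builds on; all constants are illustrative assumptions, not fitted values from the paper.

        import math

        def acuity(hours_awake, clock_hour, s0=0.3, tau_w=18.2):
            """Toy alertness score: homeostatic pressure S grows during wake
            (time constant tau_w, hours), while circadian process C
            oscillates with time of day (peak assumed near 16:00)."""
            s = 1 - (1 - s0) * math.exp(-hours_awake / tau_w)
            c = 0.5 * math.cos(2 * math.pi * (clock_hour - 16) / 24)
            return max(0.0, 1.0 - s + c)  # higher = more alert

        print(acuity(hours_awake=2, clock_hour=9))   # rested morning: ~0.5
        print(acuity(hours_awake=20, clock_hour=3))  # 3 a.m., long wake bout: 0.0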

  2. First principles pharmacokinetic modeling: A quantitative study on Cyclosporin

    DEFF Research Database (Denmark)

    Mošat', Andrej; Lueshen, Eric; Heitzig, Martina

    2013-01-01

    renal and hepatic clearances, elimination half-life, and mass transfer coefficients, to establish drug biodistribution dynamics in all organs and tissues. This multi-scale model satisfies first principles and conservation of mass, species and momentum. Prediction of organ drug bioaccumulation...... as a function of cardiac output, physiology, pathology or administration route may be possible with the proposed PBPK framework. Successful application of our model-based drug development method may lead to more efficient preclinical trials, accelerated knowledge gain from animal experiments, and shortened time-to-market...

  3. On the usability of quantitative modelling in operations strategy decision making

    NARCIS (Netherlands)

    Akkermans, H.A.; Bertrand, J.W.M.

    1997-01-01

    Quantitative modelling seems admirably suited to help managers in their strategic decision making on operations management issues, but in practice models are rarely used for this purpose. Investigates the reasons why, based on a detailed cross-case analysis of six cases of modelling-supported

  4. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data, however, much of the data available or even acquirable are not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
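    One of the techniques the review covers, optimal scaling, has a closed-form least-squares solution: the scale factor mapping model units onto arbitrary measurement units (e.g. fluorescence intensity) that minimises the squared residual. A minimal sketch, not the authors' code:

        import numpy as np

        def optimal_scale(model, data):
            """Scale factor s minimising ||s * model - data||^2, for data
            reported in arbitrary units."""
            return float(model @ data) / float(model @ model)

        model = np.array([1.0, 2.0, 4.0, 8.0])   # model prediction
        data = np.array([2.1, 3.9, 8.2, 15.8])   # measurement, arbitrary units
        s = optimal_scale(model, data)
        print(s, np.linalg.norm(s * model - data))  # ~1.98, small residual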

  5. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  6. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  7. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  8. Quantitative experimental modelling of fragmentation during explosive volcanism

    Science.gov (United States)

    Thordén Haug, Ø.; Galland, O.; Gisler, G.

    2012-04-01

    Phreatomagmatic eruptions result from the violent interaction between magma and an external source of water, such as ground water or a lake. This interaction causes fragmentation of the magma and/or the host rock, resulting in coarse-grained (lapilli) to very fine-grained (ash) material. The products of phreatomagmatic explosions are classically described by their fragment size distribution, which commonly follows power laws of exponent D. Such a descriptive approach, however, considers the final products only and does not provide information on the dynamics of fragmentation. The aim of this contribution is thus to address the following fundamental questions. What are the physics that govern fragmentation processes? How does fragmentation occur through time? What are the mechanisms that produce power law fragment size distributions? And what are the scaling laws that control the exponent D? To address these questions, we performed a quantitative experimental study. The setup consists of a Hele-Shaw cell filled with a layer of cohesive silica flour, at the base of which a pulse of pressurized air is injected, leading to fragmentation of the layer of flour. The fragmentation process is monitored through time using a high-speed camera. By systematically varying the air pressure (P) and the thickness of the flour layer (h) we observed two morphologies of fragmentation: "lift off", where the silica flour above the injection inlet is ejected upwards, and "channeling", where the air pierces through the layer along a sub-vertical conduit. By building a phase diagram, we show that the morphology is controlled by P/dgh, where d is the density of the flour and g is the gravitational acceleration. To quantify the fragmentation process, we developed a Matlab image analysis program, which calculates the number and sizes of the fragments, and so the fragment size distribution, during the experiments. The fragment size distributions are in general described by power law distributions of
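    The abstract does not state which estimator the image-analysis program uses for the exponent D; a common choice for such size data is the continuous maximum-likelihood estimator of Clauset et al. (2009), sketched here in Python and checked on synthetic Pareto-distributed fragment sizes:

        import numpy as np

        def power_law_exponent(sizes, x_min):
            """Maximum-likelihood exponent of a continuous power law
            p(x) ~ x**(-D), fitted to the fragment sizes >= x_min."""
            x = sizes[sizes >= x_min]
            return 1.0 + len(x) / np.sum(np.log(x / x_min))

        rng = np.random.default_rng(0)
        u = rng.random(10_000)
        samples = (1 - u) ** (-1 / (2.5 - 1))  # inverse-transform Pareto, D = 2.5
        print(power_law_exponent(samples, x_min=1.0))  # ~2.5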

  9. Quantitative Comparison Between Crowd Models for Evacuation Planning and Evaluation

    NARCIS (Netherlands)

    Viswanathan, V.; Lee, C.E.; Lees, M.H.; Cheong, S.A.; Sloot, P.M.A.

    2014-01-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we

  10. Quantitative Research: A Dispute Resolution Model for FTC Advertising Regulation.

    Science.gov (United States)

    Richards, Jef I.; Preston, Ivan L.

    Noting the lack of a dispute mechanism for determining whether an advertising practice is truly deceptive without generating the costs and negative publicity produced by traditional Federal Trade Commission (FTC) procedures, this paper proposes a model based upon early termination of the issues through jointly commissioned behavioral research. The…

  11. A universal calculation model for the controlled electric transmission line

    International Nuclear Information System (INIS)

    Zivzivadze, O.; Zivzivadze, L.

    2009-01-01

    Difficulties associated with the development of calculation models are analyzed, and ways of resolving these problems are given. A version of the equivalent circuit as a six-pole network, whose parameters do not depend on the angle of shift Θ between the voltage vectors of the circuits, is offered. The interrelation between the parameters of the equivalent circuit and the transmission constants of the line was determined. A universal calculation model for the controlled electric transmission line was elaborated. The model allows calculating the stationary modes of lines of this class at any angle of shift Θ between the circuits. (author)

  12. Universal amplitude ratios in the 3D Ising model

    International Nuclear Information System (INIS)

    Caselle, M.; Hasenbusch, M.

    1998-01-01

    We present a high-precision Monte Carlo study of various universal amplitude ratios of the three-dimensional Ising spin model. Using state-of-the-art simulation techniques we studied the model close to criticality in both phases. Great care was taken to control systematic errors due to finite-size effects and correction-to-scaling terms. We obtain C+/C- = 4.75(3), f+,2nd/f-,2nd = 1.95(2) and u* = 14.3(1). Our results are compatible with those obtained by field-theoretic methods applied to the φ^4 theory and high- and low-temperature series expansions of the Ising model. (orig.)

  13. Fuzzy Universal Model Approximator for Distributed Solar Collector Field Control

    KAUST Repository

    Elmetennani, Shahrazed

    2014-07-01

    This paper deals with the control of concentrating parabolic solar collectors by forcing the outlet oil temperature to track a set reference. A fuzzy universal approximate model is introduced in order to accurately reproduce the behavior of the system dynamics. The proposed model is a low order state space representation derived from the partial differential equation describing the oil temperature evolution using fuzzy transform theory. The resulting set of ordinary differential equations simplifies the system analysis and the control law design and is suitable for real time control implementation. Simulation results show good performance of the proposed model.

  14. Quantitative properties of clustering within modern microscopic nuclear models

    International Nuclear Information System (INIS)

    Volya, A.; Tchuvil’sky, Yu. M.

    2016-01-01

    A method for studying cluster spectroscopic properties of nuclear fragmentation, such as spectroscopic amplitudes, cluster form factors, and spectroscopic factors, is developed on the basis of modern precision nuclear models that take into account the mixing of large-scale shell-model configurations. Alpha-cluster channels are considered as an example. A mathematical proof of the need for taking into account the channel-wave-function renormalization generated by exchange terms of the antisymmetrization operator (Fliessbach effect) is given. Examples where this effect is confirmed by a high quality of the description of experimental data are presented. By and large, the method in question extends substantially the possibilities for studying clustering phenomena in nuclei and for improving the quality of their description.

  15. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  16. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that the proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in a wet laboratory. In this way, natural biochemical systems can be better understood.

  17. Quantitative modeling of selective lysosomal targeting for drug design

    DEFF Research Database (Denmark)

    Trapp, Stefan; Rosania, G.; Horobin, R.W.

    2008-01-01

    …log Kow. These findings were validated with experimental results and by a comparison to the properties of antimalarial drugs in clinical use. For ten active compounds, nine were predicted to accumulate to a greater extent in lysosomes than in other organelles, six of these were in the optimum range...... predicted by the model and three were close. Five of the antimalarial drugs were lipophilic weak dibasic compounds. The predicted optimum properties for a selective accumulation of weak bivalent bases in lysosomes are consistent with experimental values and are more accurate than any prior calculation...
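    The full model couples diffusive and electrochemical fluxes across several membranes; as a far simpler stand-in, classical pH-partition (ion-trapping) theory already shows why weak bases concentrate in acidic lysosomes. A sketch assuming typical compartment pH values, not the paper's full calculation:

        def ion_trapping_ratio(pka, ph_lysosome=5.0, ph_cytosol=7.2):
            """pH-partition estimate of lysosome/cytosol accumulation for a
            monobasic weak base: only the neutral form permeates, so total
            drug accumulates where more of it is protonated (lower pH)."""
            total_over_neutral = lambda ph: 1 + 10 ** (pka - ph)
            return total_over_neutral(ph_lysosome) / total_over_neutral(ph_cytosol)

        for pka in (6.0, 8.0, 10.0):
            print(f"pKa {pka}: ~{ion_trapping_ratio(pka):.0f}x accumulation")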

  18. Cosmological models - in which universe do we live

    International Nuclear Information System (INIS)

    Hartvigsen, Y.

    1976-01-01

    A general discussion of the present state of cosmological models is introduced with a brief presentation of the expanding universe theory, the redshift and Hubble's Law. Hubble's Constant lies between 30 and 105 km/sec/Mpc, and a value of 55 km/sec/Mpc is assumed in this article. The arguments for the big bang and steady state theories are presented, and the reasons for the present acceptance of the former are given. Friedmann models are briefly discussed, and the 'universe density' ρ, the 'space curvature' k and the 'cosmological constant' Λ are presented. These are shown on the Stabell-Refsdal diagram, and the density parameter σ_0 and the retardation parameter q_0 are related to Hubble's Constant. These parameters are then discussed and their values restricted such that the part of the Stabell-Refsdal diagram which is of interest may be defined. (JIW)

  19. Establishing a Business Process Reference Model for Universities

    KAUST Repository

    Svensson, Carsten

    2012-09-01

    Modern universities are by any standard complex organizations that, from an IT perspective, present a number of unique challenges. This paper will propose establishing a business process reference framework. The benefit to the users would be a better understanding of the system landscape, business process enablement, collection of performance data and systematic reuse of existing community experience and knowledge. For these reasons reference models such as the SCOR (Supply Chain Operations Reference), DCOR (Design Chain Operations Reference) and ITIL (Information Technology Infrastructure Library) have gained popularity among organizations in both the private and public sectors. We speculate that this success can be replicated in a university setting. Furthermore the paper will outline how the research group suggests moving ahead with the research which will lead to a reference model.

  20. The Effect of Entrepreneurship Education on Entrepreneurial Intention of University Students By Adopting Linan Model

    Directory of Open Access Journals (Sweden)

    Yud Buana

    2017-05-01

    Full Text Available Whether entrepreneurship education programs succeed remains an open question when success is measured by the number of students who decide to launch and pursue a business venture. If experts and policy makers are to arouse interest in starting a business, it is important to know the intentions of nascent entrepreneurs to persistently start up business ventures. A quantitative approach was used in this research to examine the influence of entrepreneurship education, social norms and self-efficacy on intentions to pursue business ventures, by adopting the Linan model of intention-behavior. The model was applied to students who participated in the entrepreneurship education program during the middle of their studies at Bina Nusantara University. Finally, the results are in line with the Linan model.

  1. Explanation of model design and talent management system in universities

    OpenAIRE

    AH Nazaripour; SNJ Mosavi; M Hakak; A Pirzad

    2017-01-01

    Abstract Background and aim: Nowadays, talented human resources are considered the most important and valuable organizational asset. Proper management of this major asset is the most essential task of managers, and an organization's progress in this field determines its position in fierce competition with competitors. The aim of this study was to develop a model of a talent management system for universities in the country. Methods: In this study the population was composed of 10 Azad Univers...

  2. Uniform relativistic universe models with pressure. Part 2. Observational tests

    International Nuclear Information System (INIS)

    Krempec, J.; Krygier, B.

    1977-01-01

    The magnitude-redshift and angular diameter-redshift relations are discussed for uniform (homogeneous and isotropic) relativistic Universe models with pressure. The inclusion of pressure in the energy-momentum tensor has given larger values of the deceleration parameter q. An increase in the deceleration parameter has led to the brightening of objects as well as slightly larger angular diameters. (author)

  3. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative-quantitative modeling.

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-05-01

    Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/

  4. ADMIT: a toolbox for guaranteed model invalidation, estimation and qualitative–quantitative modeling

    Science.gov (United States)

    Streif, Stefan; Savchenko, Anton; Rumschinski, Philipp; Borchers, Steffen; Findeisen, Rolf

    2012-01-01

    Summary: Often competing hypotheses for biochemical networks exist in the form of different mathematical models with unknown parameters. Considering available experimental data, it is then desired to reject model hypotheses that are inconsistent with the data, or to estimate the unknown parameters. However, these tasks are complicated because experimental data are typically sparse, uncertain, and are frequently only available in the form of qualitative if-then observations. ADMIT (Analysis, Design and Model Invalidation Toolbox) is a MatLab(TM)-based tool for guaranteed model invalidation, state and parameter estimation. The toolbox allows the integration of quantitative measurement data, a priori knowledge of parameters and states, and qualitative information on the dynamic or steady-state behavior. A constraint satisfaction problem is automatically generated and algorithms are implemented for solving the desired estimation, invalidation or analysis tasks. The implemented methods build on convex relaxation and optimization and therefore provide guaranteed estimation results and certificates for invalidity. Availability: ADMIT, tutorials and illustrative examples are available free of charge for non-commercial use at http://ifatwww.et.uni-magdeburg.de/syst/ADMIT/ Contact: stefan.streif@ovgu.de PMID:22451270

  5. A quantitative and dynamic model for plant stem cell regulation.

    Directory of Open Access Journals (Sweden)

    Florian Geier

    Full Text Available Plants maintain pools of totipotent stem cells throughout their entire life. These stem cells are embedded within specialized tissues called meristems, which form the growing points of the organism. The shoot apical meristem of the reference plant Arabidopsis thaliana is subdivided into several distinct domains, which execute diverse biological functions, such as tissue organization, cell proliferation and differentiation. The number of cells required for growth and organ formation changes over the course of a plant's life, while the structure of the meristem remains remarkably constant. Thus, regulatory systems must be in place which allow for an adaptation of cell proliferation within the shoot apical meristem, while maintaining the organization at the tissue level. To advance our understanding of this dynamic tissue behavior, we measured domain sizes as well as cell division rates of the shoot apical meristem under various environmental conditions, which cause adaptations in meristem size. Based on our results we developed a mathematical model to explain the observed changes by a cell-pool-size-dependent regulation of cell proliferation and differentiation, which is able to correctly predict CLV3 and WUS over-expression phenotypes. While the model shows stem cell homeostasis under constant growth conditions, it predicts a variation in stem cell number under changing conditions. Consistent with our experimental data, this behavior is correlated with variations in cell proliferation. Therefore, we investigate different signaling mechanisms which could stabilize stem cell number despite variations in cell proliferation. Our results shed light on the dynamic constraints of stem cell pool maintenance in the shoot apical meristem of Arabidopsis in different environmental conditions and developmental states.

  6. Universal Regularizers For Robust Sparse Coding and Modeling

    OpenAIRE

    Ramirez, Ignacio; Sapiro, Guillermo

    2010-01-01

    Sparse data models, where data is assumed to be well represented as a linear combination of a few elements from a dictionary, have gained considerable attention in recent years, and their use has led to state-of-the-art results in many signal and image processing tasks. It is now well understood that the choice of the sparsity regularization term is critical in the success of such models. Based on a codelength minimization interpretation of sparse coding, and using tools from universal coding...

  7. Geometrothermodynamic model for the evolution of the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Gruber, Christine; Quevedo, Hernando, E-mail: christine.gruber@correo.nucleares.unam.mx, E-mail: quevedo@nucleares.unam.mx [Instituto de Ciencias Nucleares, Universidad Nacional Autónoma de México, AP 70543, México, DF 04510 (Mexico)

    2017-07-01

    Using the formalism of geometrothermodynamics to derive a fundamental thermodynamic equation, we construct a cosmological model in the framework of relativistic cosmology. In a first step, we describe a system without thermodynamic interaction, and show it to be equivalent to the standard ΛCDM paradigm. The second step includes thermodynamic interaction and produces a model consistent with the main features of inflation. With the proposed fundamental equation we are thus able to describe all the known epochs in the evolution of our Universe, starting from the inflationary phase.

  8. Roles of University Support for International Students in the United States: Analysis of a Systematic Model of University Identification, University Support, and Psychological Well-Being

    Science.gov (United States)

    Cho, Jaehee; Yu, Hongsik

    2015-01-01

    Unlike previous research on international students' social support, this current study applied the concept of organizational support to university contexts, examining the effects of university support. Mainly based on the social identity/self-categorization stress model, this study developed and tested a path model composed of four key…

  9. A Model for the Expansion of the Universe

    Directory of Open Access Journals (Sweden)

    Silva N. P.

    2014-04-01

    Full Text Available One introduces an ansatz for the expansion factor a(t) = e^(βH(t)t) of our Universe in the spirit of the FLRW model; β is a constant to be determined. Considering that the ingredients acting on the Universe expansion (t > 4×10^12 s ≈ 1.3×10^5 yr) are mainly matter (baryons plus dark matter) and dark energy, one uses the currently measured values of the Hubble constant H_0, the Universe's current age T_0, the matter density parameter Ω_m(T_0) and the dark energy parameter Ω_Λ(T_0), together with the Friedmann equations, to find β = 0.5804 and that our Universe may have had a negative expansion acceleration up to the age T⋆ = 3.214 Gyr (matter era) and a positive one after that (dark energy era), leading to an eternal expansion. An interaction between matter and dark energy is found to exist. The deceleration q(t) has been found to satisfy q(T⋆) = 0 and q(T_0) = -0.570.
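    As a consistency check on the quoted deceleration value, one can use the standard flat-FLRW relation for pressureless matter plus a cosmological constant (a textbook formula, not one reproduced from the paper), with the rounded values Ω_m(T_0) ≈ 0.29 and Ω_Λ(T_0) ≈ 0.71 assumed here:

        q(T_0) = Ω_m(T_0)/2 - Ω_Λ(T_0) ≈ 0.145 - 0.71 ≈ -0.57,

    in agreement with the abstract's q(T_0) = -0.570.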

  10. Mapping quantitative trait loci in a selectively genotyped outbred population using a mixture model approach

    NARCIS (Netherlands)

    Johnson, David L.; Jansen, Ritsert C.; Arendonk, Johan A.M. van

    1999-01-01

    A mixture model approach is employed for the mapping of quantitative trait loci (QTL) for the situation where individuals, in an outbred population, are selectively genotyped. Maximum likelihood estimation of model parameters is obtained from an Expectation-Maximization (EM) algorithm facilitated by
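    The abstract is truncated, but its core machinery, EM over unobserved mixture components, can be sketched generically. The two-component normal mixture below is illustrative only: in actual QTL mapping the mixing proportions come from marker-based transmission probabilities rather than a free weight.

        import numpy as np

        def em_two_normals(x, n_iter=200):
            """EM for a two-component normal mixture with common variance."""
            mu = np.array([x.min(), x.max()], dtype=float)
            sigma, w = x.std(), 0.5
            for _ in range(n_iter):
                # E-step: posterior probability of membership in component 1
                d0 = np.exp(-0.5 * ((x - mu[0]) / sigma) ** 2)
                d1 = np.exp(-0.5 * ((x - mu[1]) / sigma) ** 2)
                r = w * d1 / ((1 - w) * d0 + w * d1)
                # M-step: reweighted means, common variance, mixing weight
                mu = np.array([np.average(x, weights=1 - r),
                               np.average(x, weights=r)])
                sigma = np.sqrt(np.mean((1 - r) * (x - mu[0]) ** 2
                                        + r * (x - mu[1]) ** 2))
                w = r.mean()
            return mu, sigma, w

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(0, 1, 300), rng.normal(3, 1, 700)])
        print(em_two_normals(x))  # means near (0, 3), weight near 0.7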

  11. Quantitative modelling of HDPE spurt experiments using wall slip and generalised Newtonian flow

    NARCIS (Netherlands)

    Doelder, den C.F.J.; Koopmans, R.J.; Molenaar, J.

    1998-01-01

    A quantitative model to describe capillary rheometer experiments is presented. The model can generate ‘two-branched' discontinuous flow curves and the associated pressure oscillations. Polymer compressibility in the barrel, incompressible axisymmetric generalised Newtonian flow in the die, and a

  12. A new economic model for resource industries-implications for universities

    International Nuclear Information System (INIS)

    Romig, P.R.

    1993-01-01

    The upheaval in the US petroleum industry has had repercussions in the university community. Geoscience enrollments have plummeted, financial support has declined, and there are rumors that some programs have reduced mathematical rigor to maintain enrollment. While the adverse effects have been widespread, there is disagreement about implications and expectations for the future. Some argue that emphasis on short-term profitability produces ill-conceived, precipitous reactions which perpetuate the turmoil. Others respond that the resource and environmental needs of a burgeoning global population will ensure long-term growth. Both arguments miss the point. The fundamental economic structure of the industry is changing from revenue-driven to marginal-return. In marginal-return industries, investments depend on quantitative assessments of risk and return, and the use of interdisciplinary teams is the norm. University programs must educate students in engineering design and structured decision-making processes, develop integrated numeric models and create infrastructures that support multidisciplinary collaboration. Educational programs must begin teaching principles of engineering design and structured decision-making, with increased emphasis on outreach to the experienced employee. Meeting those needs will require closer collaboration between industry and the universities. Universities that are successful will reap a fringe benefit; their graduates will be better qualified to be leaders in the environmental geoscience field, which one day may be bigger than the oil industry.

  13. Impact of implementation choices on quantitative predictions of cell-based computational models

    Science.gov (United States)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
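    The time-step sensitivity reported here is easy to reproduce outside a full vertex model: vertex positions are typically advanced with an explicit (forward Euler) update of overdamped dynamics, which loses accuracy and eventually stability as the step grows. A one-variable sketch of that failure mode (illustrative, not the authors' code):

        def relax(dt, k=10.0, x0=1.0, t_end=2.0):
            """Explicit-Euler integration of the overdamped relaxation
            dx/dt = -k * x, the kind of scheme used to move vertices."""
            x = x0
            for _ in range(int(t_end / dt)):
                x += dt * (-k * x)
            return x

        for dt in (0.001, 0.05, 0.19, 0.21):
            print(f"dt={dt}: x(2) = {relax(dt):+.4f}")  # exact: e**-20 ~ 2e-9
        # For dt > 2/k the update factor |1 - k*dt| exceeds 1 and the scheme
        # diverges: one way implementation choices can change predictions.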

  14. A history of the universe in a superstring model

    International Nuclear Information System (INIS)

    Maeda, K.

    1986-07-01

    A superstring theory, which is the most promising candidate for a unified theory, predicts a higher-dimensional 'space-time'. Its application to cosmology, especially reconsideration of the early history of the universe, is definitely important and interesting. Here, we discuss a scenario of the universe in a superstring model. The main problems in higher-dimensional unified theories, from the cosmological point of view, are: (i) Can the 4-dim Einstein gravity be obtained, rather than the Jordan-Brans-Dicke theory? (ii) Can the 4-dim Friedmann universe (F4) be realized naturally in the higher-dimensional space-time? (iii) Does inflation really occur? The answers to (i) and (ii) are 'yes' in a superstring model, as we will see soon. Question (iii) is still open, although it seems to be difficult. Taking into account a quantum tunnelling effect of the antisymmetric tensor field H_μνρ, we also show that a hierarchical bubble structure might be formed due to a series of phase transitions

  15. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    Full Text Available The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
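    The economic metrics are not spelled out in the abstract; the standard ones in this literature are Annualised Loss Expectancy (ALE) and Return on Security Investment (ROSI), sketched below with hypothetical figures (assumed here, not taken from the paper):

        def ale(single_loss, occurrences_per_year):
            """Annualised Loss Expectancy = SLE x ARO."""
            return single_loss * occurrences_per_year

        def rosi(ale_before, mitigation, annual_cost):
            """Avoided loss net of the measure's cost, relative to the cost."""
            return (ale_before * mitigation - annual_cost) / annual_cost

        exposure = ale(single_loss=40_000, occurrences_per_year=0.5)  # 20 000/yr
        print(rosi(exposure, mitigation=0.8, annual_cost=6_000))      # ~1.67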

  16. Universality of correlation functions in random matrix models of QCD

    International Nuclear Information System (INIS)

    Jackson, A.D.; Sener, M.K.; Verbaarschot, J.J.M.

    1997-01-01

    We demonstrate the universality of the spectral correlation functions of a QCD inspired random matrix model that consists of a random part having the chiral structure of the QCD Dirac operator and a deterministic part which describes a schematic temperature dependence. We calculate the correlation functions analytically using the technique of Itzykson-Zuber integrals for arbitrary complex supermatrices. An alternative exact calculation for arbitrary matrix size is given for the special case of zero temperature, and we reproduce the well-known Laguerre kernel. At finite temperature, the microscopic limit of the correlation functions are calculated in the saddle-point approximation. The main result of this paper is that the microscopic universality of correlation functions is maintained even though unitary invariance is broken by the addition of a deterministic matrix to the ensemble. (orig.)

  17. [Quantitative estimation of vegetation cover and management factor in USLE and RUSLE models by using remote sensing data: a review].

    Science.gov (United States)

    Wu, Chang-Guang; Li, Sheng; Ren, Hua-Dong; Yao, Xiao-Hua; Huang, Zi-Jie

    2012-06-01

    Soil loss prediction models such as the universal soil loss equation (USLE) and the revised universal soil loss equation (RUSLE) are useful tools for risk assessment of soil erosion and planning of soil conservation at the regional scale. Making a rational estimation of the vegetation cover and management factor, one of the most important parameters in USLE or RUSLE, is particularly important for the accurate prediction of soil erosion. The traditional estimation based on field survey and measurement is time-consuming, laborious and costly, and cannot rapidly extract the vegetation cover and management factor at the macro-scale. In recent years, the development of remote sensing technology has provided both data and methods for the estimation of the vegetation cover and management factor over broad geographic areas. This paper summarizes the research findings on the quantitative estimation of the vegetation cover and management factor using remote sensing data, and analyzes the advantages and disadvantages of the various methods, with the aim of providing a reference for further research on, and quantitative estimation of, the vegetation cover and management factor at large scales.
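    One widely cited relation of the kind this review surveys maps NDVI to the C factor, following the exponential form of van der Knijff et al. (1999); the parameter values below are the commonly quoted defaults, and other studies use linear regressions on fractional vegetation cover instead:

        import numpy as np

        def c_factor(ndvi, alpha=2.0, beta=1.0):
            """Cover-management factor C = exp(-alpha * NDVI / (beta - NDVI)),
            valid for NDVI < beta; denser vegetation gives smaller C."""
            return np.exp(-alpha * ndvi / (beta - ndvi))

        print(c_factor(np.array([0.1, 0.3, 0.5, 0.7])))
        # ~[0.80, 0.42, 0.14, 0.009]: erosion protection grows with cover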

  18. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Full Text Available Quantitative modeling of biological systems has become an essential computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data that describe the system's dynamics must be known in order to obtain relevant results with conventional modeling techniques, and such data are frequently hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modeling approach that can cope with unknown kinetic data and thus produce relevant results even though the dynamic data are incomplete or only vaguely defined. Moreover, the approach can be combined with existing state-of-the-art quantitative modeling techniques in only certain parts of the system, i.e., where the data are missing. The case study of the proposed method is performed on a model of a nine-gene network. We propose a type of fuzzy Petri net (FPN) model based on fuzzy sets to handle the quantitative modeling of biological systems. Tests of our model show that it is feasible and quite effective for data simulation and reasoning in fuzzy expert systems.

  19. Universal autoignition models for designer fuels in HCCI combustion

    Energy Technology Data Exchange (ETDEWEB)

    Vandersickel, A.; Boulouchos, K.; Wright, Y.M. [LAV - Aerothermochemistry and Combustion Systems Laboratory - Institute of Energy Technology, ETH Zurich (Switzerland)], email: vandersickel@lav.mavt.ethz.ch

    2010-07-01

    In the energy sector, stringent regulations have been implemented on combustion emissions in order to address health and environmental concerns and help improve air quality. A novel combustion mode, homogeneous charge compression ignition (HCCI), can improve the emissions performance of an engine in terms of NOx and soot release over that of diesel while maintaining the same efficiencies. However, problems of ignition timing control arise with HCCI. The aim of this paper is to determine how fuel properties impact the HCCI ignition process and operating range. This study was carried out as part of a collaboration among several universities and automotive companies and 10 fuels were investigated experimentally and numerically using Arrhenius' model and a lumped reaction model. The two ignition models were successfully adapted to describe the behavior of the studied fuels; atomizer engine experiments validated their results. Further work will be conducted to optimize the reaction mechanism for the remaining process fuels.

  20. 3D vs 2D laparoscopic systems: Development of a performance quantitative validation model.

    Science.gov (United States)

    Ghedi, Andrea; Donarini, Erica; Lamera, Roberta; Sgroi, Giovanni; Turati, Luca; Ercole, Cesare

    2015-01-01

    The new technology ensures 3D laparoscopic vision by adding depth to the traditional two dimensions. This realistic vision gives the surgeon the feeling of operating in real space. The Hospital of Treviglio-Caravaggio is not a university or scientific institution; when a new 3D laparoscopic technology was acquired in 2014, this led to an evaluation of its appropriateness in terms of patient outcome and safety. The project aims at developing a quantitative validation model that ensures low cost and a reliable measure of the performance of 3D technology versus the 2D mode. In addition, it aims at demonstrating how new technologies, such as open-source hardware and software and 3D printing, can help research with no significant cost increase. For these reasons, in order to define criteria of appropriateness in the use of 3D technologies, a study was performed to technically validate, in terms of effectiveness, efficiency and safety, the use of 3D laparoscopic vision versus the traditional 2D. Thirty surgeons were enrolled to perform an exercise with laparoscopic forceps inside a trainer. The exercise consisted of having surgeons of different seniority, grouped by type of specialization (e.g. surgery, urology, gynecology), perform videolaparoscopy with the two technologies (2D and 3D) on an anthropometric phantom. The target assigned to each surgeon was to pass "needle and thread" through a set of rings without touching the metal part in the shortest time possible. Each ring had a difficulty coefficient determined by its depth, diameter, and angle from the positioning and viewing points. The analysis of the data collected from this exercise mathematically confirmed that the 3D technique ensures a shorter learning curve for novices and greater accuracy in task performance with respect to 2D.

  1. Universality in random-walk models with birth and death

    International Nuclear Information System (INIS)

    Bender, C.M.; Boettcher, S.; Meisinger, P.N.

    1995-01-01

    Models of random walks are considered in which walkers are born at one site and die at all other sites. Steady-state distributions of walkers exhibit dimensionally dependent critical behavior as a function of the birth rate. Exact analytical results for a hyperspherical lattice yield a second-order phase transition with a nontrivial critical exponent for all positive dimensions D≠2, 4. Numerical studies of hypercubic and fractal lattices indicate that these exact results are universal. This work elucidates the adsorption transition of polymers at curved interfaces. copyright 1995 The American Physical Society
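
    The setup is easy to reproduce in miniature. The mean-field sketch below iterates expected walker counts on a one-dimensional ring, with birth at a single site and death everywhere else; the lattice size, rates, and step count are invented stand-ins for the hyperspherical and fractal lattices studied in the paper.

        import numpy as np

        # Toy 1D birth-death random walk: walkers are born at the origin and
        # die at all other sites; occupation numbers are expected counts.
        L, birth, death, steps = 101, 5.0, 0.05, 5000
        origin = L // 2
        not_origin = np.arange(L) != origin
        walkers = np.zeros(L)

        for _ in range(steps):
            walkers = 0.5 * (np.roll(walkers, 1) + np.roll(walkers, -1))  # hop
            walkers[not_origin] *= 1.0 - death    # death away from the origin
            walkers[origin] += birth              # birth at the origin
        print(f"steady-state population at the origin: {walkers[origin]:.1f}")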

  2. Organizational Models and Mythologies of the American Research University. ASHE 1986 Annual Meeting Paper.

    Science.gov (United States)

    Alpert, Daniel

    Features of the matrix model of the research university and myths about the academic enterprise are described, along with serious dissonances in the U.S. university system. The linear model, from which the matrix model evolved, describes the university's structure, perceived mission, and organizational behavior. A matrix model portrays in concise,…

  3. Anisotropic, nonsingular early universe model leading to a realistic cosmology

    International Nuclear Information System (INIS)

    Dechant, Pierre-Philippe; Lasenby, Anthony N.; Hobson, Michael P.

    2009-01-01

    We present a novel cosmological model in which scalar field matter in a biaxial Bianchi IX geometry leads to a nonsingular 'pancaking' solution: the hypersurface volume goes to zero instantaneously at the 'big bang', but all physical quantities, such as curvature invariants and the matter energy density remain finite, and continue smoothly through the big bang. We demonstrate that there exist geodesics extending through the big bang, but that there are also incomplete geodesics that spiral infinitely around a topologically closed spatial dimension at the big bang, rendering it, at worst, a quasiregular singularity. The model is thus reminiscent of the Taub-NUT vacuum solution in that it has biaxial Bianchi IX geometry and its evolution exhibits a dimensionality reduction at a quasiregular singularity; the two models are, however, rather different, as we will show in a future work. Here we concentrate on the cosmological implications of our model and show how the scalar field drives both isotropization and inflation, thus raising the question of whether structure on the largest scales was laid down at a time when the universe was still oblate (as also suggested by [T. S. Pereira, C. Pitrou, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 9 (2007) 6.][C. Pitrou, T. S. Pereira, and J.-P. Uzan, J. Cosmol. Astropart. Phys. 4 (2008) 4.][A. Guemruekcueoglu, C. Contaldi, and M. Peloso, J. Cosmol. Astropart. Phys. 11 (2007) 005.]). We also discuss the stability of our model to small perturbations around biaxiality and draw an analogy with cosmological perturbations. We conclude by presenting a separate, bouncing solution, which generalizes the known bouncing solution in closed FRW universes.

  4. Completeness of classical spin models and universal quantum computation

    International Nuclear Information System (INIS)

    De las Cuevas, Gemma; Dür, Wolfgang; Briegel, Hans J; Van den Nest, Maarten

    2009-01-01

    We study mappings between different classical spin systems that leave the partition function invariant. As recently shown in Van den Nest et al (2008 Phys. Rev. Lett. 100 110501), the partition function of the 2D square lattice Ising model in the presence of an inhomogeneous magnetic field can specialize to the partition function of any Ising system on an arbitrary graph. In this sense the 2D Ising model is said to be 'complete'. However, in order to obtain the above result, the coupling strengths on the 2D lattice must assume complex values, and thus do not allow for a physical interpretation. Here we show how a complete model with real—and, hence, 'physical'—couplings can be obtained if the 3D Ising model is considered. We furthermore show how to map general q-state systems with possibly many-body interactions to the 2D Ising model with complex parameters, and give completeness results for these models with real parameters. We also demonstrate that the computational overhead in these constructions is in all relevant cases polynomial. These results are proved by invoking a recently found cross-connection between statistical mechanics and quantum information theory, where partition functions are expressed as quantum mechanical amplitudes. Within this framework, there exists a natural correspondence between many-body quantum states that allow for universal quantum computation via local measurements only, and complete classical spin systems

  5. A quantitative model of the cardiac ventricular cell incorporating the transverse-axial tubular system

    Czech Academy of Sciences Publication Activity Database

    Pásek, Michal; Christé, G.; Šimurda, J.

    2003-01-01

    Roč. 22, č. 3 (2003), s. 355-368 ISSN 0231-5882 R&D Projects: GA ČR GP204/02/D129 Institutional research plan: CEZ:AV0Z2076919 Keywords : cardiac cell * tubular system * quantitative modelling Subject RIV: BO - Biophysics Impact factor: 0.794, year: 2003

  6. Systematic Analysis of Quantitative Logic Model Ensembles Predicts Drug Combination Effects on Cell Signaling Networks

    Science.gov (United States)

    2016-08-27

    [Fragmentary record: only disconnected abstract fragments were extracted. Supplementary information accompanies this paper on the CPT: Pharmacometrics & Systems Pharmacology website (http://www.wileyonlinelibrary.com/psp4).]

  7. Preference Mining Using Neighborhood Rough Set Model on Two Universes.

    Science.gov (United States)

    Zeng, Kai

    2016-01-01

    Preference mining plays an important role in e-commerce and video websites for enhancing user satisfaction and loyalty. Some classical methods are not applicable to the cold-start problem, when the user or the item is new. In this paper, we propose a new model, called the parametric neighborhood rough set on two universes (NRSTU), to describe the user and item data structures. Furthermore, the neighborhood lower approximation operator is used for defining the preference rules. Then, we provide the means for recommending items to users by using these rules. Finally, we give an experimental example to show the details of NRSTU-based preference mining for the cold-start problem. The parameters of the model are also discussed. The experimental results show that the proposed method presents an effective solution for preference mining. In particular, NRSTU improves the recommendation accuracy by about 19% compared to the traditional method.

  8. Comparison of blood flow models and acquisitions for quantitative myocardial perfusion estimation from dynamic CT

    International Nuclear Information System (INIS)

    Bindschadler, Michael; Alessio, Adam M; Modgil, Dimple; La Riviere, Patrick J; Branch, Kelley R

    2014-01-01

    Myocardial blood flow (MBF) can be estimated from dynamic contrast enhanced (DCE) cardiac CT acquisitions, leading to quantitative assessment of regional perfusion. The need for low radiation dose and the lack of consensus on MBF estimation methods motivate this study to refine the selection of acquisition protocols and models for CT-derived MBF. DCE cardiac CT acquisitions were simulated for a range of flow states (MBF = 0.5, 1, 2, 3 ml (min g)⁻¹; cardiac output = 3, 5, 8 L min⁻¹). Patient kinetics were generated by a mathematical model of iodine exchange incorporating numerous physiological features including heterogeneous microvascular flow, permeability and capillary contrast gradients. CT acquisitions were simulated for multiple realizations of realistic x-ray flux levels. CT acquisitions that reduce radiation exposure were implemented by varying both temporal sampling (1, 2, and 3 s sampling intervals) and tube currents (140, 70, and 25 mAs). For all acquisitions, we compared three quantitative MBF estimation methods (a two-compartment model, an axially-distributed model, and the adiabatic approximation to the tissue homogeneous model) and a qualitative slope-based method. In total, over 11 000 time attenuation curves were used to evaluate MBF estimation in multiple patient and imaging scenarios. After iodine-based beam hardening correction, the slope method consistently underestimated flow by on average 47.5%, while the quantitative models provided estimates with less than 6.5% average bias and increasing variance with increasing dose reductions. The three quantitative models performed equally well, offering estimates with essentially identical root mean squared error (RMSE) for matched acquisitions. MBF estimates using the qualitative slope method were inferior in terms of bias and RMSE compared to the quantitative methods. MBF estimate error was equal at matched dose reductions for all quantitative methods and the range of techniques evaluated.
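
    As a hedged sketch of the quantitative estimation step, the code below fits a one-compartment kinetic model to a synthetic time-attenuation curve with scipy; the gamma-variate arterial input function, noise level, and rate constants are invented, and the study's actual models (two-compartment, axially distributed, adiabatic tissue-homogeneity) are more elaborate.

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.arange(0.0, 60.0, 1.0)                # 1 s sampling interval
        aif = 300.0 * (t / 8.0) * np.exp(-t / 8.0)   # arterial input curve

        def tissue_curve(t, k1, k2):
            """C_t(t) = k1 * (AIF convolved with exp(-k2*t)), discretized."""
            dt = t[1] - t[0]
            return k1 * np.convolve(aif, np.exp(-k2 * t))[: t.size] * dt

        rng = np.random.default_rng(0)
        measured = tissue_curve(t, 0.02, 0.10) + rng.normal(0.0, 0.5, t.size)
        (k1, k2), _ = curve_fit(tissue_curve, t, measured, p0=(0.01, 0.05))
        print(f"estimated k1 = {k1:.3f}/s, k2 = {k2:.3f}/s")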

  9. Modeling universal dynamics of cell spreading on elastic substrates.

    Science.gov (United States)

    Fan, Houfu; Li, Shaofan

    2015-11-01

    A three-dimensional (3D) multiscale moving contact line model is combined with a soft matter cell model to study the universal dynamics of cell spreading over elastic substrates. We have studied both the early and the late stages of cell spreading by taking into account the actin tension effect. In this work, the cell is modeled as an active nematic droplet, and the substrate is modeled as a St. Venant-Kirchhoff elastic medium. A complete 3D simulation of cell spreading has been carried out. The simulation results show that the spreading area versus spreading time at different stages obeys specific power laws, in good agreement with experimental data and theoretical predictions reported in the literature. Moreover, the simulation results show that the substrate elasticity may affect the force dipole distribution inside the cell. The advantage of this approach is that it combines the hydrodynamics of actin retrograde flow with a moving contact line model, so that it naturally includes the actin tension effect resulting from actin polymerization and actomyosin contraction, and thus it might be capable of simulating complex cellular-scale phenomena, such as cell spreading or even crawling.

  10. A new approach to developing and optimizing organization strategy based on stochastic quantitative model of strategic performance

    Directory of Open Access Journals (Sweden)

    Marko Hell

    2014-03-01

    Full Text Available This paper presents a highly formalized approach to strategy formulation and to the optimization of strategic performance through proper resource allocation. A stochastic quantitative model of strategic performance (SQMSP) is used to evaluate the efficiency of the strategy developed. The SQMSP follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Parameters of the SQMSP are treated as random variables evaluated by experts, who give two-point (optimistic and pessimistic) and three-point (optimistic, most probable and pessimistic) estimates. The Monte-Carlo method is used to simulate strategic performance. Having been implemented within a computer application and applied to a real problem (planning of an IT strategy at the Faculty of Economics, University of Split), the proposed approach demonstrated its high potential as a basis for the development of decision support tools related to strategic planning.
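
    The Monte-Carlo step is straightforward to sketch. Assuming each expert estimate defines a triangular distribution and that strategic performance aggregates the indicators linearly (the weights and three-point values below are invented), the simulation reduces to a few lines of Python:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Three-point expert estimates: (pessimistic, most probable, optimistic)
        estimates = [(0.2, 0.5, 0.9), (0.1, 0.6, 0.8), (0.3, 0.4, 0.7)]
        weights = np.array([0.5, 0.3, 0.2])   # illustrative indicator weights

        samples = np.column_stack([rng.triangular(lo, mode, hi, n)
                                   for lo, mode, hi in estimates])
        performance = samples @ weights
        print(f"expected strategic performance: {performance.mean():.3f}")
        print(f"5th-95th percentile: [{np.percentile(performance, 5):.3f}, "
              f"{np.percentile(performance, 95):.3f}]")

    The resulting distribution, rather than a single point value, is what makes the approach useful for comparing alternative resource allocations.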

  11. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...... of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study....

  12. Quantitative analysis of CT brain images: a statistical model incorporating partial volume and beam hardening effects

    International Nuclear Information System (INIS)

    McLoughlin, R.F.; Ryan, M.V.; Heuston, P.M.; McCoy, C.T.; Masterson, J.B.

    1992-01-01

    The purpose of this study was to construct and evaluate a statistical model for the quantitative analysis of computed tomographic brain images. Data were derived from standard sections in 34 normal studies. A model representing the intracranial pure tissue and partial volume areas, with allowance for beam hardening, was developed. The average percentage error in estimation of areas, derived from phantom tests using the model, was 28.47%. We conclude that our model is not sufficiently accurate to be of clinical use, even though allowance was made for partial volume and beam hardening effects. (author)

  13. A Universal Model of Giftedness--An Adaptation of the Munich Model

    Science.gov (United States)

    Jessurun, J. H.; Shearer, C. B.; Weggeman, M. C. D. P.

    2016-01-01

    The Munich Model of Giftedness (MMG) by Heller and his colleagues, developed for the identification of gifted children, is adapted and expanded, with the aim of making it more universally usable as a model for the pathway from talents to performance. On the side of the talent-factors, the concept of multiple intelligences is introduced, and the…

  14. Walking the Walk: Modeling Social Model and Universal Design in the Disabilities Office

    Science.gov (United States)

    Thornton, Melanie; Downs, Sharon

    2010-01-01

    Making the shift from the medical model of disability to the social model requires postsecondary disabilities offices to carefully examine and revise policies and procedures to reflect this paradigm shift, which gives them the credibility to work toward such change on the campus level. The process followed by one university is covered in-depth, as…

  15. Universe

    CERN Document Server

    2009-01-01

    The Universe, is one book in the Britannica Illustrated Science Library Series that is correlated to the science curriculum in grades 5-8. The Britannica Illustrated Science Library is a visually compelling set that covers earth science, life science, and physical science in 16 volumes.  Created for ages 10 and up, each volume provides an overview on a subject and thoroughly explains it through detailed and powerful graphics-more than 1,000 per volume-that turn complex subjects into information that students can grasp.  Each volume contains a glossary with full definitions for vocabulary help and an index.

  16. Discussions on the non-equilibrium effects in the quantitative phase field model of binary alloys

    International Nuclear Information System (INIS)

    Zhi-Jun, Wang; Jin-Cheng, Wang; Gen-Cang, Yang

    2010-01-01

    All quantitative phase field models try to eliminate the artificial factors of solutal drag, interface diffusion and interface stretch in the diffuse interface. These artificial non-equilibrium effects, due to the introduction of the diffuse interface, are analysed based on the thermodynamic status across the diffuse interface in the quantitative phase field model of binary alloys. Results indicate that the non-equilibrium effects are related to a negative driving force in the local region on the solid side of the diffuse interface. The negative driving force results from the fact that the phase field model is derived from equilibrium conditions but used to simulate the non-equilibrium solidification process. The interface thickness dependence of the non-equilibrium effects and its restriction on large scale simulations are also discussed. (cross-disciplinary physics and related areas of science and technology)

  17. Development of quantitative atomic modeling for tungsten transport study Using LHD plasma with tungsten pellet injection

    International Nuclear Information System (INIS)

    Murakami, I.; Sakaue, H.A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2014-10-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding the tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from currentless plasmas of the Large Helical Device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) lines of W²⁴⁺ to W³³⁺ ions are very sensitive to electron temperature (Tₑ) and useful for examining the tungsten behavior in edge plasmas. Based on the first quantitative analysis of the measured spatial profile of the W⁴⁴⁺ ion, the tungsten concentration is determined to be n(W⁴⁴⁺)/nₑ = 1.4×10⁻⁴ and the total radiation loss is estimated as ∼4 MW, roughly half of the total NBI power. (author)

  18. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
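
    The following toy sketch shows the sub-model idea with scikit-learn's PLS regression: one model trained on the full composition range, one on a restricted low range, and a hand-off between them at prediction time. The synthetic spectra, the 15 wt.% cut-off, and the hard hand-off (instead of blending overlapping sub-models as described above) are simplifying assumptions.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 50))                  # mock LIBS spectra
        y = np.abs(20.0 * X[:, 0] + rng.normal(0.0, 1.0, 200))  # mock wt.%

        low = y < 15.0                                  # restricted sub-range
        full_model = PLSRegression(n_components=5).fit(X, y)
        low_model = PLSRegression(n_components=5).fit(X[low], y[low])

        def predict(spectrum):
            """Full-range estimate first; defer to the sub-model in its range."""
            s = spectrum.reshape(1, -1)
            est = float(np.ravel(full_model.predict(s))[0])
            if est < 15.0:
                return float(np.ravel(low_model.predict(s))[0])
            return est

        print(f"predicted composition: {predict(X[0]):.1f} wt.%")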

  19. Early universe cosmology. In supersymmetric extensions of the standard model

    Energy Technology Data Exchange (ETDEWEB)

    Baumann, Jochen Peter

    2012-03-19

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino super fields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kaehler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the right-handed (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM. We discuss

  20. Early universe cosmology. In supersymmetric extensions of the standard model

    International Nuclear Information System (INIS)

    Baumann, Jochen Peter

    2012-01-01

    In this thesis we investigate possible connections between cosmological inflation and leptogenesis on the one side and particle physics on the other side. We work in supersymmetric extensions of the Standard Model. A key role is played by the right-handed sneutrino, the superpartner of the right-handed neutrino involved in the type I seesaw mechanism. We study a combined model of inflation and non-thermal leptogenesis that is a simple extension of the Minimal Supersymmetric Standard Model (MSSM) with conserved R-parity, where we add three right-handed neutrino super fields. The inflaton direction is given by the imaginary components of the corresponding scalar component fields, which are protected from the supergravity (SUGRA) η-problem by a shift symmetry in the Kaehler potential. We discuss the model first in a globally supersymmetric (SUSY) and then in a supergravity context and compute the inflationary predictions of the model. We also study reheating and non-thermal leptogenesis in this model. A numerical simulation shows that shortly after the waterfall phase transition that ends inflation, the universe is dominated by right-handed sneutrinos and their out-of-equilibrium decay can produce the desired matter-antimatter asymmetry. Using a simplified time-averaged description, we derive analytical expressions for the model predictions. Combining the results from inflation and leptogenesis allows us to constrain the allowed parameter space from two different directions, with implications for low energy neutrino physics. As a second thread of investigation, we discuss a generalisation of the inflationary model discussed above to include gauge non-singlet fields as inflatons. This is motivated by the fact that in left-right symmetric, supersymmetric Grand Unified Theories (SUSY GUTs), like SUSY Pati-Salam unification or SUSY SO(10) GUTs, the right-handed (s)neutrino is an indispensable ingredient and does not have to be put in by hand as in the MSSM. We discuss the

  1. Review of Education in Mathematics, Data Science and Quantitative Disciplines: Report to the Group of Eight Universities

    Science.gov (United States)

    Brown, Gavin

    2009-01-01

    The Reference Committee firmly shares the view that the state of the mathematical sciences and related quantitative disciplines in Australia has deteriorated to a dangerous level, and continues to deteriorate. Accordingly the author decided to structure this Report around a small number of recommendations, some long term and others to address…

  2. Tactile Architectural Models as Universal ‘Urban Furniture’

    Science.gov (United States)

    Kłopotowska, Agnieszka

    2017-10-01

    Tactile architectural models and maquettes have been built in the external public spaces of Polish cities since the latter half of the 2000s. These objects are designed for the blind, but also for other people - tourists, children, and those who arrive in wheelchairs. The collection currently comprises more than 70 such objects, which places Poland in the group of European leaders. Unfortunately, this "furniture" is not always "convenient" and safe for all recipients. Studies, which have been conducted together with Maciej Kłopotowski since 2016 across the country, show a number of serious design and execution mistakes as well as examples of misuse. The purpose of this article is to draw attention to these issues and to point out ways in which they can be avoided. These objects may then become fully valuable, universal tools for learning and a great way of studying architecture in an alternative way.

  3. Universal dS vacua in STU-models

    Energy Technology Data Exchange (ETDEWEB)

    Blåbäck, J. [Institut de Physique Théorique, Université Paris Saclay, CEA, CNRS, F-91191 Gif-sur-Yvette Cedex (France); Danielsson, U.H.; Dibitetto, G.; Vargas, S.C. [Institutionen för fysik och astronomi, University of Uppsala, Box 803, SE-751 08 Uppsala (Sweden)

    2015-10-09

    Stable de Sitter solutions in minimal F-term supergravity are known to lie close to Minkowski critical points. We consider a class of STU-models arising from type IIB compactifications with generalised fluxes. There, we apply an analytical method for solving the equations of motion for the moduli fields based on the idea of treating derivatives of the superpotential of different orders up to third as independent objects. In particular, supersymmetric and no-scale Minkowski solutions are singled out by physical reasons. Focusing on the study of dS vacua close to supersymmetric Minkowski points, we are able to elaborate a complete analytical treatment of the mass matrix based on the sGoldstino bound. This leads to a class of interesting universal dS vacua. We finally explore a similar possibility around no-scale Minkowski points and discuss some examples.

  4. A universal, fault-tolerant, non-linear analytic network for modeling and fault detection

    International Nuclear Information System (INIS)

    Mott, J.E.; King, R.W.; Monson, L.R.; Olson, D.L.; Staffon, J.D.

    1992-01-01

    The similarities and differences of a universal network to normal neural networks are outlined. The description and application of a universal network is discussed by showing how a simple linear system is modeled by normal techniques and by universal network techniques. A full implementation of the universal network as universal process modeling software on a dedicated computer system at EBR-II is described and example results are presented. It is concluded that the universal network provides different feature recognition capabilities than a neural network and that the universal network can provide extremely fast, accurate, and fault-tolerant estimation, validation, and replacement of signals in a real system

  5. A universal, fault-tolerant, non-linear analytic network for modeling and fault detection

    Energy Technology Data Exchange (ETDEWEB)

    Mott, J.E. [Advanced Modeling Techniques Corp., Idaho Falls, ID (United States); King, R.W.; Monson, L.R.; Olson, D.L.; Staffon, J.D. [Argonne National Lab., Idaho Falls, ID (United States)

    1992-03-06

    The similarities and differences of a universal network to normal neural networks are outlined. The description and application of a universal network is discussed by showing how a simple linear system is modeled by normal techniques and by universal network techniques. A full implementation of the universal network as universal process modeling software on a dedicated computer system at EBR-II is described and example results are presented. It is concluded that the universal network provides different feature recognition capabilities than a neural network and that the universal network can provide extremely fast, accurate, and fault-tolerant estimation, validation, and replacement of signals in a real system.

  6. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies at the interface of biology, chemistry, and informatics. Most currently used drugs are small molecules that interact with proteins, so understanding protein-ligand interaction is central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship (QMSPR) modeling approach are to exploit sparse/incomplete information sources and to obtain more general models covering larger parts of the protein-ligand space than traditional approaches, which focus mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand interaction modeling. An overview of the modeling process is presented, including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning model induction and validation. Concerns and issues specific to each step in this kind of data-driven modeling are discussed. © 2011 Bentham Science Publishers
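
    In code, the proteochemometric setup amounts to concatenating protein and ligand descriptor blocks into a single design matrix, one row per protein-ligand pair. Everything in the sketch below (descriptor counts, the synthetic interaction term, the choice of a random forest) is an illustrative assumption rather than a method the review prescribes.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(5)
        protein_desc = rng.normal(size=(400, 20))   # e.g. sequence features
        ligand_desc = rng.normal(size=(400, 30))    # e.g. topological indices
        affinity = (protein_desc[:, 0] * ligand_desc[:, 0]
                    + rng.normal(0.0, 0.2, 400))    # synthetic cross-term

        X = np.hstack([protein_desc, ligand_desc])  # one row per pair
        Xtr, Xte, ytr, yte = train_test_split(X, affinity, random_state=0)
        model = RandomForestRegressor(n_estimators=300, random_state=0)
        model.fit(Xtr, ytr)
        print(f"held-out R^2: {model.score(Xte, yte):.2f}")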

  7. The quark mass spectrum in the Universal Seesaw model

    International Nuclear Information System (INIS)

    Ranfone, S.

    1993-03-01

    In the context of a Universal Seesaw model implemented in a left-right symmetric theory, we show that, by allowing the two left-handed doublet Higgs fields to develop different vacuum expectation values (VEVs), it is possible to account for the observed structure of the quark mass spectrum without the need of any hierarchy among the Yukawa couplings. In this framework the top-quark mass is expected to be of the order of its present experimental lower bound, m_t ≅ 90 to 100 GeV. Moreover, we find that, while one of the Higgs doublets gets essentially the standard model VEV of approximately 250 GeV, the second doublet is expected to have a much smaller VEV, of order 10 GeV. The identification of the large mass scale of the model with the Peccei-Quinn scale fixes the mass of the right-handed gauge bosons in the range 10⁷ to 10¹⁰ GeV, far beyond the reach of present collider experiments. (author)

  8. Quantitative Analysis of Intra Urban Growth Modeling using socio economic agents by combining cellular automata model with agent based model

    Science.gov (United States)

    Singh, V. K.; Jha, A. K.; Gupta, K.; Srivastav, S. K.

    2017-12-01

    Recent studies indicate that there is significant improvement in modeling urban land use dynamics at finer spatial resolutions. Geo-computational models such as cellular automata and agent-based models have provided clear evidence regarding the quantification of urban growth patterns within the urban boundary. In recent studies, socio-economic factors such as demography, education rate, household density, parcel price of the current year, and distance to roads, schools, hospitals, commercial centers and police stations are considered the major factors influencing the land use land cover (LULC) pattern of a city. These factors have a unidirectional relation to the land use pattern, which makes it difficult to analyze the spatial aspects of model results both quantitatively and qualitatively. In this study, a cellular automata model is combined with an agent-based model to evaluate the impact of socio-economic factors on the land use pattern. For this purpose, Dehradun, an Indian city, is selected as a case study. Socio-economic factors were collected from a field survey, the Census of India, and the Directorate of Economic Census, Uttarakhand, India. A 3x3 simulating window is used to consider the impact on LULC. The cellular automata model results are examined for the identification of hot spot areas within the urban area, and the agent-based model uses a logistic regression approach to identify the correlation between each factor and LULC and to classify the available area into low-density residential, medium-density residential, high-density residential or commercial areas. In the modeling phase, transition rules, neighborhood effects and cell change factors are used to improve the representation of built-up classes. A significant improvement is observed in the built-up classes, from 84% to 89%. After incorporating the agent-based model with the cellular automata model, the accuracy improved further, from 89% to 94%, in three urban classes, i.e., low density, medium density and commercial.
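
    A toy NumPy version of one transition step with the 3x3 simulating window mentioned above is shown below; the random suitability layer stands in for the socio-economic drivers, and the weights and threshold are invented rather than calibrated.

        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(1)
        urban = (rng.random((100, 100)) < 0.05).astype(float)  # seed cells
        suitability = rng.random((100, 100))   # proxy socio-economic driver

        def step(urban, suitability, threshold=0.3):
            """One CA transition: 3x3 neighbourhood share plus driver score."""
            neighbours = uniform_filter(urban, size=3)   # 3x3 built-up share
            score = 0.7 * neighbours + 0.3 * suitability
            return np.maximum(urban, (score > threshold).astype(float))

        for _ in range(10):
            urban = step(urban, suitability)
        print(f"built-up fraction after 10 steps: {urban.mean():.2%}")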

  9. An Elaboration of a Strategic Alignment Model of University Information Systems based on SAM Model

    Directory of Open Access Journals (Sweden)

    S. Ahriz

    2018-02-01

    Full Text Available An information system is a guarantee of a university's ability to anticipate the functions essential to its development and durability. The alignment of the information system, one of the pillars of IT governance, has become a necessity. In this paper, we consider the problem of implementing a strategic alignment model in Moroccan universities. The literature reveals that few studies have examined strategic alignment in the public sector, particularly in higher education institutions. Hence we opted for an exploratory approach that aims at a better understanding of strategic alignment and at evaluating the degree of its use within Moroccan universities. The data, gained primarily through interviews with top managers and IT managers, reveal that alignment is not formalized and that it would be appropriate to implement an alignment model. It is found that the implementation of our proposed model can help managers to maximize returns on IT investment and to increase their efficiency.

  10. A quantitative analysis of instabilities in the linear chiral sigma model

    International Nuclear Information System (INIS)

    Nemes, M.C.; Nielsen, M.; Oliveira, M.M. de; Providencia, J. da

    1990-08-01

    We present a method to construct a complete set of stationary states corresponding to small amplitude motion which naturally includes the continuum solution. The energy-weighted sum rule (EWSR) is shown to provide a quantitative criterion on the importance of instabilities, which are known to occur in non-asymptotically free theories. Our results for the linear σ model should be valid for a large class of models. A unified description of baryon and meson properties in terms of the linear σ model is also given. (author)

  11. A quantitative approach to modeling the information processing of NPP operators under input information overload

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task under input information overload. We first develop an information processing model with multiple stages that captures the information flow. Then the uncertainty of the information is quantified using Conant's model, an information-theoretic approach. We also investigate the applicability of this approach to quantifying the information reduction achieved by operators under input information overload.
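
    As a minimal sketch of the quantification step, the Shannon entropy of the operator's input stream can be computed directly; the alarm-category distribution below is an invented example, and Conant's analysis further partitions such quantities into flow components (throughput, blockage, noise), which is not shown here.

        import math

        def entropy(probabilities):
            """Shannon entropy H = -sum(p * log2 p), in bits per observation."""
            return -sum(p * math.log2(p) for p in probabilities if p > 0)

        # Illustrative distribution over alarm categories reaching an operator:
        alarm_distribution = [0.5, 0.25, 0.125, 0.125]
        print(f"input uncertainty: {entropy(alarm_distribution):.2f} bits")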

  12. Parts of the Whole: Strategies for the Spread of Quantitative Literacy: What Models Can Tell Us

    Directory of Open Access Journals (Sweden)

    Dorothy Wallace

    2014-07-01

    Full Text Available Two conceptual frameworks, one from graph theory and one from dynamical systems, have been offered as explanations for complex phenomena in biology and also as possible models for the spread of ideas. The two models are based on different assumptions and thus predict quite different outcomes for the fate of either biological species or ideas. We argue that, depending on the culture in which they exist, one can identify which model is more likely to reflect the survival of two competing ideas. Based on this argument we suggest how two strategies for embedding and normalizing quantitative literacy in a given institution are likely to succeed or fail.

  13. A quantitative and dynamic model of the Arabidopsis flowering time gene regulatory network.

    Directory of Open Access Journals (Sweden)

    Felipe Leal Valentim

    Full Text Available Various environmental signals integrate into a network of floral regulatory genes, leading to the final decision on when to flower. Although a wealth of qualitative knowledge is available on how flowering time genes regulate each other, only a few studies have incorporated this knowledge into predictive models. Such models are invaluable, as they make it possible to investigate how various types of inputs are combined to give a quantitative readout. To investigate the effect of gene expression disturbances on flowering time, we developed a dynamic model for the regulation of flowering time in Arabidopsis thaliana. Model parameters were estimated based on expression time-courses for relevant genes and a consistent set of flowering times for plants of various genetic backgrounds. Validation was performed by predicting changes in expression level in mutant backgrounds and comparing these predictions with independent expression data, and by comparing predicted and experimental flowering times for several double mutants. Remarkably, the model predicts that a disturbance in a particular gene does not necessarily have the largest impact on directly connected genes. For example, the model predicts that a SUPPRESSOR OF OVEREXPRESSION OF CONSTANS 1 (SOC1) mutation has a larger impact on APETALA1 (AP1), which is not directly regulated by SOC1, than on LEAFY (LFY), which is under direct control of SOC1. This was confirmed by expression data. Another model prediction involves the importance of cooperativity in the regulation of AP1 by LFY, a prediction supported by experimental evidence. Concluding, our model for flowering time gene regulation makes it possible to address how different quantitative inputs are combined into one quantitative output, flowering time.
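
    A hedged sketch of one building block of such a model is given below: AP1 activation by LFY written as a Hill-type ODE, where the Hill exponent expresses the cooperativity highlighted above. All parameter values are invented; the published model couples many equations of this kind and fits their parameters to expression time-courses.

        import numpy as np
        from scipy.integrate import solve_ivp

        def ap1_dynamics(t, y, lfy, n_hill=2.0, k=1.0, decay=0.1):
            """dAP1/dt = Hill-type activation by LFY minus linear decay."""
            production = lfy**n_hill / (k**n_hill + lfy**n_hill)
            return [production - decay * y[0]]

        sol = solve_ivp(ap1_dynamics, (0.0, 50.0), [0.0], args=(1.5,))
        steady = (1.5**2 / (1.0 + 1.5**2)) / 0.1   # analytic fixed point
        print(f"AP1 at t=50: {sol.y[0, -1]:.2f} (analytic limit {steady:.2f})")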

  14. Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.

    Science.gov (United States)

    Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia

    2013-10-02

    Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This peculiarity is related to the inhibition of extracellular enzymes that occurs through the complexation of peptides by tannins. Not only the nature of these interactions, but more fundamentally also the structure of these heterogeneous polyphenolic molecules are not completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins on the basis of tannin model compounds employing an in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. The ³¹P NMR analysis of ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundations for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.

  15. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    Science.gov (United States)

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relation between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  16. Refining the statistical model for quantitative immunostaining of surface-functionalized nanoparticles by AFM.

    Science.gov (United States)

    MacCuspie, Robert I; Gorka, Danielle E

    2013-10-01

    Recently, an atomic force microscopy (AFM)-based approach for quantifying the number of biological molecules conjugated to a nanoparticle surface at low number densities was reported. The number of target molecules conjugated to the analyte nanoparticle can be determined with single nanoparticle fidelity using antibody-mediated self-assembly to decorate the analyte nanoparticles with probe nanoparticles (i.e., quantitative immunostaining). This work refines the statistical models used to quantitatively interpret the observations when AFM is used to image the resulting structures. The refinements add terms to the previous statistical models to account for the physical sizes of the analyte nanoparticles, conjugated molecules, antibodies, and probe nanoparticles. Thus, a more physically realistic statistical computation can be implemented for a given sample of known qualitative composition, using the software scripts provided. Example AFM data sets, using horseradish peroxidase conjugated to gold nanoparticles, are presented to illustrate how to implement this method successfully.
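
    At its core, the statistical interpretation rests on counting statistics: the number of probe nanoparticles decorating each analyte particle can, to first order, be treated as Poisson distributed, as in the sketch below with invented counts. The refinements reported here add terms for the physical sizes of the particles and molecules, which this simple sketch omits.

        import numpy as np
        from scipy.stats import poisson

        # Probe nanoparticles counted per analyte particle in AFM images:
        counts = np.array([0, 1, 2, 1, 0, 3, 1, 2, 0, 1])   # invented data
        lam = counts.mean()           # maximum-likelihood Poisson rate
        print(f"estimated mean conjugation number: {lam:.2f}")
        print("P(k probes), k=0..3:", np.round(poisson.pmf(np.arange(4), lam), 3))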

  17. Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model

    Science.gov (United States)

    Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi

    2017-09-01

    Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten human health, so fast and sensitive techniques for detecting such residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was used in this paper for the quantitative determination of AO, combined with an improved partial least-squares regression (PLSR) model. The absorbance of herbal samples with different concentrations was obtained by THz-TDS in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between different concentrations, provided a clear criterion for input interval selection, and improved the accuracy of the detection result. The experimental results indicate that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
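
    A minimal NumPy sketch of the synchronous 2D correlation spectrum (Noda's formalism) behind the 2DCOS step is shown below; the concentration series and the single Gaussian absorption band are synthetic stand-ins for the measured AO spectra.

        import numpy as np

        rng = np.random.default_rng(7)
        concentrations = np.linspace(0.1, 1.0, 8)       # perturbation variable
        freqs = np.linspace(0.2, 1.6, 200)              # THz axis
        band = np.exp(-((freqs - 0.9) / 0.05) ** 2)     # AO-like absorption band
        spectra = np.outer(concentrations, band) + rng.normal(0, 0.01, (8, 200))

        dynamic = spectra - spectra.mean(axis=0)        # mean-centred spectra
        synchronous = dynamic.T @ dynamic / (len(concentrations) - 1)
        i, j = np.unravel_index(np.argmax(synchronous), synchronous.shape)
        print(f"strongest auto-peak near {freqs[i]:.2f} x {freqs[j]:.2f} THz")

    Bands whose intensities co-vary with concentration show up as strong synchronous peaks, which is what guides the input-interval selection for the PLSR model.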

  18. Integration of CFD codes and advanced combustion models for quantitative burnout determination

    Energy Technology Data Exchange (ETDEWEB)

    Javier Pallares; Inmaculada Arauzo; Alan Williams [University of Zaragoza, Zaragoza (Spain). Centre of Research for Energy Resources and Consumption (CIRCE)

    2007-10-15

    CFD codes and advanced kinetics combustion models are extensively used to predict coal burnout in large utility boilers. Modelling approaches based on CFD codes can accurately solve the fluid dynamics equations involved in the problem but this is usually achieved by including simple combustion models. On the other hand, advanced kinetics combustion models can give a detailed description of the coal combustion behaviour by using a simplified description of the flow field, this usually being obtained from a zone-method approach. Both approximations describe correctly general trends on coal burnout, but fail to predict quantitative values. In this paper a new methodology which takes advantage of both approximations is described. In the first instance CFD solutions were obtained of the combustion conditions in the furnace in the Lamarmora power plant (ASM Brescia, Italy) for a number of different conditions and for three coals. Then, these furnace conditions were used as inputs for a more detailed chemical combustion model to predict coal burnout. In this, devolatilization was modelled using a commercial macromolecular network pyrolysis model (FG-DVC). For char oxidation an intrinsic reactivity approach including thermal annealing, ash inhibition and maceral effects, was used. Results from the simulations were compared against plant experimental values, showing a reasonable agreement in trends and quantitative values. 28 refs., 4 figs., 4 tabs.

  19. Cancer imaging phenomics toolkit: quantitative imaging analytics for precision diagnostics and predictive modeling of clinical outcome.

    Science.gov (United States)

    Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina

    2018-01-01

    The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.

  20. Qualitative and quantitative guidelines for the comparison of environmental model predictions

    International Nuclear Information System (INIS)

    Scott, M.

    1995-03-01

    The question of how to assess or compare predictions from a number of models is one of concern in the validation of models, in understanding the effects of different models and model parameterizations on model output, and ultimately in assessing model reliability. Comparison of model predictions with observed data is the basic tool of model validation while comparison of predictions amongst different models provides one measure of model credibility. The guidance provided here is intended to provide qualitative and quantitative approaches (including graphical and statistical techniques) to such comparisons for use within the BIOMOVS II project. It is hoped that others may find it useful. It contains little technical information on the actual methods but several references are provided for the interested reader. The guidelines are illustrated on data from the VAMP CB scenario. Unfortunately, these data do not permit all of the possible approaches to be demonstrated since predicted uncertainties were not provided. The questions considered are concerned with a) intercomparison of model predictions and b) comparison of model predictions with the observed data. A series of examples illustrating some of the different types of data structure and some possible analyses have been constructed. A bibliography of references on model validation is provided. It is important to note that the results of the various techniques discussed here, whether qualitative or quantitative, should not be considered in isolation. Overall model performance must also include an evaluation of model structure and formulation, i.e. conceptual model uncertainties, and results for performance measures must be interpreted in this context. Consider a number of models which are used to provide predictions of a number of quantities at a number of time points. In the case of the VAMP CB scenario, the results include predictions of total deposition of Cs-137 and time dependent concentrations in various
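
    On the quantitative side, the simplest of the measures such guidelines cover can be sketched directly; the observations and model predictions below are invented, and the geometric-mean bias is included because environmental concentrations are often treated as lognormal.

        import numpy as np

        observed = np.array([12.0, 8.5, 15.2, 9.8, 11.1])
        model_a = np.array([11.0, 9.0, 14.0, 10.5, 12.0])
        model_b = np.array([14.0, 7.0, 18.0, 8.0, 13.5])

        def summarize(pred, obs, name):
            """Report additive bias, RMSE and geometric-mean bias."""
            bias = np.mean(pred - obs)
            rmse = np.sqrt(np.mean((pred - obs) ** 2))
            gm_bias = np.exp(np.mean(np.log(pred / obs)))
            print(f"{name}: bias={bias:+.2f}, RMSE={rmse:.2f}, GM={gm_bias:.2f}")

        summarize(model_a, observed, "model A")
        summarize(model_b, observed, "model B")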

  1. Designing a Mathematical Model for Allocating Budget to University Research and Educational Goals: A Case Study in Shahed University

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2012-07-01

    Full Text Available Institutions of higher education, both public and private, are among the most important institutions of a country. Several economic factors have forced them to act to improve the cost-effectiveness of their activities, and high quality of their products (outputs) is strongly expected. Such issues have led universities to focus on profit-making activities and commercialization like manufacturing industries. This propensity is grounded in the fact that manufacturing industries working under an efficient management system can produce very high-quality products. As a matter of fact, there is no such model for academic contexts; therefore, this paper is aimed at offering one. The coefficients and constants used in the model have all been extracted from an analysis of the research and educational aspects of Shahed University. The proposed model is a lexicographic model with thirty-six decision variables, broken down into two classes: university source variables (fifteen) and university product variables (twenty-one). The model also includes forty-nine goals, seven structural constraints and twenty integer variables. At the end of the paper, the current situation is compared with the recommended one, showing that many of the variables are suboptimal, except the numbers of research and educational officials (S9), graduate night-course students (P7) and PhD night-course students (P9). The comprehensiveness of this model enables managers to plan even the smallest research and educational activities, and the solutions can be used by managers as applied guidelines.

  2. A quantitative dynamic systems model of health-related quality of life among older adults

    Science.gov (United States)

    Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela

    2015-01-01

    Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722
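
    A minimal sketch of the kind of dynamic systems model described, assuming a van Geert-style pair of coupled logistic growers; the two HRQOL components, parameter values, and time horizon are illustrative assumptions, not the fitted model from the study.

```python
# Two coupled logistic growers iterated from a starting point (e.g., a
# participant's first empirical data point). Structure and parameters are
# illustrative assumptions only.
import numpy as np

def simulate(physical0, mental0, steps=52,
             r_p=0.08, r_m=0.06, K=1.0, c_pm=0.02, c_mp=0.03):
    p, m = np.empty(steps), np.empty(steps)
    p[0], m[0] = physical0, mental0
    for t in range(steps - 1):
        # Each component grows logistically and is pulled by the other.
        p[t + 1] = p[t] + r_p * p[t] * (1 - p[t] / K) + c_mp * (m[t] - p[t])
        m[t + 1] = m[t] + r_m * m[t] * (1 - m[t] / K) + c_pm * (p[t] - m[t])
    return p, m

p, m = simulate(0.55, 0.70)   # trajectory toward a predicted endpoint
print(p[-1], m[-1])
```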

  3. Fixing the cracks in the crystal ball: A maturity model for quantitative risk assessment

    International Nuclear Information System (INIS)

    Rae, Andrew; Alexander, Rob; McDermid, John

    2014-01-01

    Quantitative risk assessment (QRA) is widely practiced in system safety, but there is insufficient evidence that QRA in general is fit for purpose. Defenders of QRA draw a distinction between poor or misused QRA and correct, appropriately used QRA, but this distinction is only useful if we have robust ways to identify the flaws in an individual QRA. In this paper we present a comprehensive maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature and in a collection of risk assessment peer reviews. We provide initial validation of the completeness and realism of the model. Our risk assessment maturity model provides a way to prioritise both process development within an organisation and empirical research within the QRA community. - Highlights: • Quantitative risk assessment (QRA) is widely practiced, but there is insufficient evidence that it is fit for purpose. • A given QRA may be good, or it may not – we need systematic ways to distinguish this. • We have created a maturity model for QRA which covers all the potential flaws discussed in the risk assessment literature. • We have provided initial validation of the completeness and realism of the model. • The maturity model can also be used to prioritise QRA research discipline-wide

  4. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    Science.gov (United States)

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after the operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), the quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), the micro-CT parameter (BMD), and the histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and the other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between the DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p < 0.05). [...] Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter Ktrans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
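
    For orientation, the quantitative parameters named here (Ktrans, Ve, Kep) are conventionally linked through the standard Tofts pharmacokinetic model, in which the tissue concentration follows from the arterial input function; the record does not spell out its exact model, so the relation below is the textbook form:

    $$C_t(t) = K^{trans}\int_0^t C_p(\tau)\,e^{-k_{ep}(t-\tau)}\,d\tau, \qquad k_{ep}=\frac{K^{trans}}{v_e}.$$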

  5. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Many of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain, and the goal of the research is to combine the quantitative information across images to make inferences about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  6. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems, such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  7. Comparison of conventional, model-based quantitative planar, and quantitative SPECT image processing methods for organ activity estimation using In-111 agents

    International Nuclear Information System (INIS)

    He, Bin; Frey, Eric C

    2006-01-01

    Accurate quantification of organ radionuclide uptake is important for patient-specific dosimetry. The quantitative accuracy of conventional conjugate view methods is limited by the overlap of projections from different organs and background activity, and by attenuation and scatter. In this work, we propose and validate a quantitative planar (QPlanar) processing method based on maximum likelihood (ML) estimation of organ activities using 3D organ VOIs and a projector that models the image degrading effects. Both a physical phantom experiment and Monte Carlo simulation (MCS) studies were used to evaluate the new method. In these studies, the accuracies and precisions of organ activity estimates for the QPlanar method were compared with those from conventional planar (CPlanar) processing methods with various corrections for scatter, attenuation and organ overlap, and a quantitative SPECT (QSPECT) processing method. Experimental planar and SPECT projections and registered CT data from an RSD Torso phantom were obtained using a GE Millennium VH/Hawkeye system. The MCS data were obtained from the 3D NCAT phantom with organ activity distributions that modelled the uptake of In-111 ibritumomab tiuxetan. The simulations were performed using parameters appropriate for the same system used in the RSD torso phantom experiment. The organ activity estimates obtained from the CPlanar, QPlanar and QSPECT methods from both experiments were compared. From the results of the MCS experiment, even with ideal organ overlap correction and background subtraction, CPlanar methods provided limited quantitative accuracy. The QPlanar method with accurate modelling of the physical factors increased the quantitative accuracy at the cost of requiring estimates of the organ VOIs in 3D. The accuracy of QPlanar approached that of QSPECT, but required much less acquisition and computation time. Similar results were obtained from the physical phantom experiment. We conclude that the QPlanar method, based
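
    The maximum-likelihood step in such a method can be sketched with a generic Poisson EM update, where a small random system matrix stands in for the physical projector (attenuation, scatter, organ overlap) described in the record; everything below is a schematic with synthetic numbers, not the authors' implementation.

```python
# Schematic Poisson ML (EM) estimation of a few organ activities from a
# planar projection. A maps organ activities to expected pixel counts and
# stands in for the physical projector; all values are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_org = 500, 3
A = rng.uniform(0.0, 1.0, (n_pix, n_org))   # organ "footprints" in the image
true_act = np.array([100.0, 40.0, 250.0])
y = rng.poisson(A @ true_act)                # measured projection counts

act = np.ones(n_org)                         # initial activity estimates
sens = A.sum(axis=0)                         # per-organ sensitivity
for _ in range(200):
    expected = A @ act
    # Multiplicative EM update for the Poisson likelihood.
    act *= (A.T @ (y / np.maximum(expected, 1e-12))) / sens

print(np.round(act, 1))                      # approaches true_act
```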

  8. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for the multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from the UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of the spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and the absence or inaccessibility of reference materials.
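
    The general workflow, unmixing calibration spectra with ICA and then scaling the recovered source weights to absolute concentrations, can be sketched as follows; the Gaussian "spectra" and concentration ranges are synthetic placeholders, not the paper's data.

```python
# Sketch of ICA-based quantification of overlapping spectra on synthetic data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
wl = np.linspace(200, 400, 300)                  # wavelength grid, nm
s1 = np.exp(-((wl - 260) / 15) ** 2)             # pure component 1
s2 = np.exp(-((wl - 280) / 20) ** 2)             # pure component 2 (overlapping)
C = rng.uniform(0.1, 1.0, (20, 2))               # calibration concentrations
X = C @ np.vstack([s1, s2])                      # Beer-Lambert: additive mixing

ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(X)                    # per-sample source weights
# Scale the ICA scores to concentrations with a least-squares calibration
# step (intercept included because FastICA centers the data).
design = np.column_stack([scores, np.ones(len(scores))])
coef, *_ = np.linalg.lstsq(design, C, rcond=None)
pred = design @ coef
print(np.abs(pred - C).max())                    # near-zero on the calibration set
```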

  9. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  10. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    Science.gov (United States)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  11. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong; Zhao, Weishu; Chang, Frank; Dyer, Steve

    2013-01-01

    Conventional wormhole propagation models largely ignore the impact of reaction products. When such models are implemented in a job design, significant errors can result in the treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT scan rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  12. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    Science.gov (United States)

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to amplify its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence Model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  13. Quantitative structure-activity relationship (QSAR) for insecticides: development of predictive in vivo insecticide activity models.

    Science.gov (United States)

    Naik, P K; Singh, T; Singh, H

    2009-07-01

    Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors including topological, spatial, thermodynamic, information content, lead likeness and E-state indices were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests as well as the use of a genetic algorithm allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability with r(2) values of 0.949 and 0.838 as well as [image omitted] values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data with r(2) of 0.871 and 0.788 for the organophosphate and carbamate groups, respectively, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. QSAR models developed in this study should help further design of novel potent insecticides.

  14. Nuclear security culture: a generic model for universal application

    International Nuclear Information System (INIS)

    Khripunov, I.

    2005-01-01

    Full text: Nuclear security culture found its way into professional parlance several years ago, but still lacks an agreed-upon definition and description. The February 2005 U.S.-Russian Joint Statement, issued at the presidential summit meeting in Bratislava, referred specifically to security culture, focusing renewed attention on the concept. Numerous speakers at the March 2005 International Atomic Energy Agency's (IAEA) international conference on nuclear security referred to security culture, but their visions and interpretations were often at odds with one another. Clearly, there is a need for a generic model of nuclear security culture with universal applicability. Internationally acceptable standards in this area would be invaluable for evaluation, comparison, cooperation, and assistance. They would also help international bodies better manage their relations with the nuclear sectors in various countries. This paper will develop such a model. It will use the IAEA definition of nuclear security, and then apply Edgar Schein's model of organizational culture to security culture at a generic nuclear facility. A cultural approach to physical protection involves determining what attitudes and beliefs need to be established in an organization, how these attitudes and beliefs manifest themselves in the behavior of assigned personnel, and how desirable attitudes and beliefs can be transcribed into formal working methods to produce good outcomes, i.e., effective protection. The security-culture mechanism I will propose is broken into four major units: facility leadership, proactive policies and procedures, personnel performance, and learning and professional improvement. The paper elaborates on the specific traits characteristic of each of these units. Security culture is not a panacea. In a time of mounting terrorist threats, it should nonetheless be looked upon as a necessary organizational tool that enhances the skills of nuclear personnel and ensures that

  15. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model...... provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easy-readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept...... detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken....

  16. Towards a Model of a Critical Pedagogy in Malawian Universities ...

    African Journals Online (AJOL)

    Quality university education is important for achieving national aspirations as stated in higher education policy frameworks in Malawi. The major education policy documents in Malawi, the Policy and Investment Framework and the Malawi National Education Sector Plan, recognise the importance of university education for ...

  17. Efficient Generation and Selection of Virtual Populations in Quantitative Systems Pharmacology Models.

    Science.gov (United States)

    Allen, R J; Rieger, T R; Musante, C J

    2016-03-01

    Quantitative systems pharmacology models mechanistically describe a biological system and the effect of drug treatment on system behavior. Because these models rarely are identifiable from the available data, the uncertainty in physiological parameters may be sampled to create alternative parameterizations of the model, sometimes termed "virtual patients." In order to reproduce the statistics of a clinical population, virtual patients are often weighted to form a virtual population that reflects the baseline characteristics of the clinical cohort. Here we introduce a novel technique to efficiently generate virtual patients and, from this ensemble, demonstrate how to select a virtual population that matches the observed data without the need for weighting. This approach improves confidence in model predictions by mitigating the risk that spurious virtual patients become overrepresented in virtual populations.
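
    The selection idea, keeping a subset of virtual patients so that the retained ensemble matches an observed distribution without per-patient weights, can be sketched as acceptance/rejection against a target density; the single scalar model output and normal clinical target below are illustrative assumptions.

```python
# Sketch: accept/reject virtual patients so the retained subset matches an
# observed distribution instead of weighting them. Synthetic setup only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
virtual = rng.uniform(50, 200, 20_000)   # plausible-patient model outputs
target = stats.norm(120, 15)             # observed clinical distribution

g = 1.0 / (200 - 50)                     # uniform proposal density
M = target.pdf(120) / g                  # sup f/g, attained at the mode
u = rng.uniform(size=virtual.size)
selected = virtual[u < target.pdf(virtual) / (M * g)]
print(selected.mean(), selected.std())   # approximately 120 and 15
```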

  18. Quantitative assessment of manual and robotic microcannulation for eye surgery using new eye model.

    Science.gov (United States)

    Tanaka, Shinichi; Harada, Kanako; Ida, Yoshiki; Tomita, Kyohei; Kato, Ippei; Arai, Fumihito; Ueta, Takashi; Noda, Yasuo; Sugita, Naohiko; Mitsuishi, Mamoru

    2015-06-01

    Microcannulation, a surgical procedure for the eye that requires drug injection into a 60-90 µm retinal vein, is difficult to perform manually. Robotic assistance has been proposed; however, its effectiveness in comparison to manual operation has not been quantified. An eye model has been developed to quantify the performance of manual and robotic microcannulation. The eye model, which is implemented with a force sensor and microchannels, also simulates the mechanical constraints of the instrument's movement. Ten subjects performed microcannulation using the model, with and without robotic assistance. The results showed that the robotic assistance was useful for motion stability when the drug was injected, whereas its positioning accuracy offered no advantage. An eye model was used to quantitatively assess the robotic microcannulation performance in comparison to manual operation. This approach could be valid for a better evaluation of surgical robotic assistance. Copyright © 2014 John Wiley & Sons, Ltd.

  19. Evaluation of aluminum pit corrosion in oak ridge research reactor pool by quantitative imaging and thermodynamic modeling

    International Nuclear Information System (INIS)

    Jang, Ping-Rey; Arunkumar, Rangaswami; Lindner, Jeffrey S.; Long, Zhiling; Mott, Melissa A.; Okhuysen, Walter P.; Monts, David L.; Su, Yi; Kirk, Paula G.; Ettien, John

    2007-01-01

    The Oak Ridge Research Reactor (ORRR) was operated as an isotope production and irradiation facility from March 1958 until March 1987. The US Department of Energy permanently shut down and removed the fuel from the ORRR in 1987. The water level must be maintained in the ORRR pool as shielding for radioactive components still located in the pool. The U.S. Department of Energy's Office of Environmental Management (DOE EM) needs to decontaminate and demolish the ORRR as part of the Oak Ridge cleanup program. In February 2004, increased pit corrosion was noted in the pool's 6 mm (1/4'')-thick aluminum liner in the section nearest where the radioactive components are stored. If pit corrosion has significantly penetrated the aluminum liner, then DOE EM must accelerate its decontamination and decommissioning (D&D) efforts or look for alternatives for shielding the irradiated components. The goal of Mississippi State University's Institute for Clean Energy Technology (ICET) was to provide a determination of the extent and depth of corrosion and to conduct thermodynamic modeling to determine how further corrosion can be inhibited. Results from the work will facilitate ORNL in making reliable disposition decisions. ICET's inspection approach was to quantitatively estimate the amount of corrosion by using Fourier-transform profilometry (FTP). FTP is a non-contact 3-D shape measurement technique. By projecting a fringe pattern onto a target surface and observing its deformation due to surface irregularities from a different view angle, the system is capable of determining the height (depth) distribution of the target surface, thus reproducing the profile of the target accurately. ICET has previously demonstrated that its FTP system can quantitatively estimate the volume and depth of removed and residual material to high accuracy. The results of our successful initial deployment of a submergible FTP system into the ORRR pool are reported here as are initial thermodynamic

  20. Quantitative Decision Making Model for Carbon Reduction in Road Construction Projects Using Green Technologies

    Directory of Open Access Journals (Sweden)

    Woosik Jang

    2015-08-01

    Full Text Available Numerous countries have established policies for reducing greenhouse gas emissions and have suggested goals pertaining to these reductions. To reach the target reduction amounts, studies on the reduction of carbon emissions have been conducted with regard to all stages and processes in construction projects. According to a study on carbon emissions, the carbon emissions generated during the construction stage of road projects account for approximately 76 to 86% of the total carbon emissions, far exceeding those of other stages, such as maintenance or demolition. Therefore, this study aims to develop a quantitative decision-making model that supports the application of green technologies (GTs) to reduce carbon emissions during the construction stage of road construction projects. First, the authors selected environmental soundness, economic feasibility and constructability as the key assessment indices for evaluating 20 GTs. Second, a fuzzy set/qualitative comparative analysis (FS/QCA) was used to establish an objective decision-making model for the assessment of both the quantitative and qualitative characteristics of the key indices. To support the developed model, an expert survey was performed to assess the applicability of each GT from a practical perspective, which was verified with a case study using two additional GTs. The proposed model is expected to support practitioners in the application of suitable GTs to road projects and reduce carbon emissions, resulting in better decision making during road construction projects.

  1. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    Science.gov (United States)

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to find effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using data on the process parameters in poultry and the Salmonella concentration surveillance of Jinan in 2012. The MPRM was simulated with the @Risk software. The concentration of Salmonella on carcasses after chilling, as calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the Salmonella concentration after defeathering and in the chilling pool were 0.84 and 0.34, respectively; these were the primary factors determining the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in a poultry slaughterhouse. The risk manager could control the contamination of Salmonella on carcasses after chilling by reducing its concentration after defeathering and in the chilling pool.
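
    The modular structure of such a model can be sketched by pushing a concentration distribution through per-stage changes by Monte Carlo; the stage names, means, and standard deviations below are invented placeholders, not the surveillance data used in the study.

```python
# Sketch of a modular process risk model (MPRM) run by Monte Carlo: a log10
# concentration distribution is propagated module by module. All stage
# parameters are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
conc = rng.normal(loc=1.0, scale=0.5, size=n)   # log10 MPN/g entering the line

stages = {                                       # (mean, sd) of log10 change
    "evisceration": (+0.3, 0.2),                 # cross-contamination adds counts
    "washing":      (-0.5, 0.2),
    "chilling":     (-1.2, 0.4),
}
for name, (mu, sd) in stages.items():
    conc += rng.normal(mu, sd, size=n)           # module-by-module propagation

print("median MPN/g after chilling:", round(10 ** np.median(conc), 2))
```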

  2. Quantitative Analysis of the Security of Software-Defined Network Controller Using Threat/Effort Model

    Directory of Open Access Journals (Sweden)

    Zehui Wu

    2017-01-01

    Full Text Available The SDN-based controller, which is responsible for the configuration and management of the network, is the core of Software-Defined Networks. Current methods, which focus on the security mechanism, use qualitative analysis to estimate the security of controllers, frequently leading to inaccurate results. In this paper, we employ a quantitative approach to overcome this shortcoming. Based on an analysis of the controller threat model, we give formal models of the APIs, the protocol interfaces, and the data items of the controller, and further provide our Threat/Effort quantitative calculation model. With the help of the Threat/Effort model, we are able to compare not only the security of different versions of the same kind of controller but also different kinds of controllers, and provide a basis for controller selection and secure development. We evaluated our approach on four widely used SDN-based controllers: POX, OpenDaylight, Floodlight, and Ryu. The tests, whose outcomes are similar to those of traditional qualitative analysis, demonstrate that with our approach we are able to obtain specific security values for different controllers and produce more accurate results.

  3. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    Science.gov (United States)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and the ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on an ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provides a physical explanation for the relationship between ML intensity and the stress condition. The proposed model is applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation, and could provide a useful reference for quantitative stress measurement using ML sensors in general.

  4. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    Directory of Open Access Journals (Sweden)

    Mondry Adrian

    2004-08-01

    Full Text Available Abstract Background: Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods: We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results: We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions: Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods
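
    The flavour of a cellular automaton for excitable tissue can be conveyed with a toy Greenberg-Hastings-style update (resting, excited, and refractory states on a grid); this is far simpler than the whole-heart model described, and all settings are illustrative.

```python
# Toy excitable-medium cellular automaton (Greenberg-Hastings style):
# 0 = resting, 1 = excited, 2..R = refractory.
import numpy as np

R = 5                                   # refractory period, in steps
grid = np.zeros((100, 100), dtype=int)
grid[50, 50] = 1                        # stimulate one cell

def step(g):
    new = g.copy()
    new[(g >= 1) & (g < R)] += 1        # excited/refractory cells advance
    new[g == R] = 0                     # refractory period ends, back to rest
    # A resting cell fires if any 4-neighbour is currently excited.
    excited = (g == 1)
    nbr = (np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
           np.roll(excited, 1, 1) | np.roll(excited, -1, 1))
    new[(g == 0) & nbr] = 1
    return new

for _ in range(30):
    grid = step(grid)
print((grid == 1).sum(), "cells excited after 30 steps")  # expanding wavefront
```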

  5. Toward University Modeling Instruction--Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER)…

  6. Nonparametric modeling of longitudinal covariance structure in functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yap, John Stephen; Fan, Jianqing; Wu, Rongling

    2009-12-01

    Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
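
    For orientation, the modified Cholesky device reparameterizes the covariance matrix of the repeated measures through autoregressive coefficients and innovation variances, so that estimating the covariance becomes a sequence of regressions (standard notation, stated here for context):

    $$T\Sigma T^{\top} = D, \qquad y_t = \sum_{j=1}^{t-1}\phi_{tj}\,y_j + \varepsilon_t,$$

    where $T$ is unit lower triangular with below-diagonal entries $-\phi_{tj}$ and $D=\mathrm{diag}(\sigma_1^2,\dots,\sigma_T^2)$ holds the innovation variances $\sigma_t^2=\mathrm{var}(\varepsilon_t)$; the L2 penalty mentioned in the record regularizes the $\phi_{tj}$.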

  7. Quantitative laser diagnostic and modeling study of C2 and CH chemistry in combustion.

    Science.gov (United States)

    Köhler, Markus; Brockhinke, Andreas; Braun-Unkhoff, Marina; Kohse-Höinghaus, Katharina

    2010-04-15

    Quantitative concentration measurements of CH and C2 have been performed in laminar, premixed, flat flames of propene and cyclopentene with varying stoichiometry. A combination of cavity ring-down (CRD) spectroscopy and laser-induced fluorescence (LIF) was used to enable sensitive detection of these species with high spatial resolution. Previously, CH and C2 chemistry had been studied, predominantly in methane flames, to understand potential correlations of their formation and consumption. For flames of larger hydrocarbon fuels, however, quantitative information on these small intermediates is scarce, especially under fuel-rich conditions. Also, the combustion chemistry of C2 in particular has not been studied in detail, and although it has often been observed, its role in potential build-up reactions of higher hydrocarbon species is not well understood. The quantitative measurements performed here are the first to detect both species with good spatial resolution and high sensitivity in the same experiment in flames of C3 and C5 fuels. The experimental profiles were compared with results of combustion modeling to reveal details of the formation and consumption of these important combustion molecules, and the investigation was devoted to assisting further understanding of the role of C2 and of its potential chemical interdependences with CH and other small radicals.

  8. Establishing a Business Process Reference Model for Universities

    KAUST Repository

    Svensson, Carsten; Hvolby, Hans-Henrik

    2012-01-01

    Process reference models such as SCOR (Supply Chain Operations Reference), DCOR (Design Chain Operations Reference) and ITIL (Information Technology Infrastructure Library) have gained popularity among organizations in both the private and public sectors. We speculate that this success can be replicated in a university

  9. Using Predictive Modelling to Identify Students at Risk of Poor University Outcomes

    Science.gov (United States)

    Jia, Pengfei; Maloney, Tim

    2015-01-01

    Predictive modelling is used to identify students at risk of failing their first-year courses and not returning to university in the second year. Our aim is twofold. Firstly, we want to understand the factors that lead to poor first-year experiences at university. Secondly, we want to develop simple, low-cost tools that would allow universities to…

  10. Quantitative model for the generic 3D shape of ICMEs at 1 AU

    Science.gov (United States)

    Démoulin, P.; Janvier, M.; Masías-Meza, J. J.; Dasso, S.

    2016-10-01

    Context. Interplanetary imagers provide 2D projected views of the densest plasma parts of interplanetary coronal mass ejections (ICMEs), while in situ measurements provide magnetic field and plasma parameter measurements along the spacecraft trajectory, that is, along a 1D cut. The data therefore only give a partial view of the 3D structures of ICMEs. Aims: By studying a large number of ICMEs, crossed at different distances from their apex, we develop statistical methods to obtain a quantitative generic 3D shape of ICMEs. Methods: In a first approach we theoretically obtained the expected statistical distribution of the shock-normal orientation from assuming simple models of 3D shock shapes, including distorted profiles, and compared their compatibility with observed distributions. In a second approach we used the shock normal and the flux rope axis orientations together with the impact parameter to provide statistical information across the spacecraft trajectory. Results: The study of different 3D shock models shows that the observations are compatible with a shock that is symmetric around the Sun-apex line as well as with an asymmetry up to an aspect ratio of around 3. Moreover, flat or dipped shock surfaces near their apex can only be rare cases. Next, the sheath thickness and the ICME velocity have no global trend along the ICME front. Finally, regrouping all these new results and those of our previous articles, we provide a quantitative ICME generic 3D shape, including the global shape of the shock, the sheath, and the flux rope. Conclusions: The obtained quantitative generic ICME shape will have implications for several aims. For example, it constrains the output of typical ICME numerical simulations. It is also a base for studying the transport of high-energy solar and cosmic particles during an ICME propagation as well as for modeling and forecasting space weather conditions near Earth.

  11. Knowledge Management in Nigerian Universities: A Conceptual Model

    OpenAIRE

    Adebowale I Ojo

    2016-01-01

    Universities have traditionally been leaders in the field of knowledge production, research, and societal development. They are expected to be drivers of innovation, thereby contributing to the development of a learning society. The array of challenges facing universities in Nigeria and other developing countries forces one to question their levels of innovation. While knowledge management has been identified as a strategy for driving innovative processes in business organizations, there is a...

  12. User Requirements Model for University Timetable Management System

    OpenAIRE

    Ahmad Althunibat; Mohammad I. Muhairat

    2016-01-01

    Automated timetables are used to schedule courses, lectures and rooms in universities by considering some constraints. Inconvenient and ineffective timetables often waste time and money. Therefore, it is important to investigate the requirements and potential needs of users. Thus, eliciting user requirements of University Timetable Management System (TMS) and their implication becomes an important process for the implementation of TMS. On this note, this paper seeks to propose a m...

  13. Tumour-cell killing by X-rays and immunity quantitated in a mouse model system

    International Nuclear Information System (INIS)

    Porteous, D.D.; Porteous, K.M.; Hughes, M.J.

    1979-01-01

    As part of an investigation of the interaction of X-rays and immune cytotoxicity in tumour control, an experimental mouse model system has been used in which quantitative anti-tumour immunity was raised in prospective recipients of tumour-cell suspensions exposed to varying doses of X-rays in vitro before injection. Findings reported here indicate that, whilst X-rays kill a proportion of cells, induced immunity deals with a fixed number dependent upon the immune status of the host, and that X-rays and anti-tumour immunity do not act synergistically in tumour-cell killing. The tumour used was the ascites sarcoma BP8. (author)

  14. Impact Assessment of Abiotic Resources in LCA: Quantitative Comparison of Selected Characterization Models

    DEFF Research Database (Denmark)

    Rørbech, Jakob Thaysen; Vadenbo, Carl; Hellweg, Stefanie

    2014-01-01

    Resources have received significant attention in recent years resulting in development of a wide range of resource depletion indicators within life cycle assessment (LCA). Understanding the differences in assessment principles used to derive these indicators and the effects on the impact assessment...... results is critical for indicator selection and interpretation of the results. Eleven resource depletion methods were evaluated quantitatively with respect to resource coverage, characterization factors (CF), impact contributions from individual resources, and total impact scores. We included 2247...... groups, according to method focus and modeling approach, to aid method selection within LCA....

  15. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  16. A quantitative speciation model for the adsorption of organic pollutants on activated carbon.

    Science.gov (United States)

    Grivé, M; García, D; Domènech, C; Richard, L; Rojo, I; Martínez, X; Rovira, M

    2013-01-01

    Granular activated carbon (GAC) is commonly used as an adsorbent in water treatment plants given its high capacity for retaining organic pollutants in the aqueous phase. The current knowledge of GAC behaviour is essentially empirical, and no quantitative description of the chemical relationships between GAC surface groups and pollutants has been proposed. In this paper, we describe a quantitative model for the adsorption of atrazine onto the GAC surface. The model is based on the results of potentiometric titrations and three types of adsorption experiments, which were carried out in order to determine the nature and distribution of the functional groups on the GAC surface and to evaluate the adsorption characteristics of GAC towards atrazine. Potentiometric titrations have indicated the existence of at least two different families of chemical groups on the GAC surface, including phenolic- and benzoic-type surface groups. Adsorption experiments with atrazine have been satisfactorily modelled with the geochemical code PhreeqC, assuming that atrazine is sorbed onto the GAC surface at equilibrium (log Ks = 5.1 ± 0.5). Independent thermodynamic calculations suggest a possible adsorption of atrazine on a benzoic derivative. The present work opens a new approach for improving the adsorption capabilities of GAC towards organic pollutants by modifying its chemical properties.
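
    The reported equilibrium constant corresponds to a mass-action sorption reaction between a surface site and aqueous atrazine; written generically (the 1:1 site stoichiometry is an assumption for illustration):

    $$\equiv\!\mathrm{S} + \mathrm{Atr} \;\rightleftharpoons\; \equiv\!\mathrm{S{-}Atr}, \qquad K_s=\frac{[\equiv\!\mathrm{S{-}Atr}]}{[\equiv\!\mathrm{S}]\,[\mathrm{Atr}]}, \qquad \log K_s = 5.1 \pm 0.5.$$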

  17. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome in the binary scheme of NAEP has three parts. The first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method were each used to build the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly, making it a practical spectral analysis tool.
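
    The three-part chromosome can be illustrated with a stripped-down evolutionary loop in which one individual carries both a wavelength mask and a hidden-layer size; the (1+1)-style loop, synthetic data, and mutation rates below are stand-ins for the paper's NAEP, not a reimplementation.

```python
# Sketch of the hybrid chromosome idea: an individual encodes the network
# topology (hidden units) and a wavelength-selection mask; fitness is
# cross-validated prediction quality. Synthetic data, illustrative settings.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 60))              # 60 "wavelengths"
y = X[:, 5] + 0.5 * X[:, 20] + 0.1 * rng.normal(size=120)

def fitness(mask, hidden):
    if mask.sum() == 0:
        return -np.inf
    model = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=2000,
                         random_state=0)
    return cross_val_score(model, X[:, mask], y, cv=3, scoring="r2").mean()

mask = rng.random(60) < 0.5                 # wavelength part of the chromosome
hidden = 8                                  # topology part of the chromosome
best = fitness(mask, hidden)
for _ in range(30):                         # (1+1)-style evolutionary loop
    m2 = mask ^ (rng.random(60) < 0.05)     # flip a few wavelength bits
    h2 = max(2, hidden + rng.integers(-2, 3))
    f2 = fitness(m2, h2)
    if f2 > best:                           # keep the fitter individual
        mask, hidden, best = m2, h2, f2
print(best, mask.sum(), hidden)
```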

  18. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of an alternative nuclear fuel cycle system. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of a nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.
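
    The circuit analogy can be made explicit: just as a current is driven by an electromotive force against series resistances, the model pictures a proliferation "current" (risk) driven by a motivation index against the summed barrier resistances. The correspondence below illustrates the stated analogy only; it is not the report's exact expression:

    $$I=\frac{V}{\sum_k R_k}\quad\longleftrightarrow\quad \text{proliferation risk}\;\sim\;\frac{M}{\sum_k R_k^{\mathrm{barrier}}},$$

    where $M$ plays the role of the electromotive force and the $R_k^{\mathrm{barrier}}$ are the technical and political resistance components aggregated by the index.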

  19. Spatiotemporal microbiota dynamics from quantitative in vitro and in silico models of the gut

    Science.gov (United States)

    Hwa, Terence

    The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth behaviors, which ultimately dictate the gut microbiota composition. Combining measurements of bacterial growth physiology with analysis of published data on human physiology into a quantitative modeling framework, we show how hydrodynamic forces in the colon, in concert with other physiological factors, determine the abundances of the major bacterial phyla in the gut. Our model quantitatively explains the observed variation of microbiota composition among healthy adults, and predicts colonic water absorption (manifested as stool consistency) and nutrient intake to be two key factors determining this composition. The model further reveals that both factors, which have been identified in recent correlative studies, exert their effects through the same mechanism: changes in colonic pH that differentially affect the growth of different bacteria. Our findings show that a predictive and mechanistic understanding of microbial ecology in the human gut is possible, and offer the hope for the rational design of intervention strategies to actively control the microbiota. This work is supported by the Bill and Melinda Gates Foundation.

  20. Model development for quantitative evaluation of proliferation resistance of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Ko, Won Il; Kim, Ho Dong; Yang, Myung Seung

    2000-07-01

    This study addresses the quantitative evaluation of proliferation resistance, which is an important factor of an alternative nuclear fuel cycle system. In this study, a model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of a nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. The analysis of the proliferation resistance of nuclear fuel cycles has shown that the resistance index as defined herein can be used as an international measure of the relative risk of nuclear proliferation if the motivation index is appropriately defined. It has also shown that the proposed model can include political issues as well as technical ones relevant to proliferation resistance, and consider all facilities and activities in a specific nuclear fuel cycle (from mining to disposal). In addition, sensitivity analyses on the sample study indicate that the direct disposal option in a country with high nuclear propensity may give rise to a higher risk of nuclear proliferation than the reprocessing option in a country with low nuclear propensity.

  1. A BLUEPRINT OF SOFTWARE ENABLED QUANTITATIVE MEASUREMENT OF PROGRAMME OUTCOMES: A CASE STUDY FOR TAYLOR’S UNIVERSITY

    Directory of Open Access Journals (Sweden)

    REYNATO ANDAL GAMBOA

    2013-04-01

    Full Text Available Lecturers are fully occupied with many tasks, including preparing teaching materials, exam papers and lab sheets, marking, research, and the administrative support tasks required of them to maintain high-standard teaching delivery and a good quality management system in the school. Aside from these, they are now required to carry out intensive Outcome-Based Education (OBE) assessments and Continual Quality Improvement (CQI) planning and implementation. An automated OBE assessment tool is therefore required to ease the burden on lecturers and provide a standard method of assessment. To assist in this process, this paper presents a blueprint of a software-enabled quantitative measurement of the Learning Outcomes (LO) and the Programme Outcomes (PO) at the module level. The blueprint consists of macro-enabled worksheets that automatically calculate the students' individual LO and PO attainments based on their respective module assessment marks, whereby the lecturer only needs to key in the subject details of the assessment-LO mapping and the LO-PO mapping, together with the students' assessment marks. Once the marks are in place, LO and PO attainments are calculated automatically to produce the corresponding bar charts based on the individual attainments of the students. An LO or a PO is said to be attained when the number of students achieving it meets the Key Performance Index (KPI) set by the department. The results are then used by the lecturer to prepare an annual module review and a CQI plan for the next semester.

  2. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
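
    The core of the method can be reproduced in a few lines: draw each risk attribute from a discrete uniform distribution, combine the draws into OWASP-style likelihood and impact scores, and inspect the resulting severity distribution. The sketch below assumes eight factors per score, each rated 0-9, and uses fewer rounds than the paper's 1.5 million; these are illustrative choices, not the authors' exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # simulation rounds (the paper used 1.5 million)

# OWASP-style factors scored 0-9 under discrete uniform assumptions
likelihood_factors = rng.integers(0, 10, size=(N, 8))  # threat + vulnerability
impact_factors = rng.integers(0, 10, size=(N, 8))      # technical + business

likelihood = likelihood_factors.mean(axis=1)
impact = impact_factors.mean(axis=1)
risk = likelihood * impact  # severity score for each simulated round

# percentiles of the risk distribution let users pick thresholds
# according to their own risk appetite and tolerance
print(np.percentile(risk, [5, 50, 95]))
```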

  3. A quantitative trait locus mixture model that avoids spurious LOD score peaks.

    Science.gov (United States)

    Feenstra, Bjarke; Skovgaard, Ib M

    2004-06-01

    In standard interval mapping of quantitative trait loci (QTL), the QTL effect is described by a normal mixture model. At any given location in the genome, the evidence of a putative QTL is measured by the likelihood ratio of the mixture model compared to a single normal distribution (the LOD score). This approach can occasionally produce spurious LOD score peaks in regions of low genotype information (e.g., widely spaced markers), especially if the phenotype distribution deviates markedly from a normal distribution. Such peaks are not indicative of a QTL effect; rather, they are caused by the fact that a mixture of normals always produces a better fit than a single normal distribution. In this study, a mixture model for QTL mapping that avoids the problems of such spurious LOD score peaks is presented.
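
    The LOD score at a locus is the base-10 logarithm of the mixture likelihood over the single-normal likelihood. The sketch below illustrates only that likelihood-ratio calculation, with fixed, invented component means and an uninformative genotype probability; real interval mapping would maximize the mixture likelihood at each position.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
y = rng.normal(0.0, 1.0, size=200)   # phenotypes (no QTL effect simulated)
p = np.full_like(y, 0.5)             # P(genotype class 1) at a weakly informed locus

# mixture likelihood: two genotype classes with different means (invented)
mu1, mu2, sigma = 0.3, -0.3, 1.0
mix_ll = np.sum(np.log(p * norm.pdf(y, mu1, sigma) +
                       (1 - p) * norm.pdf(y, mu2, sigma)))

# null model: a single normal distribution fitted to the data
null_ll = np.sum(norm.logpdf(y, y.mean(), y.std()))

lod = (mix_ll - null_ll) / np.log(10)  # LOD = log10 likelihood ratio
print(lod)  # near zero or negative here, since no QTL effect was simulated
```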

  4. A quantitative microbial risk assessment model for Listeria monocytogenes in RTE sandwiches

    DEFF Research Database (Denmark)

    Tirloni, E.; Stella, S.; de Knegt, Leonardo

    2018-01-01

    A Quantitative Microbial Risk Assessment (QMRA) was performed to estimate the expected number of listeriosis cases due to the consumption, on the last day of shelf life, of 20 000 servings of multi-ingredient sandwiches produced by a medium-scale food producer in Italy, by different population groups … within each serving. Then, two dose-response models were alternatively applied: the first used a fixed r value for each of the three population groups, while the second considered a variable r value (lognormal distribution), taking into account the variability in strain virulence and in the susceptibility of different host subpopulations. The stochastic model predicted zero cases in the total population for both substrates using the fixed-r approach, while 3 cases were expected when higher variability (in virulence and susceptibility) was considered in the model; the number of cases increased to 45 …

  5. A likely universal model of fracture scaling and its consequence for crustal hydromechanics

    Science.gov (United States)

    Davy, P.; Le Goc, R.; Darcel, C.; Bour, O.; de Dreuzy, J. R.; Munier, R.

    2010-10-01

    We argue that most fracture systems are spatially organized according to two main regimes: a "dilute" regime for the smallest fractures, where they can grow independently of each other, and a "dense" regime in which the density distribution is controlled by the mechanical interactions between fractures. We derive a density distribution for the dense regime by acknowledging that, statistically, fractures do not cross a larger one. This very crude rule, which expresses the inhibiting role of large fractures against smaller ones but not the reverse, actually appears to be a very strong control on the eventual fracture density distribution, since it results in a self-similar distribution whose exponents and density term are fully determined by the fractal dimension D and a dimensionless parameter γ that encompasses the details of fracture correlations and orientations. The range of values for D and γ appears to be extremely limited, which makes this model quite universal. This theory is supported by quantitative data on both fault and joint networks. The transition between the dilute and dense regimes occurs at about a few tenths of a kilometer for fault systems and a few meters for joints. This remarkable difference between the two processes is likely due to a large-scale control (localization) of fracture growth in faulting that does not exist in jointing. Finally, we discuss the consequences of this model for flow properties and show that these networks are in a critical state, with a large number of nodes carrying a large amount of flow.

  6. NEAMS-Funded University Research in Support of TREAT Modeling and Simulation, FY15

    Energy Technology Data Exchange (ETDEWEB)

    Dehart, Mark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mausolff, Zander [Univ. of Florida, Gainesville, FL (United States); Goluoglu, Sedat [Univ. of Florida, Gainesville, FL (United States); Prince, Zach [Texas A & M Univ., College Station, TX (United States); Ragusa, Jean [Texas A & M Univ., College Station, TX (United States); Haugen, Carl [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Ellis, Matt [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Forget, Benoit [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Smith, Kord [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Alberti, Anthony [Oregon State Univ., Corvallis, OR (United States); Palmer, Todd [Oregon State Univ., Corvallis, OR (United States)

    2015-09-01

    This report summarizes university research activities performed in support of TREAT modeling and simulation research. It is a compilation of annual research reports from four universities: University of Florida, Texas A&M University, Massachusetts Institute of Technology and Oregon State University. The general research topics are, respectively, (1) 3-D time-dependent transport with TDKENO/KENO-VI, (2) implementation of the Improved Quasi-Static method in Rattlesnake/MOOSE for time-dependent radiation transport approximations, (3) improved treatment of neutron physics representations within TREAT using OpenMC, and (4) steady state modeling of the minimum critical core of the Transient Reactor Test Facility (TREAT).

  7. NEAMS-Funded University Research in Support of TREAT Modeling and Simulation, FY15

    International Nuclear Information System (INIS)

    Dehart, Mark; Mausolff, Zander; Goluoglu, Sedat; Prince, Zach; Ragusa, Jean; Haugen, Carl; Ellis, Matt; Forget, Benoit; Smith, Kord; Alberti, Anthony; Palmer, Todd

    2015-01-01

    This report summarizes university research activities performed in support of TREAT modeling and simulation research. It is a compilation of annual research reports from four universities: University of Florida, Texas A&M University, Massachusetts Institute of Technology and Oregon State University. The general research topics are, respectively, (1) 3-D time-dependent transport with TDKENO/KENO-VI, (2) implementation of the Improved Quasi-Static method in Rattlesnake/MOOSE for time-dependent radiation transport approximations, (3) improved treatment of neutron physics representations within TREAT using OpenMC, and (4) steady state modeling of the minimum critical core of the Transient Reactor Test Facility (TREAT).

  8. Quantitative model for the blood pressure‐lowering interaction of valsartan and amlodipine

    Science.gov (United States)

    Heo, Young‐A; Holford, Nick; Kim, Yukyung; Son, Mijeong

    2016-01-01

    Aims: The objective of this study was to develop a population pharmacokinetic (PK) and pharmacodynamic (PD) model to quantitatively describe the antihypertensive effect of combined therapy with amlodipine and valsartan. Methods: PK modelling was used with data collected from 48 healthy volunteers receiving a single dose of a combined formulation of 10 mg amlodipine and 160 mg valsartan. Systolic (SBP) and diastolic blood pressure (DBP) were recorded during combined administration. SBP and DBP data for each drug alone were gathered from the literature. PKPD models of each drug and for combined administration were built with NONMEM 7.3. Results: A two-compartment model with zero-order absorption best described the PK data of both drugs. Amlodipine and valsartan monotherapy effects on SBP and DBP were best described by an Imax model with an effect compartment delay. Combined therapy was described using a proportional interaction term as follows: (D1 + D2) + ALPHA × (D1 × D2), where D1 and D2 are the predicted drug effects of amlodipine and valsartan monotherapy, respectively, and ALPHA is the interaction term for combined therapy. Quantitative estimates of ALPHA were −0.171 (95% CI: −0.218, −0.143) for SBP and −0.0312 (95% CI: −0.07739, −0.00283) for DBP. These infra-additive interaction terms for both SBP and DBP were consistent with literature results for combined administration of drugs in these classes. Conclusion: PKPD models for SBP and DBP successfully described the time course of the antihypertensive effects of amlodipine and valsartan. An infra-additive interaction between amlodipine and valsartan in combined administration was confirmed and quantified. PMID:27504853
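
    The interaction model itself is easy to state in code. In the sketch below, the ALPHA value for SBP is taken from the abstract, but the Imax-model parameters and concentrations are invented placeholders, and the effect-compartment delay is omitted.

```python
def imax_effect(conc, imax, ic50):
    """Imax (inhibitory Emax) model for a single drug's BP-lowering effect."""
    return imax * conc / (ic50 + conc)

def combined_effect(d1, d2, alpha):
    """Combined effect with the paper's proportional interaction term:
    (D1 + D2) + ALPHA * (D1 * D2). Negative alpha -> infra-additive."""
    return (d1 + d2) + alpha * (d1 * d2)

# illustrative, unfitted parameters; ALPHA for SBP from the abstract: -0.171
d_amlodipine = imax_effect(conc=10.0, imax=20.0, ic50=5.0)
d_valsartan = imax_effect(conc=160.0, imax=15.0, ic50=80.0)
print(combined_effect(d_amlodipine, d_valsartan, alpha=-0.171))
```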

  9. Quantitative Hydraulic Models Of Early Land Plants Provide Insight Into Middle Paleozoic Terrestrial Paleoenvironmental Conditions

    Science.gov (United States)

    Wilson, J. P.; Fischer, W. W.

    2010-12-01

    Fossil plants provide useful proxies of Earth's climate because plants are closely connected, through physiology and morphology, to the environments in which they lived. Recent advances in quantitative hydraulic models of plant water transport provide new insight into the history of climate by allowing fossils to speak directly to environmental conditions based on preserved internal anatomy. We report results of a quantitative hydraulic model applied to one of the earliest terrestrial plants preserved in three dimensions, the ~396 million-year-old vascular plant Asteroxylon mackei. This model combines equations describing the rate of fluid flow through plant tissues with detailed observations of plant anatomy; this allows quantitative estimates of two critical aspects of plant function. First and foremost, results from these models quantify the supply of water to evaporative surfaces; second, results describe the ability of plant vascular systems to resist tensile damage from extreme environmental events, such as drought or frost. This approach permits quantitative comparisons of functional aspects of Asteroxylon with other extinct and extant plants, informs the quality of plant-based environmental proxies, and provides concrete data that can be input into climate models. Results indicate that despite their small size, water transport cells in Asteroxylon could supply a large volume of water to the plant's leaves, even greater than cells from some later-evolved seed plants. The smallest Asteroxylon tracheids have conductivities exceeding 0.015 m²/(MPa·s), whereas Paleozoic conifer tracheids do not reach this threshold until they are three times wider. However, this increase in conductivity came at the cost of little to no adaptations for transport safety, placing the plant's vegetative organs in jeopardy during drought events. Analysis of the thickness-to-span ratio of Asteroxylon's tracheids suggests that environmental conditions of reduced relative

  10. Inservice trainings for Shiraz University of Medical Sciences employees: effectiveness assessment by using the CIPP model

    Directory of Open Access Journals (Sweden)

    MARYAM MOKHTARZADEGAN

    2015-04-01

    Full Text Available Introduction: Nowadays, employees' inservice training has become one of the core components in the survival and success of any organization. Unfortunately, despite the importance of training evaluation, a small portion of resources is allocated to this matter. Among many evaluation models, the CIPP (Context, Input, Process, Product) model is a very useful approach to educational evaluation. So far, the evaluation of training courses has mostly provided information for learners, but this investigation aims at evaluating the effectiveness of the experts' training programs and identifying their pros and cons based on the 4 stages of the CIPP model. Method: In this descriptive analytical study, done in 2013, 250 employees of Shiraz University of Medical Sciences (SUMS) who participated in inservice training courses were randomly selected. The evaluated variables were designed using the CIPP model, and a researcher-made questionnaire was used for data collection; the questionnaire was validated using expert opinion and its reliability was confirmed by Cronbach's alpha (0.89). Quantitative data were analyzed using SPSS 14, and statistical tests were done as needed. Results: In the context phase, the mean score was highest in solving work problems (4.07±0.88) and lowest in focusing on learners' learning styles in training courses (2.68±0.91). There was a statistically significant difference between the employees' education level and the product phase evaluation (p<0.001); the process and product phases also showed a significant difference (p<0.001). Conclusion: Considering our results, although the inservice training given to SUMS employees has been effective in many ways, it has some weaknesses as well. Therefore, improving these weaknesses and reinforcing the strong points within the fields identified in this study should be taken into account by decision makers and administrators.

  11. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    Science.gov (United States)

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  12. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H.; Broersen, Alexander

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities. Methods …

  13. Development of quantitative atomic modeling for tungsten transport study using LHD plasma with tungsten pellet injection

    Science.gov (United States)

    Murakami, I.; Sakaue, H. A.; Suzuki, C.; Kato, D.; Goto, M.; Tamura, N.; Sudo, S.; Morita, S.

    2015-09-01

    Quantitative tungsten study with reliable atomic modeling is important for the successful achievement of ITER and fusion reactors. We have developed tungsten atomic modeling for understanding tungsten behavior in fusion plasmas. The modeling is applied to the analysis of tungsten spectra observed from plasmas of the large helical device (LHD) with tungsten pellet injection. We found that extreme ultraviolet (EUV) emission of W24+ to W33+ ions at 1.5-3.5 nm is sensitive to electron temperature and useful for examining tungsten behavior in edge plasmas. We can reproduce measured EUV spectra at 1.5-3.5 nm with spectra calculated from the tungsten atomic model, and obtain charge state distributions of tungsten ions in LHD plasmas at different temperatures around 1 keV. Our model is applied to calculate the unresolved transition array (UTA) seen in tungsten spectra at 4.5-7 nm. We analyze the effect of configuration interaction on population kinetics related to the UTA structure in detail and find the importance of two-electron-one-photon transitions between 4p^5 4d^(n+1) and 4p^6 4d^(n-1) 4f. The radiation power rate of tungsten due to line emissions is also estimated with the model and is consistent with other models within a factor of 2.

  14. Quantitative computational models of molecular self-assembly in systems biology.

    Science.gov (United States)

    Thomas, Marcus; Schwartz, Russell

    2017-05-23

    Molecular self-assembly is the dominant form of chemical reaction in living systems, yet efforts at systems biology modeling are only beginning to appreciate the need for and challenges to accurate quantitative modeling of self-assembly. Self-assembly reactions are essential to nearly every important process in cell and molecular biology and handling them is thus a necessary step in building comprehensive models of complex cellular systems. They present exceptional challenges, however, to standard methods for simulating complex systems. While the general systems biology world is just beginning to deal with these challenges, there is an extensive literature dealing with them for more specialized self-assembly modeling. This review will examine the challenges of self-assembly modeling, nascent efforts to deal with these challenges in the systems modeling community, and some of the solutions offered in prior work on self-assembly specifically. The review concludes with some consideration of the likely role of self-assembly in the future of complex biological system models more generally.

  15. Quantitative Modeling of Acid Wormholing in Carbonates- What Are the Gaps to Bridge

    KAUST Repository

    Qiu, Xiangdong

    2013-01-01

    Carbonate matrix acidization extends a well's effective drainage radius by dissolving rock and forming conductive channels (wormholes) from the wellbore. Wormholing is a dynamic process that involves a balance between the acid injection rate and the reaction rate. Generally, the injection rate is well defined where injection profiles can be controlled, whereas the reaction rate can be difficult to obtain due to its complex dependency on interstitial velocity, fluid composition, rock surface properties, etc. Conventional wormhole propagation models largely ignore the impact of reaction products; when implemented in a job design, this can result in significant errors in treatment fluid schedule, rate, and volume. A more accurate method to simulate carbonate matrix acid treatments would accommodate the effect of reaction products on reaction kinetics. It is the purpose of this work to properly account for these effects. This is an important step in achieving quantitative predictability of wormhole penetration during an acidizing treatment. This paper describes the laboratory procedures taken to obtain the reaction-product-impacted kinetics at downhole conditions using a rotating disk apparatus, and how this new set of kinetics data was implemented in a 3D wormholing model to predict wormhole morphology and penetration velocity. The model explains some of the differences in wormhole morphology observed in limestone core flow experiments where injection pressure impacts the mass transfer of hydrogen ions to the rock surface. The model uses a CT-scan-rendered porosity field to capture the finer details of the rock fabric and then simulates the fluid flow through the rock coupled with reactions. Such a validated model can serve as a base to scale up to the near-wellbore reservoir and 3D radial flow geometry, allowing a more quantitative acid treatment design.

  16. Qualitative and quantitative examination of the performance of regional air quality models representing different modeling approaches

    International Nuclear Information System (INIS)

    Bhumralkar, C.M.; Ludwig, F.L.; Shannon, J.D.; McNaughton, D.

    1985-04-01

    The calculations of three different air quality models were compared with the best available observations. The comparisons were made without calibrating the models to improve agreement with the observations. Model performance was poor for short averaging times (less than 24 hours). Some of the poor performance can be traced to errors in the input meteorological fields, but errors exist at all levels. It should be noted that these models were not originally designed for treating short-term episodes. For short-term episodes, much of the variance in the data can arise from small spatial scale features that tend to be averaged out over longer periods. These small spatial scale features cannot be resolved with the coarse grids that are used for the meteorological and emissions inputs. Thus, it is not surprising that the models performed better for the longer averaging times. The models compared were RTM-II, ENAMAP-2 and ACID. (17 refs., 5 figs., 4 tabs.)

  17. Reservoir architecture modeling: Nonstationary models for quantitative geological characterization. Final report, April 30, 1998

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, D.; Epili, D.; Kelkar, M.; Redner, R.; Reynolds, A.

    1998-12-01

    The study comprised four investigations: facies architecture; seismic modeling and interpretation; Markov random field and Boolean models for geologic modeling of facies distribution; and estimation of geological architecture using the Bayesian/maximum entropy approach. This report discusses results from all four investigations. Investigations were performed using data from the E and F units of the Middle Frio Formation, Stratton Field, one of the major reservoir intervals in the Gulf Coast Basin.

  18. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    Science.gov (United States)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, communicating this information quantitatively is less easily accomplished. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent-angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.
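
    The tangent-angle (Z-R) shape function used here can be sketched in a few lines: accumulate the tangent angle along the closed outline and subtract the linear trend contributed by a circle. The implementation below is a minimal illustration, not the authors' pipeline; starting-point normalization and the downstream multivariate statistics are omitted.

```python
import numpy as np

def zr_shape_function(outline):
    """Zahn-Roskies tangent-angle function of a closed outline: cumulative
    tangent angle minus the linear trend of a circle. Minimal sketch only."""
    closed = np.vstack([outline, outline[:1]])  # close the loop
    d = np.diff(closed, axis=0)                 # edge vectors around the loop
    phi = np.unwrap(np.arctan2(d[:, 1], d[:, 0]))
    phi -= phi[0]
    t = np.linspace(0, 2 * np.pi, len(phi), endpoint=False)
    return phi - t                              # zero everywhere for a circle

# sanity check: a sampled unit circle deviates from a circle by ~0
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(np.allclose(zr_shape_function(circle), 0.0, atol=1e-8))
```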

  19. Quantitative evaluation and modeling of two-dimensional neovascular network complexity: the surface fractal dimension

    International Nuclear Information System (INIS)

    Grizzi, Fabio; Russo, Carlo; Colombo, Piergiuseppe; Franceschini, Barbara; Frezza, Eldo E; Cobos, Everardo; Chiriva-Internati, Maurizio

    2005-01-01

    Modeling the complex development and growth of tumor angiogenesis using mathematics and biological data is a burgeoning area of cancer research. Architectural complexity is the main feature of every anatomical system, including organs, tissues, cells and sub-cellular entities. The vascular system is a complex network whose geometrical characteristics cannot be properly defined using the principles of Euclidean geometry, which is only capable of interpreting regular and smooth objects that are almost impossible to find in Nature. However, fractal geometry is a more powerful means of quantifying the spatial complexity of real objects. This paper introduces the surface fractal dimension (Ds) as a numerical index of the two-dimensional (2-D) geometrical complexity of tumor vascular networks, and their behavior during computer-simulated changes in vessel density and distribution. We show that Ds significantly depends on the number of vessels and their pattern of distribution. This demonstrates that the quantitative evaluation of the 2-D geometrical complexity of tumor vascular systems can be useful not only to measure its complex architecture, but also to model its development and growth. Studying the fractal properties of neovascularity induces reflection upon the real significance of the complex form of branched anatomical structures, in an attempt to define more appropriate methods of describing them quantitatively. This knowledge can be used to predict the aggressiveness of malignant tumors and design compounds that can halt the process of angiogenesis and influence tumor growth.
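
    A standard way to estimate such a dimension from a binary vessel image is box counting: cover the image with boxes of decreasing size and regress the log box count against the log inverse box size. The sketch below illustrates the idea on a trivial filled square; it is a generic box-counting routine, not the authors' calibrated Ds procedure.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal (box-counting) dimension of a 2-D binary image."""
    counts = []
    for s in sizes:
        # count boxes of side s that contain at least one foreground pixel
        h, w = mask.shape
        boxed = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(boxed.any(axis=(1, 3)).sum())
    # slope of log(count) vs log(1/size) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# a filled square should give a dimension near 2
img = np.ones((64, 64), dtype=bool)
print(box_counting_dimension(img))
```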

  20. Preclinical Magnetic Resonance Fingerprinting (MRF) at 7 T: Effective Quantitative Imaging for Rodent Disease Models

    Science.gov (United States)

    Gao, Ying; Chen, Yong; Ma, Dan; Jiang, Yun; Herrmann, Kelsey A.; Vincent, Jason A.; Dell, Katherine M.; Drumm, Mitchell L.; Brady-Kalnay, Susann M.; Griswold, Mark A.; Flask, Chris A.; Lu, Lan

    2015-01-01

    High field, preclinical magnetic resonance imaging (MRI) scanners are now commonly used to quantitatively assess disease status and efficacy of novel therapies in a wide variety of rodent models. Unfortunately, conventional MRI methods are highly susceptible to respiratory and cardiac motion artifacts resulting in potentially inaccurate and misleading data. We have developed an initial preclinical, 7.0 T MRI implementation of the highly novel Magnetic Resonance Fingerprinting (MRF) methodology that has been previously described for clinical imaging applications. The MRF technology combines a priori variation in the MRI acquisition parameters with dictionary-based matching of acquired signal evolution profiles to simultaneously generate quantitative maps of T1 and T2 relaxation times and proton density. This preclinical MRF acquisition was constructed from a Fast Imaging with Steady-state Free Precession (FISP) MRI pulse sequence to acquire 600 MRF images with both evolving T1 and T2 weighting in approximately 30 minutes. This initial high field preclinical MRF investigation demonstrated reproducible and differentiated estimates of in vitro phantoms with different relaxation times. In vivo preclinical MRF results in mouse kidneys and brain tumor models demonstrated an inherent resistance to respiratory motion artifacts as well as sensitivity to known pathology. These results suggest that MRF methodology may offer the opportunity for quantification of numerous MRI parameters for a wide variety of preclinical imaging applications. PMID:25639694
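
    The dictionary-matching step at the heart of MRF reduces to a maximum inner product search over precomputed signal evolutions. In the sketch below, random vectors stand in for the Bloch-simulated FISP dictionary, so every entry and parameter pair is an invented placeholder; only the matching logic reflects the described method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy dictionary: one signal evolution per (T1, T2) pair. A real MRF
# dictionary would come from Bloch/EPG simulation of the FISP sequence.
n_entries, n_frames = 500, 600
dictionary = rng.normal(size=(n_entries, n_frames))
t1_t2_grid = rng.uniform(0.1, 3.0, size=(n_entries, 2))  # paired T1, T2 (s)

# normalize entries so matching reduces to a maximum inner product
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

def mrf_match(signal):
    """Return (T1, T2) of the dictionary entry best matching a voxel signal."""
    scores = np.abs(dictionary @ signal)  # magnitude inner products
    return t1_t2_grid[np.argmax(scores)]

# a scaled, noisy copy of entry 123 should match back to entry 123
voxel = dictionary[123] * 5.0 + rng.normal(scale=0.01, size=n_frames)
print(mrf_match(voxel), t1_t2_grid[123])
```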

  1. A Cytomorphic Chip for Quantitative Modeling of Fundamental Bio-Molecular Circuits.

    Science.gov (United States)

    2015-08-01

    We describe a 0.35 μm BiCMOS silicon chip that quantitatively models fundamental molecular circuits via efficient log-domain cytomorphic transistor equivalents. These circuits include those for biochemical binding with automatic representation of non-modular and loading behavior, e.g., in cascade and fan-out topologies; for representing variable Hill-coefficient operation and cooperative binding; for representing inducer, transcription-factor, and DNA binding; for probabilistic gene transcription with analogic representations of log-linear and saturating operation; for gain, degradation, and dynamics of mRNA and protein variables in transcription and translation; and, for faithfully representing biological noise via tunable stochastic transistor circuits. The use of on-chip DACs and ADCs enables multiple chips to interact via incoming and outgoing molecular digital data packets and thus create scalable biochemical reaction networks. The use of off-chip digital processors and on-chip digital memory enables programmable connectivity and parameter storage. We show that published static and dynamic MATLAB models of synthetic biological circuits including repressilators, feed-forward loops, and feedback oscillators are in excellent quantitative agreement with those from transistor circuits on the chip. Computationally intensive stochastic Gillespie simulations of molecular production are also rapidly reproduced by the chip and can be reliably tuned over the range of signal-to-noise ratios observed in biological cells.

  2. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with the building area and number of floors.
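
    Conceptually, the risk computation chains together a run-out probability per intensity class, a vulnerability curve, and the value of the element at risk. The sketch below is a toy version of that chain; the depth classes, annual probabilities, vulnerability curve, and building value are invented placeholders rather than values from the study.

```python
# Toy risk chain: P(intensity class) x vulnerability(intensity) x value.
# All numbers are invented placeholders, not values from the study.

depth_class_prob = {0.5: 0.02, 1.0: 0.008, 2.0: 0.002}  # depth (m) -> annual P

def vulnerability(depth_m):
    """Toy vulnerability curve: damage fraction grows with flow depth."""
    return min(1.0, 0.25 * depth_m)

building_value = 350_000  # EUR, e.g. area x floors x land-use value (EUR/m2)

annual_risk = sum(p * vulnerability(d) * building_value
                  for d, p in depth_class_prob.items())
print(f"expected annual loss: {annual_risk:.0f} EUR")
```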

  3. The relationship of document and quantitative literacy with learning styles and selected personal variables for aerospace technology students at Indiana State University

    Science.gov (United States)

    Martin, Royce Ann

    The purpose of this study was to determine the extent to which student scores on a researcher-constructed quantitative and document literacy test, the Aviation Documents Delineator (ADD), were associated with (a) learning styles (imaginative, analytic, common sense, dynamic, and undetermined), as identified by the Learning Type Measure, (b) program curriculum (aerospace administration, professional pilot, both aerospace administration and professional pilot, other, or undeclared), (c) overall cumulative grade point average at Indiana State University, and (d) year in school (freshman, sophomore, junior, or senior). The Aviation Documents Delineator (ADD) was a three-part, 35-question survey that required students to interpret graphs, tables, and maps. Tasks assessed in the ADD included (a) locating, interpreting, and describing specific data displayed in the document, (b) determining data for a specified point on the table through interpolation, (c) comparing data for a string of variables representing one aspect of aircraft performance to another string of variables representing a different aspect of aircraft performance, (d) interpreting the documents to make decisions regarding emergency situations, and (e) performing single and/or sequential mathematical operations on a specified set of data. The Learning Type Measure (LTM) was a 15-item self-report survey developed by Bernice McCarthy (1995) to profile an individual's processing and perception tendencies in order to reveal different individual approaches to learning. The sample used in this study included 143 students enrolled in Aerospace Technology Department courses at Indiana State University in the fall of 1996. The ADD and the LTM were administered to each subject. Data collected in this investigation were analyzed using a stepwise multiple regression analysis technique. Results of the study revealed that the variables, year in school and GPA, were significant predictors of the criterion variables, document

  4. Space-Time Uncertainty and Cosmology: a Proposed Quantum Model of the Universe

    Directory of Open Access Journals (Sweden)

    Tosto S.

    2013-10-01

    Full Text Available The paper introduces a cosmological model of the quantum universe. The aim of the model is (i) to identify the possible mechanism that governs the matter/antimatter ratio existing in the universe and concurrently to propose (ii) a reasonable growth mechanism of the universe and (iii) a possible explanation of the dark energy. The concept of space-time uncertainty, on which the present quantum approach is based, has been proven able to bridge quantum mechanics and relativity.

  5. Quantitative Circulatory Physiology: an integrative mathematical model of human physiology for medical education.

    Science.gov (United States)

    Abram, Sean R; Hodnett, Benjamin L; Summers, Richard L; Coleman, Thomas G; Hester, Robert L

    2007-06-01

    We have developed Quantitative Circulatory Physiology (QCP), a mathematical model of integrative human physiology containing over 4,000 variables of biological interactions. This model provides a teaching environment that mimics clinical problems encountered in the practice of medicine. The model structure is based on documented physiological responses within peer-reviewed literature and serves as a dynamic compendium of physiological knowledge. The model is solved using a desktop, Windows-based program, allowing students to calculate time-dependent solutions and interactively alter over 750 parameters that modify physiological function. The model can be used to understand proposed mechanisms of physiological function and the interactions among physiological variables that may not be otherwise intuitively evident. In addition to open-ended or unstructured simulations, we have developed 30 physiological simulations, including heart failure, anemia, diabetes, and hemorrhage. Additional simulations include 29 patients in which students are challenged to diagnose the pathophysiology based on their understanding of integrative physiology. In summary, QCP allows students to examine, integrate, and understand a host of physiological factors without causing harm to patients. This model is available as a free download for Windows computers at http://physiology.umc.edu/themodelingworkshop.

  6. Plutonium chemistry: a synthesis of experimental data and a quantitative model for plutonium oxide solubility

    International Nuclear Information System (INIS)

    Haschke, J.M.; Oversby, V.M.

    2002-01-01

    The chemistry of plutonium is important for assessing the potential behavior of radioactive waste under conditions of geologic disposal. This paper reviews experimental data on dissolution of plutonium oxide solids, describes a hybrid kinetic-equilibrium model for predicting steady-state Pu concentrations, and compares laboratory results with predicted Pu concentrations and oxidation-state distributions. The model is based on oxidation of PuO2 by water to produce PuO2+x, an oxide that can release Pu(V) to solution. Kinetic relationships between formation of PuO2+x, dissolution of Pu(V), disproportionation of Pu(V) to Pu(IV) and Pu(VI), and reduction of Pu(VI) are given and used in model calculations. Data from tests of pyrochemical salt wastes in brines are discussed and interpreted using the conceptual model. Essential data for quantitative modeling at conditions relevant to nuclear waste repositories are identified, and laboratory experiments to determine rate constants for use in the model are discussed.

  7. A pulsatile flow model for in vitro quantitative evaluation of prosthetic valve regurgitation

    Directory of Open Access Journals (Sweden)

    S. Giuliatti

    2000-03-01

    Full Text Available A pulsatile pressure-flow model was developed for in vitro quantitative color Doppler flow mapping studies of valvular regurgitation. The flow through the system was generated by a piston which was driven by stepper motors controlled by a computer. The piston was connected to acrylic chambers designed to simulate "ventricular" and "atrial" heart chambers. Inside the "ventricular" chamber, a prosthetic heart valve was placed at the inflow connection with the "atrial" chamber while another prosthetic valve was positioned at the outflow connection with flexible tubes, elastic balloons and a reservoir arranged to mimic the peripheral circulation. The flow model was filled with a 0.25% corn starch/water suspension to improve Doppler imaging. A continuous flow pump transferred the liquid from the peripheral reservoir to another one connected to the "atrial" chamber. The dimensions of the flow model were designed to permit adequate imaging by Doppler echocardiography. Acoustic windows allowed placement of transducers distal and perpendicular to the valves, so that the ultrasound beam could be positioned parallel to the valvular flow. Strain-gauge and electromagnetic transducers were used for measurements of pressure and flow in different segments of the system. The flow model was also designed to fit different sizes and types of prosthetic valves. This pulsatile flow model was able to generate pressure and flow in the physiological human range, with independent adjustment of pulse duration and rate as well as of stroke volume. This model mimics flow profiles observed in patients with regurgitant prosthetic valves.

  8. Quantitative models for predicting adsorption of oxytetracycline, ciprofloxacin and sulfamerazine to swine manures with contrasting properties.

    Science.gov (United States)

    Cheng, Dengmiao; Feng, Yao; Liu, Yuanwang; Li, Jinpeng; Xue, Jianming; Li, Zhaojun

    2018-09-01

    Understanding antibiotic adsorption in livestock manures is crucial to assessing the fate and risk of antibiotics in the environment. In this study, three quantitative models were developed from swine manure-water distribution coefficients (lgKd) for oxytetracycline (OTC), ciprofloxacin (CIP) and sulfamerazine (SM1) in swine manures. Physicochemical parameters (n=12) of the swine manure were used as independent variables in partial least-squares (PLS) analysis. The cumulative cross-validated regression coefficient (Q²cum) values, standard deviations (SDs) and external validation coefficients (Q²ext) ranged from 0.761 to 0.868, 0.027 to 0.064, and 0.743 to 0.827 for the three models; as such, the internal and external predictability of the models were strong. pH, soluble organic carbon (SOC) and nitrogen (SON), and Ca were important explanatory variables for the OTC model; pH, SOC, and SON for the CIP model; and pH, total organic nitrogen (TON), and SOC for the SM1 model. The high VIPs (variable importance in the projections) of pH (1.178-1.396), SOC (0.968-1.034), and SON (0.822 and 0.865) indicate that these physicochemical parameters are likely dominant in affecting the transport of antibiotics in swine manures. Copyright © 2018 Elsevier B.V. All rights reserved.
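
    The modeling step is a partial least-squares regression of lgKd on the manure parameters. The sketch below shows the shape of such a fit on synthetic stand-in data; the sample size, coefficients, and noise level are assumptions, and scikit-learn (rather than the authors' software) is used for illustration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# 40 manure samples x 12 physicochemical parameters (pH, SOC, SON, Ca, ...);
# synthetic stand-in data -- the paper's measurements are not reproduced here.
X = rng.normal(size=(40, 12))
log_kd = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=40)

pls = PLSRegression(n_components=3)
pls.fit(X, log_kd)
print(pls.score(X, log_kd))  # R^2 of the fitted lgKd model

# VIP-style importances can be derived from pls.x_weights_ and per-component
# explained variance (not shown; scikit-learn has no built-in VIP).
```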

  9. Modelling and Quantitative Analysis of LTRACK–A Novel Mobility Management Algorithm

    Directory of Open Access Journals (Sweden)

    Benedek Kovács

    2006-01-01

    Full Text Available This paper discusses the improvements and parameter optimization issues of LTRACK, a recently proposed mobility management algorithm. Mathematical modelling of the algorithm and of the behavior of the Mobile Node (MN) is used to optimize the parameters of LTRACK. A numerical method is given to determine the optimal values of the parameters. Markov chains are used to model both the base algorithm and the so-called loop removal effect. An extended qualitative and quantitative analysis is carried out to compare LTRACK to existing handover mechanisms such as MIP, Hierarchical Mobile IP (HMIP), Dynamic Hierarchical Mobility Management Strategy (DHMIP), Telecommunication Enhanced Mobile IP (TeleMIP), Cellular IP (CIP) and HAWAII. LTRACK is sensitive to network topology and MN behavior, so MN movement modelling is also introduced and discussed with different topologies. The techniques presented here can be used to model not only the LTRACK algorithm but other algorithms too. Extensive discussion and calculations support the adequacy of the mathematical model in many cases. The model is valid on various network levels, scales vertically in the ISO-OSI layers and also scales well with the number of network elements.
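
    The flavor of the Markov-chain modelling can be conveyed with a tiny example: compute a stationary distribution over MN states and weight per-event signaling costs by it. The two-state chain, rates, and costs below are invented placeholders and are far simpler than the LTRACK model itself.

```python
import numpy as np

# Two-state Markov sketch of the tracking trade-off: an MN alternates between
# "idle" and "moving"; moves trigger tracking updates, incoming calls trigger
# lookups along the tracking chain. All probabilities and costs are invented.
P = np.array([[0.9, 0.1],   # idle   -> idle, moving
              [0.6, 0.4]])  # moving -> idle, moving

# stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

cost_update, cost_lookup = 2.0, 5.0  # per-event signaling costs (assumed)
p_call = 0.05                        # per-step incoming-call probability
expected_cost = pi[1] * cost_update + p_call * cost_lookup
print(pi, expected_cost)
```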

  10. [Quantitative models between canopy hyperspectrum and its component features at apple tree prosperous fruit stage].

    Science.gov (United States)

    Wang, Ling; Zhao, Geng-xing; Zhu, Xi-cun; Lei, Tong; Dong, Fang

    2010-10-01

    Hyperspectral technique has become the basis of quantitative remote sensing. The hyperspectrum of the apple tree canopy at the prosperous fruit stage carries the complex information of fruits, leaves, stocks, soil and reflecting films, and is mostly affected by the component features of the canopy at this stage. First, the hyperspectra of 18 sample apple trees with reflecting films were compared with those of 44 trees without reflecting films. The impact of reflecting films on reflectance was obvious, so the sample trees with ground reflecting films were analyzed separately from those without. Secondly, nine indexes of canopy components were built based on classified digital photos of the 44 apple trees without ground films. Thirdly, the correlation between the nine indexes and canopy reflectance, including several kinds of converted data, was analyzed. The results showed that the correlation between reflectance and the ratio of fruit to leaf was the best, with a maximum coefficient of 0.815, and the correlation between reflectance and the ratio of leaf was slightly better than that between reflectance and the density of fruit. Models based on correlation analysis, linear regression, BP neural network and support vector regression were then used to describe the quantitative relationship between hyperspectral reflectance and the ratio of fruit to leaf, with the software packages DPS and LIBSVM. All four models in the 611-680 nm characteristic band were feasible for prediction, while the accuracy of the BP neural network and support vector regression models was better than that of one-variable and multi-variable linear regression, and the accuracy of the support vector regression model was the best. This study will serve as a reliable theoretical reference for the yield estimation of apples based on remote sensing data.

  11. Use of a plant level logic model for quantitative assessment of systems interactions

    International Nuclear Information System (INIS)

    Chu, B.B.; Rees, D.C.; Kripps, L.P.; Hunt, R.N.; Bradley, M.

    1985-01-01

    The Electric Power Research Institute (EPRI) has sponsored a research program to investigate methods for identifying systems interactions (SIs) and for evaluating their importance. Phase 1 of the EPRI research project focused on the evaluation of methods for identification of SIs. Major results of the Phase 1 activities are the documentation of four different methodologies for identification of potential SIs and the development of guidelines for performing an effective plant walkdown in support of an SI analysis. Phase II of the project, currently being performed, is utilizing a plant level logic model of a pressurized water reactor (PWR) to determine the quantitative importance of identified SIs. In Phase II, previously reported events involving interactions between systems were screened and selected on the basis of their relevance to the Baltimore Gas and Electric (BG&E) Calvert Cliffs Nuclear Power Plant design and their perceived potential safety significance. Selected events were then incorporated into the BG&E plant level GO logic model. The model is being exercised to calculate the relative importance of these events. Five previously identified event scenarios, extracted from licensee event reports (LERs), are being evaluated during the course of the study. A key feature of the approach being used in Phase II is the use of a logic model in a manner that effectively evaluates the impact of events at the system level and the plant level for the mitigation of transients. Preliminary study results indicate that the developed methodology can be a viable and effective means for determining the quantitative significance of SIs.

  12. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and to combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits, adjusting for covariates, for a unified analysis. Three types of approximate F-distribution tests based on the Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and the optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models, which in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more associations than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
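
    The three multivariate tests named above are available off the shelf. The sketch below runs them on synthetic data with statsmodels' MANOVA, whose output includes Pillai's trace, Hotelling-Lawley trace, and Wilks's lambda with approximate F statistics; the data-generating model is an invented stand-in for the genetic application.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(4)
n = 300

# genotype dosage at one variant and two correlated quantitative traits;
# synthetic data standing in for the lipid/biochemical traits in the paper
g = rng.integers(0, 3, size=n).astype(float)
y1 = 0.2 * g + rng.normal(size=n)
y2 = 0.15 * g + 0.5 * y1 + rng.normal(size=n)
df = pd.DataFrame({"g": g, "y1": y1, "y2": y2})

# joint test of both traits against the variant; the printed table reports
# Wilks's lambda, Pillai's trace, and Hotelling-Lawley trace with F tests
fit = MANOVA.from_formula("y1 + y2 ~ g", data=df)
print(fit.mv_test())
```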

  13. A quantitative quasispecies theory-based model of virus escape mutation under immune selection.

    Science.gov (United States)

    Woo, Hyung-June; Reifman, Jaques

    2012-08-07

    Viral infections involve a complex interplay of the immune response and escape mutation of the virus quasispecies inside a single host. Although fundamental aspects of such a balance of mutation and selection pressure have been established by the quasispecies theory decades ago, its implications have largely remained qualitative. Here, we present a quantitative approach to model the virus evolution under cytotoxic T-lymphocyte immune response. The virus quasispecies dynamics are explicitly represented by mutations in the combined sequence space of a set of epitopes within the viral genome. We stochastically simulated the growth of a viral population originating from a single wild-type founder virus and its recognition and clearance by the immune response, as well as the expansion of its genetic diversity. Applied to the immune escape of a simian immunodeficiency virus epitope, model predictions were quantitatively comparable to the experimental data. Within the model parameter space, we found two qualitatively different regimes of infectious disease pathogenesis, each representing alternative fates of the immune response: It can clear the infection in finite time or eventually be overwhelmed by viral growth and escape mutation. The latter regime exhibits the characteristic disease progression pattern of human immunodeficiency virus, while the former is bounded by maximum mutation rates that can be suppressed by the immune response. Our results demonstrate that, by explicitly representing epitope mutations and thus providing a genotype-phenotype map, the quasispecies theory can form the basis of a detailed sequence-specific model of real-world viral pathogens evolving under immune selection.
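
    A drastically simplified stochastic simulation can illustrate the two regimes described: clearance in finite time versus escape and unchecked growth. The single-epitope birth-death-mutation sketch below uses invented per-generation rates; it is not the paper's combined-sequence-space model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Single-epitope toy model: wild-type virions are cleared by the CTL
# response, escape mutants are not. All rates are invented placeholders.
growth, clearance, mu = 1.8, 2.0, 0.01  # per-generation rates, mutation prob
wt, mut = 100, 0

for gen in range(100):
    wt_births = rng.poisson(growth * wt)
    escapes = rng.binomial(wt_births, mu)   # escape mutations arising at birth
    wt = max(0, wt + wt_births - escapes - rng.poisson(clearance * wt))
    mut = mut + rng.poisson(growth * mut) + escapes
    if mut > 1_000_000:
        print(f"immune escape by generation {gen}")
        break
    if wt == 0 and mut == 0:
        print(f"infection cleared at generation {gen}")
        break
else:
    print(f"generation 100 reached: wt={wt}, mut={mut}")
```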

  14. Process-Model Feminism in the Corporate University

    Science.gov (United States)

    Spitzer-Hanks, D. T.

    2016-01-01

    In a period characterised by worries over the rise of the corporate university, it is important to ask what role feminism plays in the academy, and whether that role is commensurate with feminist values and ethics. Commercial and political pressures brought to bear on the encounter between instructor and student can rob teaching of its efficacy,…

  15. Modelling Graduate Skill Transfer from University to the Workplace

    Science.gov (United States)

    Jackson, Denise

    2016-01-01

    This study explores skill transfer in graduates as they transition from university to the workplace. Graduate employability continues to dominate higher education agendas yet the transfer of acquired skills is often assumed. The study is prompted by documented concern with graduate performance in certain employability skills, and prevalent skill…

  16. Towards a Theory of University Entrepreneurship: Developing a Theoretical Model

    Science.gov (United States)

    Woollard, David

    2010-01-01

    This paper sets out to develop a robust theory in a largely atheoretical field of study. The increasing importance of entrepreneurship in delivering the "Third Mission" calls for an enhanced understanding of the university entrepreneurship phenomenon, not solely as a subject of academic interest but also to guide the work of practitioners in the…

  17. Modeling, Identification and Control at Telemark University College

    Directory of Open Access Journals (Sweden)

    Bernt Lie

    2009-07-01

    Full Text Available Master studies in process automation started in 1989 at what soon became Telemark University College, and the 20-year anniversary marks the start of our own PhD degree in Process, Energy and Automation Engineering. The paper gives an overview of research activities related to control engineering at the Department of Electrical Engineering, Information Technology and Cybernetics.

  18. Reconstructing an interacting holographic polytropic gas model in a non-flat FRW universe

    International Nuclear Information System (INIS)

    Karami, K; Abdolmaleki, A

    2010-01-01

    We study the correspondence between the interacting holographic dark energy and the polytropic gas model of dark energy in a non-flat FRW universe. This correspondence allows one to reconstruct the potential and the dynamics for the scalar field of the polytropic model, which describe accelerated expansion of the universe.

  19. Reconstructing an interacting holographic polytropic gas model in a non-flat FRW universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K; Abdolmaleki, A, E-mail: KKarami@uok.ac.i [Department of Physics, University of Kurdistan, Pasdaran Street, Sanandaj (Iran, Islamic Republic of)

    2010-05-01

    We study the correspondence between the interacting holographic dark energy and the polytropic gas model of dark energy in a non-flat FRW universe. This correspondence allows one to reconstruct the potential and the dynamics for the scalar field of the polytropic model, which describe accelerated expansion of the universe.

  20. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool is proposed to assist universities and other mission-based organisations to ascertain systematically the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…

  1. Education for sustainability: A new challenge for the current university model

    Directory of Open Access Journals (Sweden)

    Ana Fernández Pérez

    2018-01-01

    Full Text Available Education for Sustainable Development aims to disseminate and promote a set of principles and values within the university model through management, teaching, research and university extension. It does not focus on a specific area but covers many areas, such as equality, peace, health, sustainable urbanization and the environment. The objective of this study is to make an appeal in all these areas so that universities incorporate the dimension of sustainability in their curricula, through teaching, research and university management. To this end, the different international and regional initiatives that have emphasized the need for universities to be committed to the culture of sustainability, and their inclusion in the current university model, have been analyzed. The work concludes with the idea that sustainable development is perhaps one of the key pieces in the conception of the university of the 21st century.

  2. Need for collection of quantitative distribution data for dosimetry and metabolic modeling

    International Nuclear Information System (INIS)

    Lathrop, K.A.

    1976-01-01

    Problems in radiation dose distribution studies in humans are discussed. Data show that the effective half-times for 7Be and 75Se in the mouse, rat, monkey, dog, and human show no correlation with weight, body surface, or any other readily apparent factor that could be used to equate nonhuman and human data. Another problem sometimes encountered in attempting to extrapolate animal data to humans involves equivalent doses of the radiopharmaceutical. A usual human dose for a radiopharmaceutical is 1 ml or 0.017 mg/kg. The same solution injected into a mouse in a convenient volume of 0.1 ml results in a dose of 4 ml/kg, or 240 times that received by the human. The effect on whole-body retention produced by a dose difference of similar magnitude for selenium in the rat shows that the retention is at least twice as great with the smaller amount. With the development of methods for the collection of data throughout the body representing the fractional distribution of radioactivity versus time, not only can more realistic dose estimates be made, but the tools will also be provided for the study of physiological and biochemical interrelationships in the intact subject, from which compartmental models may be made that have diagnostic significance. The unique requirement for quantitative biologic data needed for calculation of radiation absorbed doses is the same as the unique scientific contribution that nuclear medicine can make, which is the quantitative in vivo study of physiologic and biochemical processes. The technique involved is not the same as quantitation of a radionuclide image, but is a step beyond

  3. General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.

    Science.gov (United States)

    de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael

    2016-11-01

    Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each such quantity, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMMs can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
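
    QGglmm implements these integrals in R; purely as an illustrative sketch (not the package's code), the data-scale population mean of a Poisson trait with a log link can be obtained by numerically integrating the inverse link over the latent normal distribution, and checked against the closed form available for this particular link:

        import numpy as np
        from scipy import integrate, stats

        mu, sigma2 = 1.0, 0.3  # hypothetical latent mean and total latent variance

        # Data-scale mean: integrate the inverse link exp(l) over the latent normal.
        f = lambda l: np.exp(l) * stats.norm.pdf(l, loc=mu, scale=np.sqrt(sigma2))
        mean_numeric, _ = integrate.quad(f, mu - 10 * np.sqrt(sigma2), mu + 10 * np.sqrt(sigma2))

        # For the log link this integral has a closed form: exp(mu + sigma2/2).
        print(mean_numeric, np.exp(mu + sigma2 / 2))  # the two should agree closely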

  4. A generalized quantitative antibody homeostasis model: maintenance of global antibody equilibrium by effector functions.

    Science.gov (United States)

    Prechl, József

    2017-11-01

    The homeostasis of antibodies can be characterized as a balanced production, target-binding and receptor-mediated elimination regulated by an interaction network, which controls B-cell development and selection. Recently, we proposed a quantitative model to describe how the concentration and affinity of interacting partners generates a network. Here we argue that this physical, quantitative approach can be extended for the interpretation of effector functions of antibodies. We define global antibody equilibrium as the zone of molar equivalence of free antibody, free antigen and immune complex concentrations and of the dissociation constant of apparent affinity: [Ab]=[Ag]=[AbAg]=KD. This zone corresponds to the biologically relevant KD range of reversible interactions. We show that thermodynamic and kinetic properties of antibody-antigen interactions correlate with immunological functions. The formation of stable, long-lived immune complexes corresponds to a decrease of entropy and is a prerequisite for the generation of higher-order complexes. As the energy of formation of complexes increases, we observe a gradual shift from silent clearance to inflammatory reactions. These rules can also be applied to complement activation-related immune effector processes, linking the physicochemical principles of innate and adaptive humoral responses. The receptors mediating effector functions span a wide range of affinities, allowing the continuous sampling of antibody-bound antigen over the complete range of concentrations. The generation of multivalent, multicomponent complexes triggers effector functions by crosslinking these receptors on effector cells with increasing enzymatic degradation potential. Thus, antibody homeostasis is a thermodynamic system with complex network properties, nested into the host organism by proper immunoregulatory and effector pathways. Maintenance of global antibody equilibrium is achieved by innate qualitative signals modulating a
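
    The equivalence zone can be checked numerically. For 1:1 mass-action binding, the equilibrium complex concentration follows from a quadratic in the totals; when both totals equal 2·KD, free antibody, free antigen and complex all come out equal to KD. The sketch below is a generic binding calculation, not code from the paper, and the 10 nM affinity is hypothetical:

        import math

        def complex_conc(ab_total, ag_total, kd):
            """Equilibrium [AbAg] for 1:1 binding Ab + Ag <-> AbAg (mass action)."""
            b = ab_total + ag_total + kd
            return (b - math.sqrt(b * b - 4.0 * ab_total * ag_total)) / 2.0

        kd = 1e-8  # 10 nM
        c = complex_conc(2 * kd, 2 * kd, kd)
        print(c, 2 * kd - c)  # complex and free Ab both equal KD = 1e-8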

  5. The Arizona Universities Library Consortium patron-driven e-book model

    Directory of Open Access Journals (Sweden)

    Jeanne Richardson

    2013-03-01

    Full Text Available Building on Arizona State University's patron-driven acquisitions (PDA) initiative in 2009, the Arizona Universities Library Consortium, in partnership with the Ingram Content Group, created a cooperative patron-driven model to acquire electronic books (e-books). The model provides the opportunity for faculty and students at the universities governed by the Arizona Board of Regents (ABOR) to access a core of e-books made accessible through resource discovery services and online catalogs. These books are available for significantly less than a single ABOR university would expend for the same materials. The patron-driven model described is one of many evolving models in digital scholarship, and, although the Arizona Universities Library Consortium reports a successful experience, patron-driven models pose questions to stakeholders in the academic publishing industry.

  6. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Science.gov (United States)

    Mitsos, Alexander; Melas, Ioannis N; Morris, Melody K; Saez-Rodriguez, Julio; Lauffenburger, Douglas A; Alexopoulos, Leonidas G

    2012-01-01

    Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to a lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.
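
    The paper's formulation covers whole networks; as a minimal sketch of the core idea, the continuous (normalized-Hill) transfer function of a single signaling edge can be fitted to data with a standard NLP solver. Everything below (data values, parameter bounds) is illustrative, not taken from the paper:

        import numpy as np
        from scipy.optimize import minimize

        x = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 1.0])       # stimulus levels
        y_meas = np.array([0.0, 0.05, 0.2, 0.55, 0.8, 0.9])  # hypothetical activations

        def hill(x, k, n):
            # Normalized Hill transfer function (hill(1) = 1), of the kind used
            # in constrained fuzzy logic pathway models.
            return x**n / (k**n + x**n) * (k**n + 1.0)

        def sse(params):
            k, n = params
            return np.sum((hill(x, k, n) - y_meas) ** 2)

        res = minimize(sse, x0=[0.5, 2.0], bounds=[(1e-3, 1.0), (1.0, 10.0)])
        print(res.x)  # fitted (k, n) for this edge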

  7. Non Linear Programming (NLP) formulation for quantitative modeling of protein signal transduction pathways.

    Directory of Open Access Journals (Sweden)

    Alexander Mitsos

    Full Text Available Modeling of signal transduction pathways plays a major role in understanding cells' function and predicting cellular response. Mathematical formalisms based on a logic formalism are relatively simple but can describe how signals propagate from one protein to the next, and have led to the construction of models that simulate the cell's response to environmental or other perturbations. Constrained fuzzy logic was recently introduced to train models against cell-specific data, resulting in quantitative pathway models of specific cellular behavior. There are two major issues in this pathway optimization: i) excessive CPU time requirements and ii) a loosely constrained optimization problem due to a lack of data with respect to large signaling pathways. Herein, we address both issues: the former by reformulating the pathway optimization as a regular nonlinear optimization problem; and the latter by enhanced algorithms to pre/post-process the signaling network to remove parts that cannot be identified given the experimental conditions. As a case study, we tackle the construction of cell type specific pathways in normal and transformed hepatocytes using medium and large-scale functional phosphoproteomic datasets. The proposed Non Linear Programming (NLP) formulation allows for fast optimization of signaling topologies by combining the versatile nature of logic modeling with state of the art optimization algorithms.

  8. Quantitative model of super-Arrhenian behavior in glass forming materials

    Science.gov (United States)

    Caruthers, J. M.; Medvedev, G. A.

    2018-05-01

    The key feature of glass forming liquids is the super-Arrhenian temperature dependence of the mobility, where the mobility can increase by ten orders of magnitude or more as the temperature is decreased if crystallization does not intervene. A fundamental description of the super-Arrhenian behavior has been developed; specifically, the logarithm of the relaxation time is a linear function of B/Ūx, where Ūx is the independently determined excess molar internal energy and B is a material constant. This one-parameter mobility model quantitatively describes data for 21 glass forming materials, which are all the materials where there are sufficient experimental data for analysis. The effect of pressure on the mobility is also described using the same Ūx(T,p) function determined from the difference between the liquid and crystalline internal energies. It is also shown that B is well correlated with the heat of fusion. The prediction of the B/Ūx model is compared to the Adam and Gibbs 1/TS̄x model, where the B/Ūx model is significantly better in unifying the full complement of mobility data. The implications of the B/Ūx model for the development of a fundamental description of glass are discussed.
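
    Written out, the two one-parameter forms compared in the abstract are (the prefactors A, A′ and the Adam–Gibbs constant C are generic placeholders, not values given in the record):

        \[
        \log \tau(T,p) = A + \frac{B}{\bar{U}_x(T,p)}
        \qquad \text{vs.} \qquad
        \log \tau(T) = A' + \frac{C}{T\,\bar{S}_x(T)},
        \]

    with \(\bar{U}_x\) the excess molar internal energy of the B/Ūx model and \(\bar{S}_x\) the excess entropy of the Adam–Gibbs model.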

  9. Multivariate characterisation and quantitative structure-property relationship modelling of nitroaromatic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Joensson, S. [Man-Technology-Environment Research Centre, Department of Natural Sciences, Orebro University, 701 82 Orebro (Sweden)], E-mail: sofie.jonsson@nat.oru.se; Eriksson, L.A. [Department of Natural Sciences and Orebro Life Science Center, Orebro University, 701 82 Orebro (Sweden); Bavel, B. van [Man-Technology-Environment Research Centre, Department of Natural Sciences, Orebro University, 701 82 Orebro (Sweden)

    2008-07-28

    A multivariate model to characterise nitroaromatics and related compounds based on molecular descriptors was calculated. Descriptors were collected from literature and through empirical, semi-empirical and density functional theory-based calculations. Principal components were used to describe the distribution of the compounds in a multidimensional space. Four components described 76% of the variation in the dataset. PC1 separated the compounds due to molecular weight, PC2 separated the different isomers, PC3 arranged the compounds according to different functional groups such as nitrobenzoic acids, nitrobenzenes, nitrotoluenes and nitroesters, and PC4 differentiated the compounds containing chlorine from other compounds. Quantitative structure-property relationship models were calculated using partial least squares (PLS) projection to latent structures to predict gas chromatographic (GC) retention times and the distribution between the water phase and air using solid-phase microextraction (SPME). GC retention time was found to be dependent on the presence of polar amine groups, electronic descriptors including highest occupied molecular orbital, dipole moments and the melting point. The GC retention time model fitted well, but its precision was not sufficient for practical use. An important environmental parameter, the distribution between headspace (air) and the water phase, was measured using SPME. This parameter was mainly dependent on Henry's law constant, vapour pressure, log P, content of hydroxyl groups and atmospheric OH rate constant. The predictive capacity of the model improved substantially when the model was recalculated using only these five descriptors.
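
    As an illustrative sketch of the PLS step (synthetic data, not the paper's descriptor set), projection to latent structures with two components might look like:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Rows: compounds; columns: descriptors (e.g. HOMO energy, dipole moment,
        # melting point, ...). Both X and y here are randomly generated stand-ins.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(30, 5))
        y = X @ np.array([2.0, -1.0, 0.5, 0.0, 0.0]) + rng.normal(scale=0.1, size=30)

        pls = PLSRegression(n_components=2)  # project onto 2 latent structures
        pls.fit(X, y)
        print(pls.score(X, y))               # R^2 of the fitted QSPR model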

  10. Combining quantitative trait loci analysis with physiological models to predict genotype-specific transpiration rates.

    Science.gov (United States)

    Reuning, Gretchen A; Bauerle, William L; Mullen, Jack L; McKay, John K

    2015-04-01

    Transpiration is controlled by evaporative demand and stomatal conductance (gs), and there can be substantial genetic variation in gs. A key parameter in empirical models of transpiration is minimum stomatal conductance (g0), a trait that can be measured and has a large effect on gs and transpiration. In Arabidopsis thaliana, g0 exhibits both environmental and genetic variation, and quantitative trait loci (QTL) have been mapped. We used this information to create a genetically parameterized empirical model to predict transpiration of genotypes. For the parental lines, this worked well. However, in a recombinant inbred population, the predictions proved less accurate. When based only upon their genotype at a single g0 QTL, genotypes were less distinct than our model predicted. Follow-up experiments indicated that both genotype by environment interaction and a polygenic inheritance complicate the application of genetic effects into physiological models. The use of ecophysiological or 'crop' models for predicting transpiration of novel genetic lines will benefit from incorporating further knowledge of the genetic control and degree of independence of core traits/parameters underlying gs variation. © 2014 John Wiley & Sons Ltd.
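
    The record does not name the empirical model used; the widely used Ball–Berry form illustrates how g0 enters such models as the intercept of stomatal conductance (all numbers below are hypothetical):

        def ball_berry_gs(A, RH, Cs, g0, m):
            """Ball-Berry stomatal conductance: g0 is the intercept (minimum
            conductance), m the slope, A net assimilation, RH relative humidity
            at the leaf surface, Cs CO2 concentration at the leaf surface."""
            return g0 + m * A * RH / Cs

        # Two genotypes differing only in a hypothetical g0 allele:
        for g0 in (0.01, 0.04):
            gs = ball_berry_gs(A=12.0, RH=0.65, Cs=400.0, g0=g0, m=9.0)
            print(g0, round(gs, 3))  # higher g0 shifts gs, hence transpiration, upward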

  11. A Possible Universe in Pulsation by Using a Hydro-Dynamical Model for Gravity

    Directory of Open Access Journals (Sweden)

    Corneliu BERBENTE

    2016-12-01

    Full Text Available By using a hydro-dynamical model for gravity previously given by the author, it is possible to describe a pulsating universe. This is possible because two hydro-dynamical sources are in attraction both when they are emitting and absorbing fluid. In our model, bodies (matter and energy) are interacting via an incompressible fluid made of gravitons (photon-like particles having a wavelength of the order of magnitude of the radius of the universe). One considers the universe uniform at large scale, the effects of general relativity type being local and negligible at global scale. An “elastic sphere” model for the universe is suggested to describe the possible inversion. The expansion of the universe stops when the “elastic energy” overcomes the kinetic one; this takes place near the point of maximal emission speed of the fluid of gravitons. The differential equation for the universe in expansion is adapted to contraction. Analytical solutions are given.

  12. A kinetic-based sigmoidal model for the polymerase chain reaction and its application to high-capacity absolute quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Stewart Don

    2008-05-01

    Full Text Available Abstract Background: Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach. Results: Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. Comparison
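
    The paper derives its own pair of sigmoid functions; purely as an illustrative stand-in, fitting a generic logistic amplification profile to baseline-corrected fluorescence readings looks like this (all data synthetic):

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(c, f_max, c_half, k):
            """Logistic fluorescence-vs-cycle profile (baseline subtracted)."""
            return f_max / (1.0 + np.exp(-(c - c_half) / k))

        cycles = np.arange(1, 41, dtype=float)
        f = sigmoid(cycles, 1000.0, 22.0, 1.6)          # synthetic reaction ...
        f += np.random.default_rng(1).normal(0, 5, 40)  # ... with measurement noise

        (f_max, c_half, k), _ = curve_fit(sigmoid, cycles, f, p0=[f.max(), 20.0, 2.0])
        print(f_max, c_half, k)
        # Converting f_max (fluorescence units) into molecule counts then requires
        # an optical calibration standard, e.g. the lambda gDNA used in the study.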

  13. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Full Text Available Abstract Background: Bayesian Network (BN) is a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results: We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses the Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included cocitation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov Chain Monte Carlo sampling algorithm is adopted, which samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion: Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
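
    The study's exact scoring is not reproduced in the abstract; a generic naive-Bayes combination of two independent evidence sources into an edge prior, with edge copy numbers proportional to it, might look like this (all likelihood ratios and the reservoir size are hypothetical):

        def edge_prior(lr_cocitation, lr_go, prior_odds=0.1):
            """Combine independent evidence sources under the naive-Bayes
            assumption. lr_* = P(evidence | linked) / P(evidence | unlinked)."""
            post_odds = prior_odds * lr_cocitation * lr_go
            return post_odds / (1.0 + post_odds)

        p = edge_prior(lr_cocitation=8.0, lr_go=3.0)
        copies = round(100 * p)  # copy number in a 100-slot candidate-edge reservoir
        print(p, copies)         # MCMC then samples edges in proportion to p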

  14. Hypermedia for language learning: the FREE model at Coventry University

    Directory of Open Access Journals (Sweden)

    Marina Orsini-Jones

    1996-12-01

    Full Text Available The tradition of incorporating CALL into the language-learning curriculum goes back to the early 1980s at Coventry University, and since then has evolved in keeping with changes in the technology available (Corness 1984; Benwell 1986; Orsini-Jones 1987; Corness et al 1992; Orsini-Jones 1993). Coventry University is at present pioneering the integration of hypermedia into the curriculum for the teaching of Italian language and society. The syllabus for a complete module of the BA Modern Languages and BA European Studies Degrees, which will count as 1/8th of the students' programme for year 2, has been designed around in-house produced hypermedia courseware.

  15. A Quantitative, Time-Dependent Model of Oxygen Isotopes in the Solar Nebula: Step one

    Science.gov (United States)

    Nuth, J. A.; Paquette, J. A.; Farquhar, A.; Johnson, N. M.

    2011-01-01

    The remarkable discovery that oxygen isotopes in primitive meteorites were fractionated along a line of slope 1, rather than along the typical slope 0.52 terrestrial fractionation line, occurred almost 40 years ago. However, a satisfactory, quantitative explanation for this observation has yet to be found, though many different explanations have been proposed. The first of these explanations proposed that the observed line represented the final product produced by mixing molecular cloud dust with a nucleosynthetic component, rich in O-16, possibly resulting from a nearby supernova explosion. Donald Clayton suggested that Galactic Chemical Evolution would gradually change the oxygen isotopic composition of the interstellar grain population by steadily producing O-16 in supernovae, then producing the heavier isotopes as secondary products in lower mass stars. Thiemens and collaborators proposed a chemical mechanism that relied on the availability of additional active rotational and vibrational states in otherwise-symmetric molecules, such as CO2, O3 or SiO2, containing two different oxygen isotopes, and a second, photochemical process that suggested that differential photochemical dissociation processes could fractionate oxygen. This second line of research has been pursued by several groups, though none of the current models is quantitative.

  16. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    Science.gov (United States)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of magnitude of exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of magnitude of feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.

  17. Stochastic modeling and mathematical statistics a text for statisticians and quantitative scientists

    CERN Document Server

    Samaniego, Francisco J

    2014-01-01

    ""Stochastic Modeling and Mathematical Statistics is a new and welcome addition to the corpus of undergraduate statistical textbooks in the market. The singular thing that struck me when I initially perused the book was its lucid and endearing conversational tone, which pervades the entire text. It radiated warmth. … In my course at the University of Michigan, I rely primarily on my own lecture notes and have used Rice as supplementary material. Having gone through this text, I am strongly inclined to add this to the supplementary list as well. I have little doubt that this book will be very s

  18. A Tuned Value Chain Model for University Based Public Research Organisation. Case Lut Cst.

    Directory of Open Access Journals (Sweden)

    Vesa Karvonen

    2012-12-01

    Full Text Available The Porter´s value chain model was introduced for strategic business purposes. During the last decades, universities and university-based institutes have also started to use actions similar to private business concepts. A university-based institute is not an independent actor like a company, but there are interest groups who expect it to act as if it were. This article discusses the possibility of utilizing a tuned value chain for public research organizations (PRO). The interactions of the tuned value chain model with an existing industrial network are also discussed. The case study object is the Centre for Separation Technology (CST) at Lappeenranta University of Technology (LUT) in Finland.

  19. Carolina Care at University of North Carolina Health Care: Implementing a Theory-Driven Care Delivery Model Across a Healthcare System.

    Science.gov (United States)

    Tonges, Mary; Ray, Joel D; Herman, Suzanne; McCann, Meghan

    2018-04-01

    Patient satisfaction is a key component of healthcare organizations' performance. Providing a consistent, positive patient experience across a system can be challenging. This article describes an organization's approach to achieving this goal by implementing a successful model developed at the flagship academic healthcare center across an 8-hospital system. The Carolina Care at University of North Carolina Health Care initiative has resulted in substantive qualitative and quantitative benefits including higher patient experience scores for both overall rating and nurse communication.

  20. How predictive quantitative modelling of tissue organisation can inform liver disease pathogenesis.

    Science.gov (United States)

    Drasdo, Dirk; Hoehme, Stefan; Hengstler, Jan G

    2014-10-01

    From the more than 100 liver diseases described, many of those with high incidence rates manifest themselves by histopathological changes, such as hepatitis, alcoholic liver disease, fatty liver disease, fibrosis, and, in its later stages, cirrhosis, hepatocellular carcinoma, primary biliary cirrhosis and other disorders. Studies of disease pathogeneses are largely based on integrating -omics data pooled from cells at different locations with spatial information from stained liver structures in animal models. Even though this has led to significant insights, the complexity of interactions as well as the involvement of processes at many different time and length scales limits the possibility of condensing disease processes into illustrations, schemes and tables. The combination of modern imaging modalities with image processing and analysis, and mathematical models opens up a promising new approach towards a quantitative understanding of pathologies and of disease processes. This strategy is discussed for two examples, ammonia metabolism after drug-induced acute liver damage, and the recovery of liver mass as well as architecture during the subsequent regeneration process. This interdisciplinary approach permits integration of biological mechanisms and models of processes contributing to disease progression at various scales into mathematical models. These can be used to perform in silico simulations to promote unravelling the relation between architecture and function, as illustrated below for liver regeneration, and bridging from the in vitro situation and animal models to humans. In the near future novel mechanisms will usually not be directly elucidated by modelling. However, models will falsify hypotheses and guide towards the most informative experimental design. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  1. A quantitative model for estimating mean annual soil loss in cultivated land using 137Cs measurements

    International Nuclear Information System (INIS)

    Yang Hao; Zhao Qiguo; Du Mingyuan; Minami, Katsuyuki; Hatta, Tamao

    2000-01-01

    The radioisotope 137Cs has been widely used to determine rates of cultivated soil loss. Many calibration relationships (including both empirical relationships and theoretical models) have been employed to estimate erosion rates from the amount of 137Cs lost from the cultivated soil profile. However, there are important limitations which restrict the reliability of these models, which consider only a uniform distribution of 137Cs in the plough layer and the plough depth. As a result, erosion rates may be overestimated or underestimated. This article presents a quantitative model for the relation between the amount of 137Cs lost from the cultivated soil profile and the rate of soil erosion. Based on a mass balance model, during the construction of this model we considered the following parameters: the remaining fraction of the surface enrichment layer (F_R), the thickness of the surface enrichment layer (H_s), the depth of the plough layer (H_p), the input fraction of the total 137Cs fallout deposition during a given year t (F_t), the radioactive decay constant of 137Cs (k), and the sampling year (t). The simulation results showed that the erosion rates estimated using this model were very sensitive to changes in the values of the parameters F_R, H_s, and H_p. We also observed that the relationship between the rate of soil loss and 137Cs depletion is neither linear nor logarithmic, and is very complex. Although the model is an improvement over existing approaches to derive calibration relationships for cultivated soil, it requires empirical information on local soil properties and the behavior of 137Cs in the soil profile. There is clearly still a need for more precise information on the latter aspect and, in particular, on the retention of 137Cs fallout in the top few millimeters of the soil profile and on the enrichment and depletion effects associated with soil redistribution (i.e. for determining accurate values of F_R and H_s). (author)
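
    The authors' equations are not reproduced in the abstract; the following is a deliberately simplified mass-balance bookkeeping of a plough-layer 137Cs inventory, meant only to illustrate the kind of calculation such calibration models perform (parameter names echo the abstract; the update rule and all numbers are assumptions):

        import numpy as np

        LAMBDA = np.log(2) / 30.17  # 137Cs decay constant k (1/yr; 30.17 yr half-life)

        def remaining_inventory(fallout, erosion_mm_yr, H_p=200.0):
            """Inventory A (Bq m-2) after annual fallout inputs F_t (Bq m-2),
            with eroded soil removing a fraction of A diluted over the plough
            depth H_p (mm) each year."""
            A = 0.0
            for F_t in fallout:
                A = (A + F_t) * np.exp(-LAMBDA)   # one year of radioactive decay
                A *= 1.0 - erosion_mm_yr / H_p    # fraction lost with eroded soil
            return A

        fallout = [100.0] * 10 + [0.0] * 27  # crude 10-year deposition pulse
        print(remaining_inventory(fallout, erosion_mm_yr=2.0))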

  2. Predictive models for suicidal thoughts and behaviors among Spanish University students: rationale and methods of the UNIVERSAL (University & mental health) project.

    Science.gov (United States)

    Blasco, Maria Jesús; Castellví, Pere; Almenara, José; Lagares, Carolina; Roca, Miquel; Sesé, Albert; Piqueras, José Antonio; Soto-Sanz, Victoria; Rodríguez-Marín, Jesús; Echeburúa, Enrique; Gabilondo, Andrea; Cebrià, Ana Isabel; Miranda-Mendizábal, Andrea; Vilagut, Gemma; Bruffaerts, Ronny; Auerbach, Randy P; Kessler, Ronald C; Alonso, Jordi

    2016-05-04

    Suicide is a leading cause of death among young people. While suicide prevention is considered a research and intervention priority, longitudinal data are needed to identify risk and protective factors associated with suicidal thoughts and behaviors. Here we describe the UNIVERSAL (University and Mental Health) project, whose aims are to: (1) test prevalence and 36-month incidence of suicidal thoughts and behaviors; and (2) identify relevant risk and protective factors associated with the incidence of suicidal thoughts and behaviors among university students in Spain. An ongoing multicenter, observational, prospective cohort study of first year university students in 5 Spanish universities. Students will be assessed annually during a 36 month follow-up. The surveys will be administered through an online, secure web-based platform. A clinical reappraisal will be completed among a subsample of respondents. Suicidal thoughts and behaviors will be assessed with the Self-Injurious Thoughts and Behaviors Interview (SITBI) and the Columbia-Suicide Severity Rating Scale (C-SSRS). Risk and protective factors will include: mental disorders, measured with the Composite International Diagnostic Interview version 3.0 (CIDI 3.0) and Screening Scales (CIDI-SC), and the Epi-Q Screening Survey (EPI-Q-SS), socio-demographic variables, self-perceived health status, health behaviors, well-being, substance use disorders, service use and treatment. The UNIVERSAL project is part of the International College Surveys initiative, which is a core project within the World Mental Health consortium. Lifetime and 12-month prevalence will be calculated for suicide ideation, plans and attempts. Cumulative incidence of suicidal thoughts and behaviors, and mental disorders will be measured using the actuarial method. Risk and protective factors of suicidal thoughts and behaviors will be analyzed by Cox proportional hazards models. The study will provide valid, innovative and useful data for developing

  3. Diffusion-weighted MRI and quantitative biophysical modeling of hippocampal neurite loss in chronic stress.

    Directory of Open Access Journals (Sweden)

    Peter Vestergaard-Poulsen

    Full Text Available Chronic stress has detrimental effects on physiology, learning and memory and is involved in the development of anxiety and depressive disorders. Besides changes in synaptic formation and neurogenesis, chronic stress also induces dendritic remodeling in the hippocampus, amygdala and the prefrontal cortex. Investigations of dendritic remodeling during development and treatment of stress are currently limited by the invasive nature of histological and stereological methods. Here we show that high-field diffusion-weighted MRI combined with quantitative biophysical modeling of the hippocampal dendritic loss in 21-day restraint-stressed rats correlates highly with former histological findings. Our study strongly indicates that diffusion-weighted MRI is sensitive to regional dendritic loss and thus a promising candidate for non-invasive studies of dendritic plasticity in chronic stress and stress-related disorders.

  4. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  5. The JBEI quantitative metabolic modeling library (jQMM): a python library for modeling microbial metabolism

    DEFF Research Database (Denmark)

    Birkel, Garrett W.; Ghosh, Amit; Kumar, Vinay S.

    2017-01-01

    analysis, new methods for the effective use of the ever more readily available and abundant -omics data (i.e. transcriptomics, proteomics and metabolomics) are urgently needed.Results: The jQMM library presented here provides an open-source, Python-based framework for modeling internal metabolic fluxes......, it introduces the capability to use C-13 labeling experimental data to constrain comprehensive genome-scale models through a technique called two-scale C-13 Metabolic Flux Analysis (2S-C-13 MFA). In addition, the library includes a demonstration of a method that uses proteomics data to produce actionable...... insights to increase biofuel production. Finally, the use of the jQMM library is illustrated through the addition of several Jupyter notebook demonstration files that enhance reproducibility and provide the capability to be adapted to the user's specific needs.Conclusions: jQMM will facilitate the design...

  6. QUANTITATIVE ESTIMATION OF SOIL EROSION IN THE DRĂGAN RIVER WATERSHED WITH THE U.S.L.E. TYPE ROMSEM MODEL

    Directory of Open Access Journals (Sweden)

    Csaba HORVÁTH

    2008-05-01

    Full Text Available Sediment delivered from water erosion causes substantial waterway damage and water quality degradation. A number of factors, such as drainage area size, basin slope, climate, and land use/land cover, may affect sediment delivery processes. The goal of this study is to define a computationally effective and suitable soil erosion model for the Drăgan river watershed, for future sedimentation studies. A Geographic Information System (GIS) is used to determine the Universal Soil Loss Equation (U.S.L.E.) values of the studied water basin. The methods and approaches used in this study are expected to be applicable in future research and to watersheds in other regions.
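
    The U.S.L.E. itself is a simple multiplicative relation; a minimal sketch follows (factor values below are placeholders, not the Drăgan watershed values determined by the study):

        def usle(R, K, LS, C, P):
            """Universal Soil Loss Equation: mean annual soil loss A = R*K*LS*C*P.
            R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
            C: cover management, P: support practice."""
            return R * K * LS * C * P

        A = usle(R=80.0, K=0.3, LS=1.8, C=0.25, P=1.0)
        print(A)  # mean annual soil loss, e.g. t ha-1 yr-1 in the metric convention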

  7. Methods for quantitative measurement of tooth wear using the area and volume of virtual model cusps.

    Science.gov (United States)

    Kim, Soo-Hyun; Park, Young-Seok; Kim, Min-Kyoung; Kim, Sulhee; Lee, Seung-Pyo

    2018-04-01

    Clinicians must examine tooth wear to make a proper diagnosis. However, qualitative methods of measuring tooth wear have many disadvantages. Therefore, this study aimed to develop and evaluate quantitative parameters using the cusp area and volume of virtual dental models. The subjects of this study were the same virtual models that were used in our former study. The same age group classification and new tooth wear index (NTWI) scoring system were also reused. A virtual occlusal plane was generated with the highest cusp points and lowered vertically from 0.2 to 0.8 mm to create offset planes. The area and volume of each cusp was then measured and added together. In addition to the former analysis, the differential features of each cusp were analyzed. The scores of the new parameters differentiated the age and NTWI groups better than those analyzed in the former study. The Spearman ρ coefficients between the total area and the area of each cusp also showed higher scores at the levels of 0.6 mm (0.6A) and 0.8A. The mesiolingual cusp (MLC) showed a statistically significant difference (P<0.01) from the other cusps in the paired t-test. Additionally, the MLC exhibited the highest percentage of change at 0.6A in some age and NTWI groups. Regarding the age groups, the MLC showed the highest score in groups 1 and 2. For the NTWI groups, the MLC was not significantly different in groups 3 and 4. These results support the proposal that the lingual cusp exhibits rapid wear because it serves as a functional cusp. Although this study has limitations due to its cross-sectional nature, it suggests better quantitative parameters and analytical tools for the characteristics of cusp wear.

  8. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    International Nuclear Information System (INIS)

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-01-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between

  9. Validation of Quantitative Structure-Activity Relationship (QSAR Model for Photosensitizer Activity Prediction

    Directory of Open Access Journals (Sweden)

    Sharifuddin M. Zain

    2011-11-01

    Full Text Available Photodynamic therapy is a relatively new treatment method for cancer which utilizes a combination of oxygen, a photosensitizer and light to generate reactive singlet oxygen that eradicates tumors via direct cell-killing, vasculature damage and engagement of the immune system. Most of the photosensitizers that are in clinical and pre-clinical assessments, or those that are already approved for clinical use, are mainly based on cyclic tetrapyrroles. In an attempt to discover new effective photosensitizers, we report the use of the quantitative structure-activity relationship (QSAR) method to develop a model that could correlate the structural features of cyclic tetrapyrrole-based compounds with their photodynamic therapy (PDT) activity. In this study, a set of 36 porphyrin derivatives was used in the model development, where 24 of these compounds were in the training set and the remaining 12 compounds were in the test set. The development of the QSAR model involved the use of the multiple linear regression analysis (MLRA) method. Based on the method, an r2 value of 0.87, a cross-validated r2 (CV) value of 0.71 and an r2 prediction value of 0.70 were obtained. The QSAR model was also employed to predict the experimental compounds in an external test set. This external test set comprises 20 porphyrin-based compounds with experimental IC50 values ranging from 0.39 µM to 7.04 µM. Thus the model showed good correlative and predictive ability, with a predictive correlation coefficient (r2 prediction) for the external test set of 0.52. The developed QSAR model was used to discover some compounds as new lead photosensitizers from this external test set.

  10. GMM - a general microstructural model for qualitative and quantitative studies of smectite clays

    International Nuclear Information System (INIS)

    Pusch, R.; Karnland, O.; Hoekmark, H.

    1990-12-01

    A few years ago an attempt was made to accommodate a number of basic ideas on the fabric and interparticle forces that are assumed to be valid in montmorillonite clay in an integrated microstructural model, and this resulted in an SKB report on 'Outlines of models of water and gas flow through smectite clay buffers'. This model gave reasonable agreement between predicted hydraulic conductivity values and actually recorded ones for room temperature and porewater that is poor in electrolytes. The present report describes an improved model that also accounts for effects generated by salt porewater and heating, and that provides a basis both for quantitative determination of transport capacities in a more general way and for analysis and prediction of rheological behaviour in bulk. It was understood very early by investigators in this scientific field that a full understanding of the physical state of porewater is required in order to develop models of clay particle interaction. In particular, a deep insight into the nature of the interlamellar water, of the hydration mechanisms leading to an equilibrium state between the two types of water, and of force fields in matured smectite clay requires highly qualified multi-disciplinary research, and attempts have been made by the senior author to initiate and coordinate such work over the last 30 years. Despite this effort it has not been possible to reach a unanimous understanding of these issues, but a number of major features have become clearer through the work that we have been able to carry out in the current SKB research programme. Thus, NMR studies and precision measurements of the density of porewater, as well as comprehensive electron microscopy and rheological testing in combination with application of stochastic mechanics, have led to the hypothetical microstructural model - the GMM - presented in this report. (au)

  11. Self-bridging of vertical silicon nanowires and a universal capacitive force model for spontaneous attraction in nanostructures.

    Science.gov (United States)

    Sun, Zhelin; Wang, Deli; Xiang, Jie

    2014-11-25

    Spontaneous attractions between free-standing nanostructures have often caused adhesion or stiction that affects a wide range of nanoscale devices, particularly nano/microelectromechanical systems. Previous understandings of the attraction mechanisms have included capillary force, van der Waals/Casimir forces, and surface polar charges. However, none of these mechanisms universally applies to simple semiconductor structures such as silicon nanowire arrays that often exhibit bunching or adhesions. Here we propose a simple capacitive force model to quantitatively study the universal spontaneous attraction that often causes stiction among semiconductor or metallic nanostructures such as vertical nanowire arrays with inevitably nonuniform size variations due to fabrication. When nanostructures are uniform in size, they share the same substrate potential. The presence of slight size differences will break the symmetry in the capacitive network formed between the nanowires, substrate, and their environment, giving rise to electrostatic attraction forces due to the relative potential difference between neighboring wires. Our model is experimentally verified using arrays of vertical silicon nanowire pairs with varied spacing, diameter, and size differences. Threshold nanowire spacing, diameter, or size difference between the nearest neighbors has been identified beyond which the nanowires start to exhibit spontaneous attraction that leads to bridging when electrostatic forces overcome elastic restoration forces. This work illustrates a universal understanding of spontaneous attraction that will impact the design, fabrication, and reliable operation of nanoscale devices and systems.
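
    The force law itself is not quoted in the abstract; generically, the capacitive mechanism it describes can be stated as follows (all symbols are introduced here for illustration, not taken from the paper): a potential difference ΔV across the nanowire-nanowire capacitance C(d) at separation d produces an attraction

        \[
        F = \frac{1}{2}\,\Delta V^{2}\,\frac{\partial C}{\partial d},
        \]

    and bridging occurs once this attraction exceeds the elastic restoring force of the deflected nanowire (of order k·δ for an effective cantilever stiffness k and tip deflection δ).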

  12. Disentangling the Complexity of HGF Signaling by Combining Qualitative and Quantitative Modeling.

    Directory of Open Access Journals (Sweden)

    Lorenza A D'Alessandro

    2015-04-01

    Full Text Available Signaling pathways are characterized by crosstalk, feedback and feedforward mechanisms giving rise to highly complex and cell-context specific signaling networks. Dissecting the underlying relations is crucial to predict the impact of targeted perturbations. However, a major challenge in identifying cell-context specific signaling networks is the enormous number of potentially possible interactions. Here, we report a novel hybrid mathematical modeling strategy to systematically unravel hepatocyte growth factor (HGF) stimulated phosphoinositide-3-kinase (PI3K) and mitogen activated protein kinase (MAPK) signaling, which critically contribute to liver regeneration. By combining time-resolved quantitative experimental data generated in primary mouse hepatocytes with interaction graph and ordinary differential equation modeling, we identify and experimentally validate a network structure that represents the experimental data best and indicates specific crosstalk mechanisms. Whereas the identified network is robust against single perturbations, combinatorial inhibition strategies are predicted that result in strong reduction of Akt and ERK activation. Thus, by capitalizing on the advantages of the two modeling approaches, we reduce the high combinatorial complexity and identify cell-context specific signaling networks.

  13. Observing Clonal Dynamics across Spatiotemporal Axes: A Prelude to Quantitative Fitness Models for Cancer.

    Science.gov (United States)

    McPherson, Andrew W; Chan, Fong Chun; Shah, Sohrab P

    2018-02-01

    The ability to accurately model evolutionary dynamics in cancer would allow for prediction of progression and response to therapy. As a prelude to quantitative understanding of evolutionary dynamics, researchers must gather observations of in vivo tumor evolution. High-throughput genome sequencing now provides the means to profile the mutational content of evolving tumor clones from patient biopsies. Together with the development of models of tumor evolution, reconstructing evolutionary histories of individual tumors generates hypotheses about the dynamics of evolution that produced the observed clones. In this review, we provide a brief overview of the concepts involved in predicting evolutionary histories, and provide a workflow based on bulk and targeted-genome sequencing. We then describe the application of this workflow to time series data obtained for transformed and progressed follicular lymphomas (FL), and contrast the observed evolutionary dynamics between these two subtypes. We next describe results from a spatial sampling study of high-grade serous (HGS) ovarian cancer, propose mechanisms of disease spread based on the observed clonal mixtures, and provide examples of diversification through subclonal acquisition of driver mutations and convergent evolution. Finally, we state implications of the techniques discussed in this review as a necessary but insufficient step on the path to predictive modelling of disease dynamics. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.

  14. Current Challenges in the First Principle Quantitative Modelling of the Lower Hybrid Current Drive in Tokamaks

    Science.gov (United States)

    Peysson, Y.; Bonoli, P. T.; Chen, J.; Garofalo, A.; Hillairet, J.; Li, M.; Qian, J.; Shiraiwa, S.; Decker, J.; Ding, B. J.; Ekedahl, A.; Goniche, M.; Zhai, X.

    2017-10-01

    The Lower Hybrid (LH) wave is widely used in existing tokamaks for tailoring the current density profile or extending pulse duration to steady-state regimes. Its high efficiency makes it particularly attractive for a fusion reactor, leading to its consideration for this purpose in the ITER tokamak. Nevertheless, while the basics of the LH wave in tokamak plasmas are well known, quantitative modeling of experimental observations based on first principles remains a highly challenging exercise, despite considerable numerical efforts achieved so far. In this context, a rigorous methodology must be carried out in the simulations to identify the minimum number of physical mechanisms that must be considered to reproduce experimental shot-to-shot observations and also scalings (density, power spectrum). Based on recent simulations carried out for the EAST, Alcator C-Mod and Tore Supra tokamaks, the state of the art in LH modeling is reviewed. The capability of fast electron bremsstrahlung, internal inductance li and LH driven current at zero loop voltage to jointly constrain LH simulations is discussed, as well as the need for further improvements (diagnostics, codes, LH model) for robust interpretative and predictive simulations.

  15. A Tuned Value Chain Model for University Based Public Research Organisation. Case Lut Cst.

    OpenAIRE

    Vesa Karvonen; Matti Karvonen; Andrzej Kraslawski

    2012-01-01

    The Porter´s value chain model was introduced for strategic business purposes. During the last decades, universities and university-based institutes have also started to use actions similar to private business concepts. A university-based institute is not an independent actor like a company, but there are interest groups who expect it to act as if it were. This article discusses the possibility of utilizing a tuned value chain for public research organizations (PRO). Also the interact...

  16. A Tuned Value Chain Model for University Based Public Research Organisation: Case Lut Cst

    OpenAIRE

    Karvonen, Vesa; Karvonen, Matti; Kraslawski, Andrzej

    2012-01-01

    The Porter´s value chain model was introduced for strategic business purposes. During the last decades, universities and university-based institutes have also started to use actions similar to private business concepts. A university-based institute is not an independent actor like a company, but there are interest groups who expect it to act as if it were. This article discusses the possibility of utilizing a tuned value chain for public research organizations (PRO). Also the interact...

  17. New holographic scalar field models of dark energy in non-flat universe

    Energy Technology Data Exchange (ETDEWEB)

    Karami, K., E-mail: KKarami@uok.ac.i [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of); Research Institute for Astronomy and Astrophysics of Maragha (RIAAM), Maragha (Iran, Islamic Republic of); Fehri, J. [Department of Physics, University of Kurdistan, Pasdaran St., Sanandaj (Iran, Islamic Republic of)

    2010-02-08

    Motivated by the work of Granda and Oliveros [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199], we generalize their work to the non-flat case. We study the correspondence between the quintessence, tachyon, K-essence and dilaton scalar field models with the new holographic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe. In the limiting case of a flat universe, i.e. k=0, all results given in [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199] are obtained.

  18. New holographic scalar field models of dark energy in non-flat universe

    International Nuclear Information System (INIS)

    Karami, K.; Fehri, J.

    2010-01-01

    Motivated by the work of Granda and Oliveros [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199], we generalize their work to the non-flat case. We study the correspondence between the quintessence, tachyon, K-essence and dilaton scalar field models with the new holographic dark energy model in the non-flat FRW universe. We reconstruct the potentials and the dynamics for these scalar field models, which describe accelerated expansion of the universe. In the limiting case of a flat universe, i.e. k=0, all results given in [L.N. Granda, A. Oliveros, Phys. Lett. B 671 (2009) 199] are obtained.

  19. Quantitative structure-activity relationship modeling of the toxicity of organothiophosphate pesticides to Daphnia magna and Cyprinus carpio

    NARCIS (Netherlands)

    Zvinavashe, E.; Du, T.; Griff, T.; Berg, van den J.H.J.; Soffers, A.E.M.F.; Vervoort, J.J.M.; Murk, A.J.; Rietjens, I.

    2009-01-01

    Within the REACH regulatory framework in the EU, quantitative structure-activity relationships (QSAR) models are expected to help reduce the number of animals used for experimental testing. The objective of this study was to develop QSAR models to describe the acute toxicity of organothiophosphate

  20. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    Science.gov (United States)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

    The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  1. Use Case Modelling of Bingham University Library Management ...

    African Journals Online (AJOL)

    With the advent of object-oriented design, the Unified Modelling Language (UML) has become prominent in the software industry. Software is better modelled with the use of UML diagrams such as use cases, which provide a clear flow of logic and a comprehensive summary of the whole software system in a single illustration.

  2. A Leadership Model for University Geology Department Teacher Inservice Programs.

    Science.gov (United States)

    Sheldon, Daniel S.; And Others

    1983-01-01

    Provides geology departments and science educators with a leadership model for developing earth science inservice programs. Model emphasizes cooperation/coordination among departments, science educators, and curriculum specialists at local/intermediate/state levels. Includes rationale for inservice programs and geology department involvement in…

  3. Universal model of finite Reynolds number turbulent flow in channels and pipes

    NARCIS (Netherlands)

    L'vov, V.S.; Procaccia, I.; Rudenko, O.

    2008-01-01

    In this Letter, we suggest a simple and physically transparent analytical model of pressure-driven turbulent wall-bounded flows at high but finite Reynolds numbers Re. The model provides an accurate quantitative description of the profiles of the mean velocity and Reynolds stresses (second order

  4. Consistency of the tachyon warm inflationary universe models

    International Nuclear Information System (INIS)

    Zhang, Xiao-Min; Zhu, Jian-Yang

    2014-01-01

    This study concerns the consistency of the tachyon warm inflationary models. A linear stability analysis is performed to find the slow-roll conditions, characterized by the potential slow-roll (PSR) parameters, for the existence of a tachyon warm inflationary attractor in the system. The PSR parameters in the tachyon warm inflationary models are redefined. Two cases, an exponential potential and an inverse power-law potential, are studied, when the dissipative coefficient Γ = Γ_0 and Γ = Γ(φ), respectively. A crucial condition is obtained for a tachyon warm inflationary model characterized by the Hubble slow-roll (HSR) parameter ε_H, and the condition is extendable to some other inflationary models as well. A proper number of e-folds is obtained in both cases of the tachyon warm inflation, in contrast to existing works. It is also found that a constant dissipative coefficient (Γ = Γ_0) is usually not a suitable assumption for a warm inflationary model.

  5. UCODE, a computer code for universal inverse modeling

    Science.gov (United States)

    Poeter, E.P.; Hill, M.C.

    1999-01-01

    This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text-only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical, and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced; simulated equivalent values are calculated using values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters also can be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss-Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences, and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes, and (4) quantifying the uncertainty of model simulated values. UCODE is intended for use on any computer operating system.
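    To make the estimation step concrete, the following is a minimal sketch of a weighted Gauss-Newton iteration with forward-difference sensitivities, in the spirit of the method described above. It is not UCODE's actual implementation (UCODE adds Marquardt-style modifications and extensive bookkeeping), and the two-parameter decay model at the bottom is purely illustrative.

    ```python
    import numpy as np

    def gauss_newton(f, y_obs, w, p0, steps=20, fd_eps=1e-6):
        """Weighted Gauss-Newton with forward-difference sensitivities.
        f: model, maps parameters to simulated equivalents (1-D array);
        y_obs: observations; w: observation weights; p0: starting values."""
        p = np.asarray(p0, dtype=float)
        W = np.diag(w)
        for _ in range(steps):
            r = y_obs - f(p)                      # residuals to shrink
            J = np.empty((r.size, p.size))        # sensitivity (Jacobian) matrix
            for j in range(p.size):
                dp = np.zeros_like(p)
                dp[j] = fd_eps * max(abs(p[j]), 1.0)
                J[:, j] = (f(p + dp) - f(p)) / dp[j]
            step = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)  # normal equations
            p += step
            if np.linalg.norm(step) < 1e-10:
                break
        return p

    # Hypothetical two-parameter decay model, used only to exercise the routine
    t = np.linspace(0.0, 5.0, 20)
    f = lambda p: p[0] * np.exp(-p[1] * t)
    y = f([2.0, 0.7])
    print(gauss_newton(f, y, np.ones_like(y), p0=[1.0, 1.0]))   # -> [2.0, 0.7]
    ```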

  6. A quantitative evaluation of multiple biokinetic models using an assembled water phantom: A feasibility study.

    Directory of Open Access Journals (Sweden)

    Da-Ming Yeh

    This study examined the feasibility of quantitatively evaluating multiple biokinetic models and established the validity of the different compartment models using an assembled water phantom. Most commercialized phantoms are made to survey the imaging system, since this is essential to increase diagnostic accuracy for quality assurance. In contrast, few customized phantoms are specifically made to represent multi-compartment biokinetic models, because the complicated calculations required to solve the biokinetic models and the time-consuming verification of the obtained solutions have greatly impeded progress over the past decade. Nevertheless, in this work, five biokinetic models were separately defined by five groups of simultaneous differential equations to obtain the time-dependent radioactive concentration changes inside the water phantom. The water phantom was assembled from seven acrylic boxes in four different sizes, and the boxes were linked by varying combinations of hoses to represent the multiple biokinetic models from the biomedical perspective. The boxes connected by hoses were then regarded as a closed water loop with only one infusion and one drain. 129.1±24.2 MBq of Tc-99m labeled methylene diphosphonate (MDP) solution was thoroughly infused into the water boxes before gamma scanning; then the water was replaced with de-ionized water to simulate the biological removal rate among the boxes. The water was driven by an automatic infusion pump at 6.7 c.c./min, while the biological half-life of the four different-sized boxes (64, 144, 252, and 612 c.c.) was 4.8, 10.7, 18.8, and 45.5 min, respectively. The time-dependent concentrations for the boxes in the five models were estimated either by a self-developed program run in MATLAB or by scanning via a gamma camera facility. Agreement and disagreement between the practical scanning and the theoretical prediction in the five models were thoroughly discussed.
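    As an illustration of the kind of compartment equations involved, here is a minimal sketch (not the authors' MATLAB program) of a two-box washout chain, with removal rate constants derived from the quoted biological half-lives via λ = ln 2 / T½. The chain topology and initial condition are assumptions made for illustration.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Removal rate constants from the quoted biological half-lives (min),
    # via lambda = ln(2) / T_half, here for the 64 and 144 c.c. boxes.
    lam1, lam2 = np.log(2) / 4.8, np.log(2) / 10.7

    def two_box_chain(t, c):
        """Assumed topology: box 1 washes out into box 2, box 2 into the drain."""
        c1, c2 = c
        return [-lam1 * c1, lam1 * c1 - lam2 * c2]

    sol = solve_ivp(two_box_chain, (0.0, 60.0), [1.0, 0.0], dense_output=True)
    times = np.linspace(0.0, 60.0, 7)
    print(np.round(sol.sol(times), 4))  # concentration-time curves of both boxes
    ```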

  7. Universal Behavior of Few-Boson Systems Using Potential Models

    International Nuclear Information System (INIS)

    Kievsky, A.; Viviani, M.; Álvarez-Rodríguez, R.; Gattobigio, M.; Deltuva, A.

    2017-01-01

    The universal behavior of a three-boson system close to the unitary limit is encoded in a simple dependence of many observables in terms of a few parameters. For example, the product of the three-body parameter κ_* and the two-body scattering length a, κ_*a, depends on the angle ξ defined by E_3/E_2 = tan^2 ξ. A similar dependence is observed in the ratio a_AD/a, with a_AD the boson-dimer scattering length. We use a two-parameter potential to determine this simple behavior and, as an application, to compute a_AD for the case of three ^4He atoms. (author)
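    Restating the quoted relations in clean notation (the universal functions f and g are generic labels introduced here, not the paper's):

    ```latex
    \[
      \frac{E_3}{E_2} = \tan^2 \xi , \qquad
      \kappa_* \, a = f(\xi), \qquad
      \frac{a_{AD}}{a} = g(\xi)
    \]
    % A single measured ratio E_3/E_2 fixes the angle xi, which then
    % determines both kappa_* a and the boson-dimer ratio a_AD / a
    % through the universal functions f and g.
    ```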

  8. Analysis of Water Conflicts across Natural and Societal Boundaries: Integration of Quantitative Modeling and Qualitative Reasoning

    Science.gov (United States)

    Gao, Y.; Balaram, P.; Islam, S.

    2009-12-01

    , the knowledge generated from these studies cannot be easily generalized or transferred to other basins. Here, we present an approach to integrate quantitative and qualitative methods to study water issues and capture the contextual knowledge of water management, by combining the NSSs framework and an area of artificial intelligence called qualitative reasoning. Using the Apalachicola-Chattahoochee-Flint (ACF) River Basin dispute as an example, we demonstrate how quantitative modeling and qualitative reasoning can be integrated to examine the impact of over-abstraction of water from the river on the ecosystem and the role of governance in shaping the evolution of the ACF water dispute.

  9. Interacting polytropic gas model of phantom dark energy in non-flat universe

    International Nuclear Information System (INIS)

    Karami, K.; Ghaffari, S.; Fehri, J.

    2009-01-01

    By introducing the polytropic gas model of interacting dark energy, we obtain the equation of state for the polytropic gas energy density in a non-flat universe. We show that for an even polytropic index, by choosing K > Ba^(3/n), one can obtain ω_Λ^eff < -1, which corresponds to a universe dominated by phantom dark energy. (orig.)

  10. Entropy - Some Cosmological Questions Answered by Model of Expansive Nondecelerative Universe

    Directory of Open Access Journals (Sweden)

    Miroslav Sukenik

    2003-01-01

    The paper summarizes the background of the Expansive Nondecelerative Universe model and its potential to offer answers to some open cosmological questions related to entropy. Three problems are examined in more detail, namely Hawking's phenomenon of black hole evaporation, the maximum entropy of the Universe during its evolution, and the time evolution of specific entropy.

  11. Compact baby universe model in ten dimension and probability function of quantum gravity

    International Nuclear Information System (INIS)

    Yan Jun; Hu Shike

    1991-01-01

    The quantum probability functions are calculated for a ten-dimensional compact baby universe model. The authors find that the probability for the Yang-Mills baby universe to undergo a spontaneous compactification down to a four-dimensional spacetime is greater than that to remain in the original homogeneous multidimensional state. Some questions about the large-wormhole catastrophe are also discussed.

  12. The Analysis of Organizational Diagnosis on Based Six Box Model in Universities

    Science.gov (United States)

    Hamid, Rahimi; Siadat, Sayyed Ali; Reza, Hoveida; Arash, Shahin; Ali, Nasrabadi Hasan; Azizollah, Arbabisarjou

    2011-01-01

    Purpose: To analyze organizational diagnosis based on the six-box model at universities. Research method: The research method was a descriptive survey. The statistical population consisted of 1544 university faculty members, from whom 218 persons were chosen as the sample through stratified random sampling. The research instrument was an organizational…

  13. The Charlotte Action Research Project: A Model for Direct and Mutually Beneficial Community-University Engagement

    Science.gov (United States)

    Morrell, Elizabeth; Sorensen, Janni; Howarth, Joe

    2015-01-01

    This article describes the evolution of the Charlotte Action Research Project (CHARP), a community-university partnership founded in 2008 at the University of North Carolina at Charlotte, and focuses particularly on the program's unique organizational structure. Research findings of a project evaluation suggest that the CHARP model's unique…

  14. Leo Szilard Lectureship Award Talk - Universal Scaling Laws from Cells to Cities; A Physicist's Search for Quantitative, Unified Theories of Biological and Social Structure and Dynamics

    Science.gov (United States)

    West, Geoffrey

    2013-04-01

    Many of the most challenging, exciting and profound questions facing science and society, from the origins of life to global sustainability, fall under the banner of "complex adaptive systems." This talk explores how scaling can be used to begin to develop physics-inspired quantitative, predictive, coarse-grained theories for understanding their structure, dynamics and organization based on underlying mathematisable principles. Remarkably, most physiological, organisational and life history phenomena in biology and socio-economic systems scale in a simple and "universal" fashion: metabolic rate scales approximately as the 3/4-power of mass over 27 orders of magnitude from complex molecules to the largest organisms. Time-scales (such as lifespans and growth-rates) and sizes (such as genome lengths and RNA densities) scale with exponents which are typically simple multiples of 1/4, suggesting that fundamental constraints underlie much of the generic structure and dynamics of living systems. These scaling laws follow from dynamical and geometrical properties of space-filling, fractal-like, hierarchical branching networks, presumed optimised by natural selection. This leads to a general framework that potentially captures essential features of diverse systems including vasculature, ontogenetic growth, cancer, aging and mortality, sleep, cell size, and DNA nucleotide substitution rates. Cities and companies also scale: wages, profits, patents, crime, disease, pollution, and road lengths scale similarly across the globe, reflecting underlying universal social network dynamics which point to general principles of organization transcending their individuality. These have dramatic implications for global sustainability: innovation and wealth creation that fuel social systems, left unchecked, potentially sow the seeds for their inevitable collapse.

  15. Education services quality of Kashan Medical Science University, based on SERVQUAL model in viewpoints of students

    Directory of Open Access Journals (Sweden)

    Ebrahim Kouchaki

    2017-01-01

    Introduction: Sustainable development of higher education systems, as dynamic systems, requires coherent growth in both qualitative and quantitative dimensions. Since students are the major clients of higher education systems and their perspectives can play a key role in the quality promotion of services, this study was conducted based on the SERVQUAL model, aiming to assess the quality of educational services at Kashan Medical Science University in 2016. Study Methodology: A total of 212 students of Kashan Medical Science University were selected from a population of 616 through random sampling using Morgan's table for this descriptive-analytical research. The data collection tool was the standard SERVQUAL questionnaire, composed of a section on basic information plus 28 items rated on a six-option Likert scale, measuring the current and desired (expected) conditions of services quality. The difference between the averages of the current and desired statuses was measured as the services gap. Descriptive and inferential statistics were used to analyze the obtained data. Results: The students averaged 23 ± 1.8 years of age; 65% (138 subjects) were female and 35% (74 subjects) were male. About 72% (153 subjects) were single and 28% (59 subjects) were married. The obtained results revealed a negative gap in all dimensions of quality. The results also showed that the minimum gap was obtained for learning-assist tools (physical and tangibility dimension), with an amount of -0.38, and the maximum gap for guide instructor availability when needed by the students (accountability dimension), with an amount of -2.42. The total means of the students' perceptions and expectations were 2.28 and 3.85, respectively. Conclusion: Given the negative gap obtained for all dimensions of educational services quality and the failure to meet the students' expectations, it is recommended to assign further resources.
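    The gap computation itself is simple arithmetic, sketched below. Only the -0.38 and -2.42 gaps and the overall means (2.28 vs. 3.85) are quoted above, so the per-dimension scores here are illustrative placeholders, and the dimension names follow generic SERVQUAL usage rather than the paper's exact wording.

    ```python
    import numpy as np

    # SERVQUAL gap = mean perception - mean expectation, per dimension.
    # Scores are placeholders on the six-option Likert scale, not study data.
    dimensions   = ["tangibility", "reliability", "responsiveness",
                    "assurance", "empathy"]
    perceptions  = np.array([3.47, 2.30, 1.43, 2.40, 2.80])
    expectations = np.array([3.85, 3.90, 3.85, 3.80, 3.70])

    for dim, gap in zip(dimensions, perceptions - expectations):
        print(f"{dim:15s} gap = {gap:+.2f}")   # negative -> expectation unmet
    ```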

  16. The Standard Model with one universal extra dimension

    Indian Academy of Sciences (India)

    An exhaustive list of the explicit expressions for all physical couplings induced by the … the standard Green's functions, which implies that the Standard Model observables do … renormalizability of standard Green's functions is implicit in this.

  17. Game Based Learning (GBL) Adoption Model for Universities ...

    African Journals Online (AJOL)


    2018-03-05

    Mar 5, 2018 … faced while adopting the Game Based Learning (GBL) model, its benefits and … preferred traditional lecture styles, 7% online classes and 34% preferred … students in developing problem-solving skills, which in return may help …

  18. Establishment of quantitative severity evaluation model for spinal cord injury by metabolomic fingerprinting.

    Directory of Open Access Journals (Sweden)

    Jin Peng

    Spinal cord injury (SCI) is a devastating event with a limited hope for recovery, and represents an enormous public health issue. It is crucial to understand the disturbances in the metabolic network after SCI to identify injury mechanisms and opportunities for treatment intervention. Through plasma 1H-nuclear magnetic resonance (NMR) screening, we identified 15 metabolites that made up an "Eigen-metabolome" capable of distinguishing rats with severe SCI from healthy control rats. Forty enzymes regulated these 15 metabolites in the metabolic network. We also found that 16 metabolites regulated by 130 enzymes in the metabolic network impacted neurobehavioral recovery. Using the Eigen-metabolome, we established a linear discrimination model to cluster rats with severe and mild SCI and control rats into separate groups and to identify the interactive relationships between metabolic biomarkers in the global metabolic network. We identified 10 clusters in the global metabolic network and defined them as distinct metabolic disturbance domains of SCI. Metabolic paths such as retinal, glycerophospholipid, and arachidonic acid metabolism; the NAD-NADPH conversion process; tyrosine metabolism; and cadaverine and putrescine metabolism were included. In summary, we presented a novel interdisciplinary method that integrates metabolomics and global metabolic network analysis to visualize metabolic network disturbances after SCI. Our study demonstrated that the systems biology paradigm integrating 1H-NMR, metabolomics, and global metabolic network analysis is useful for visualizing complex metabolic disturbances after severe SCI. Furthermore, our findings may provide a new quantitative injury severity evaluation model for clinical use.

  19. Currency risk and prices of oil and petroleum products: a simulation with a quantitative model

    International Nuclear Information System (INIS)

    Aniasi, L.; Ottavi, D.; Rubino, E.; Saracino, A.

    1992-01-01

    This paper analyzes the relationship between the exchange rates of the US Dollar against the four major European currencies and the prices of oil and its main products in those countries. In fact, the sensitivity of prices to exchange-rate movements is of fundamental importance for the refining and distribution industries of importing countries. The result of the analysis shows that neither in free-market conditions, such as those present in Great Britain, France and Germany, nor in regulated markets, i.e., the Italian one, do the variations of petroleum product prices fully absorb the variations of the exchange rates. In order to assess the above relationship, we first tested the order of co-integration of the time series of exchange rates of EMS currencies with those of international prices of oil and its derivative products; we then used a transfer-function model to reproduce the quantitative relationships between those variables. Using these results, we reproduced domestic price functions with partial adjustment mechanisms. Finally, we used the above model to run a simulation of the deviation from the steady-state pattern caused by exchange-rate exogenous shocks. 21 refs., 5 figs., 3 tabs

  20. Quantitative modelling of amyloidogenic processing and its influence by SORLA in Alzheimer's disease.

    Science.gov (United States)

    Schmidt, Vanessa; Baum, Katharina; Lao, Angelyn; Rateitschak, Katja; Schmitz, Yvonne; Teichmann, Anke; Wiesner, Burkhard; Petersen, Claus Munck; Nykjaer, Anders; Wolf, Jana; Wolkenhauer, Olaf; Willnow, Thomas E

    2012-01-04

    The extent of proteolytic processing of the amyloid precursor protein (APP) into neurotoxic amyloid-β (Aβ) peptides is central to the pathology of Alzheimer's disease (AD). Accordingly, modifiers that increase Aβ production rates are risk factors in the sporadic form of AD. In a novel systems biology approach, we combined quantitative biochemical studies with mathematical modelling to establish a kinetic model of amyloidogenic processing, and to evaluate the influence by SORLA/SORL1, an inhibitor of APP processing and important genetic risk factor. Contrary to previous hypotheses, our studies demonstrate that secretases represent allosteric enzymes that require cooperativity by APP oligomerization for efficient processing. Cooperativity enables swift adaptive changes in secretase activity with even small alterations in APP concentration. We also show that SORLA prevents APP oligomerization both in cultured cells and in the brain in vivo, eliminating the preferred form of the substrate and causing secretases to switch to a less efficient non-allosteric mode of action. These data represent the first mathematical description of the contribution of genetic risk factors to AD substantiating the relevance of subtle changes in SORLA levels for amyloidogenic processing as proposed for patients carrying SORL1 risk alleles.

  1. Automatic and quantitative measurement of collagen gel contraction using model-guided segmentation

    Science.gov (United States)

    Chen, Hsin-Chen; Yang, Tai-Hua; Thoreson, Andrew R.; Zhao, Chunfeng; Amadio, Peter C.; Sun, Yung-Nien; Su, Fong-Chin; An, Kai-Nan

    2013-08-01

    Quantitative measurement of collagen gel contraction plays a critical role in the field of tissue engineering because it provides spatial-temporal assessment (e.g., changes of gel area and diameter during the contraction process) reflecting the cell behavior and tissue material properties. So far the assessment of collagen gels relies on manual segmentation, which is time-consuming and suffers from serious intra- and inter-observer variability. In this study, we propose an automatic method combining various image processing techniques to resolve these problems. The proposed method first detects the maximal feasible contraction range of circular references (e.g., culture dish) and avoids the interference of irrelevant objects in the given image. Then, a three-step color conversion strategy is applied to normalize and enhance the contrast between the gel and background. We subsequently introduce a deformable circular model which utilizes regional intensity contrast and circular shape constraint to locate the gel boundary. An adaptive weighting scheme was employed to coordinate the model behavior, so that the proposed system can overcome variations of gel boundary appearances at different contraction stages. Two measurements of collagen gels (i.e., area and diameter) can readily be obtained based on the segmentation results. Experimental results, including 120 gel images for accuracy validation, showed high agreement between the proposed method and manual segmentation with an average dice similarity coefficient larger than 0.95. The results also demonstrated obvious improvement in gel contours obtained by the proposed method over two popular, generic segmentation methods.
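    For the two measurements reported (area and diameter) and the dice similarity coefficient used for validation, a minimal numpy sketch follows. The deformable circular model itself is not reproduced here; the equal-area-circle diameter and the toy masks are assumptions made for illustration.

    ```python
    import numpy as np

    def dice_coefficient(mask_a, mask_b):
        """Dice similarity between two boolean segmentation masks."""
        inter = np.logical_and(mask_a, mask_b).sum()
        return 2.0 * inter / (mask_a.sum() + mask_b.sum())

    def gel_measurements(mask, mm_per_pixel):
        """Area and equivalent-circle diameter from a segmented gel mask."""
        area = mask.sum() * mm_per_pixel ** 2
        diameter = 2.0 * np.sqrt(area / np.pi)  # diameter of equal-area circle
        return area, diameter

    # Toy masks standing in for automatic vs. manual segmentations
    yy, xx = np.mgrid[:200, :200]
    auto   = (xx - 100) ** 2 + (yy - 100) ** 2 < 80 ** 2
    manual = (xx - 102) ** 2 + (yy - 100) ** 2 < 79 ** 2
    print(dice_coefficient(auto, manual))        # ~0.97, cf. the >0.95 reported
    print(gel_measurements(auto, mm_per_pixel=0.05))
    ```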

  2. Generating quantitative models describing the sequence specificity of biological processes with the stabilized matrix method

    Directory of Open Access Journals (Sweden)

    Sette Alessandro

    2005-05-01

    Background: Many processes in molecular biology involve the recognition of short sequences of nucleic or amino acids, such as the binding of immunogenic peptides to major histocompatibility complex (MHC) molecules. From experimental data, a model of the sequence specificity of these processes can be constructed, such as a sequence motif, a scoring matrix or an artificial neural network. The purpose of these models is two-fold. First, they can provide a summary of experimental results, allowing for a deeper understanding of the mechanisms involved in sequence recognition. Second, such models can be used to predict the experimental outcome for yet untested sequences. In the past we reported the development of a method to generate such models called the Stabilized Matrix Method (SMM). This method has been successfully applied to predicting peptide binding to MHC molecules, peptide transport by the transporter associated with antigen presentation (TAP) and proteasomal cleavage of protein sequences. Results: Herein we report the implementation of the SMM algorithm as a publicly available software package. Specific features determining the type of problems the method is most appropriate for are discussed. Advantageous features of the package are: (1) the output generated is easy to interpret, (2) input and output are both quantitative, (3) specific computational strategies to handle experimental noise are built in, (4) the algorithm is designed to effectively handle bounded experimental data, (5) experimental data from randomized peptide libraries and conventional peptides can easily be combined, and (6) it is possible to incorporate pair interactions between positions of a sequence. Conclusion: Making the SMM method publicly available enables bioinformaticians and experimental biologists to easily access it, to compare its performance to other prediction methods, and to extend it to other applications.
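    To illustrate what a position-specific scoring matrix prediction looks like, here is a minimal sketch. It is not the SMM package: the stabilization is reduced to plain ridge regression, which is an assumption, whereas the real method handles bounded data and experimental noise explicitly. The toy peptides and values are placeholders.

    ```python
    import numpy as np

    AAS = "ACDEFGHIKLMNPQRSTVWY"

    def encode(peptides):
        """One-hot encode equal-length peptides: one column per (position, residue)."""
        L = len(peptides[0])
        X = np.zeros((len(peptides), L * 20))
        for i, pep in enumerate(peptides):
            for pos, aa in enumerate(pep):
                X[i, pos * 20 + AAS.index(aa)] = 1.0
        return X

    def fit_matrix(peptides, y, lam=1.0):
        """Ridge-regularized fit of a position-specific scoring matrix
        (a stand-in for SMM's stabilization term)."""
        X = encode(peptides)
        n = X.shape[1]
        w = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
        return w.reshape(-1, 20)          # rows: positions, cols: residues

    def score(matrix, pep):
        """Predicted value = sum of matrix entries along the sequence."""
        return sum(matrix[pos, AAS.index(aa)] for pos, aa in enumerate(pep))

    peps = ["ACDEF", "AKDEF", "ACDEW", "GCDEF"]   # toy 5-mers
    y = np.array([1.2, 0.4, 2.0, 1.1])            # e.g. measured binding values
    m = fit_matrix(peps, y)
    print(round(score(m, "ACDEF"), 2))
    ```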

  3. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability for some intermediate events may have large uncertainty, the uncertainty can be characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and a 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and a 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
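    The event-tree arithmetic behind such a model is straightforward: crash frequency is split along conditional branch probabilities, and each leaf carries a consequence. A heavily pruned sketch follows; only two of the seven intermediate events are kept, and all numbers are illustrative, not the Michigan data.

    ```python
    from itertools import product

    crash_freq = 12.0                    # assumed work-zone crashes per year
    branches = {
        "speed":    {"normal": 0.8, "slowed": 0.2},
        "severity": {"injury": 0.7, "fatal": 0.3},
    }
    consequence = {("normal", "injury"): 1.2, ("normal", "fatal"): 1.0,
                   ("slowed", "injury"): 0.8, ("slowed", "fatal"): 1.0}

    societal_risk = 0.0
    for path in product(branches["speed"], branches["severity"]):
        p = branches["speed"][path[0]] * branches["severity"][path[1]]
        societal_risk += crash_freq * p * consequence[path]  # freq x prob x loss
    print(f"expected casualties/year: {societal_risk:.2f}")
    ```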

  4. Quantitative Phosphoproteomics Reveals Wee1 Kinase as a Therapeutic Target in a Model of Proneural Glioblastoma.

    Science.gov (United States)

    Lescarbeau, Rebecca S; Lei, Liang; Bakken, Katrina K; Sims, Peter A; Sarkaria, Jann N; Canoll, Peter; White, Forest M

    2016-06-01

    Glioblastoma (GBM) is the most common malignant primary brain cancer. With a median survival of about a year, new approaches to treating this disease are necessary. To identify signaling molecules regulating GBM progression in a genetically engineered murine model of proneural GBM, we quantified phosphotyrosine-mediated signaling using mass spectrometry. Oncogenic signals, including phosphorylated ERK MAPK, PI3K, and PDGFR, were found to be increased in the murine tumors relative to brain. Phosphorylation of CDK1 pY15, associated with the G2 arrest checkpoint, was identified as the most differentially phosphorylated site, with a 14-fold increase in phosphorylation in the tumors. To assess the role of this checkpoint as a potential therapeutic target, syngeneic primary cell lines derived from these tumors were treated with MK-1775, an inhibitor of Wee1, the kinase responsible for CDK1 Y15 phosphorylation. MK-1775 treatment led to mitotic catastrophe, as defined by increased DNA damage and cell death by apoptosis. To assess the extensibility of targeting Wee1/CDK1 in GBM, patient-derived xenograft (PDX) cell lines were also treated with MK-1775. Although the response was more heterogeneous, on-target Wee1 inhibition led to decreased CDK1 Y15 phosphorylation and increased DNA damage and apoptosis in each line. These results were also validated in vivo, where single-agent MK-1775 demonstrated an antitumor effect on a flank PDX tumor model, increasing mouse survival by 1.74-fold. This study highlights the ability of unbiased quantitative phosphoproteomics to reveal therapeutic targets in tumor models, and the potential for Wee1 inhibition as a treatment approach in preclinical models of GBM. Mol Cancer Ther; 15(6); 1332-43. ©2016 AACR. ©2016 American Association for Cancer Research.

  5. Modelling Facebook Usage among University Students in Thailand: The Role of Emotional Attachment in an Extended Technology Acceptance Model

    Science.gov (United States)

    Teo, Timothy

    2016-01-01

    The aim of this study is to examine the factors that influenced the use of Facebook among university students. Using an extended technology acceptance model (TAM) with emotional attachment (EA) as an external variable, a sample of 498 students from a public-funded Thailand university were surveyed on their responses to five variables hypothesized…

  6. Measuring effectiveness of a university by a parallel network DEA model

    Science.gov (United States)

    Kashim, Rosmaini; Kasim, Maznah Mat; Rahman, Rosshairy Abd

    2017-11-01

    Universities contribute significantly to the development of human capital and the socio-economic improvement of a country. Because of that, Malaysian universities have carried out various initiatives to improve their performance. Most studies have used the Data Envelopment Analysis (DEA) model to measure efficiency rather than effectiveness, even though the measurement of effectiveness is important to realize how effective a university is in achieving its ultimate goals. A university system has two major functions, namely teaching and research, and every function has different resources based on its emphasis. Therefore, a university is actually structured as a parallel production system whose overall effectiveness is the aggregated effectiveness of teaching and research. Hence, this paper proposes a parallel network DEA model to measure the effectiveness of a university. This model takes the internal operations of both the teaching and research functions into account in computing the effectiveness of a university system. In the literature, graduates and the number of programs offered are defined as the outputs; the employed graduates and the number of programs accredited by professional bodies are then considered the outcomes for measuring teaching effectiveness. The amount of grants is regarded as the output of research, while publications of different quality are considered the outcomes of research. A system is considered effective only if all functions are effective. This model has been tested using a hypothetical set of data consisting of 14 faculties at a public university in Malaysia. The results show that none of the faculties is relatively effective in overall performance. Three faculties are effective in teaching and two faculties are effective in research. The potential applications of the parallel network DEA model allow the top management of a university to identify weaknesses in any function in their universities and take rational steps for improvement.
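    For readers unfamiliar with DEA, the sketch below solves the plain input-oriented CCR model as a linear program; this is much simpler than the parallel network formulation in the paper and the faculty data are toy numbers, but it shows the mechanics of computing a relative score per decision-making unit.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """Input-oriented CCR efficiency of unit j0.
        X: inputs (m x n units), Y: outputs (s x n units).
        Minimize theta s.t. sum(l*x) <= theta*x0, sum(l*y) >= y0, l >= 0."""
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambdas]
        A_in  = np.c_[-X[:, [j0]], X]               # input envelopment rows
        A_out = np.c_[np.zeros((s, 1)), -Y]         # output envelopment rows
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        return res.x[0]

    # Toy data: 2 inputs (staff, funding) and 2 outputs (graduates, grants)
    X = np.array([[20., 30., 25., 40.], [5., 8., 6., 9.]])
    Y = np.array([[100., 120., 90., 150.], [3., 6., 2., 7.]])
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
    ```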

  7. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    Science.gov (United States)

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  8. A University/Community Collaborative Model on Empowerment in Elementary Education.

    Science.gov (United States)

    Goeke, John C.; And Others

    1995-01-01

    Collaboration is growing among schools and community services for youth, their families, and now, university graduate programs. Proposes a structural model for collaboration which implements the concept of empowerment and designs sustainable working relationships over time. (DR)

  9. DEVELOPMENT OF MODEL FOR QUANTITATIVE EVALUATION OF DYNAMICALLY STABLE FORMS OF RIVER CHANNELS

    Directory of Open Access Journals (Sweden)

    O. V. Zenkin

    2017-01-01

    systems. The determination of regularities in the development of bed forms and of quantitative relations between their parameters is based on modeling the "right" forms of the riverbed. The research has resulted in establishing and testing a simulation modeling methodology which allows one to identify dynamically stable forms of the riverbed.

  10. Stability of the Einstein static universe in open cosmological models

    International Nuclear Information System (INIS)

    Canonico, Rosangela; Parisi, Luca

    2010-01-01

    The stability properties of the Einstein static solution of general relativity are altered when corrective terms arising from modification of the underlying gravitational theory appear in the cosmological equations. In this paper the existence and stability of static solutions are considered in the framework of two recently proposed quantum gravity models. The previously known analysis of the Einstein static solutions in the semiclassical regime of loop quantum cosmology with modifications to the gravitational sector is extended to open cosmological models where a static neutrally stable solution is found. A similar analysis is also performed in the framework of Horava-Lifshitz gravity under detailed balance and projectability conditions. In the case of open cosmological models the two solutions found can be either unstable or neutrally stable according to the admitted values of the parameters.

  11. The University – a Rational-Biologic Model

    Directory of Open Access Journals (Sweden)

    Ion Gh. Rosca

    2008-05-01

    The article advances the extension of the biologic rational model for organizations, which are reprocessing and living in a turbulent environment. The current "tree"-type organizations are not able to satisfy the requirements of the socio-economic environment and are not able to provide organizational perpetuation and development. Thus, an innovative performing model for both the top and down management areas is presented, with the following recommendations: dividing the organization into departments using neuronal connections, focusing on the formatting processes and not on the activities, and rethinking the system toward a new organizational culture.

  12. Quantitative structure activity relationship model for predicting the depletion percentage of skin allergic chemical substances of glutathione

    International Nuclear Information System (INIS)

    Si Hongzong; Wang Tao; Zhang Kejun; Duan Yunbo; Yuan Shuping; Fu Aiping; Hu Zhide

    2007-01-01

    A quantitative model was developed to predict the depletion percentage of glutathione (DPG) by skin-allergic chemical compounds using gene expression programming (GEP). Each compound was represented by several calculated structural descriptors involving constitutional, topological, geometrical, electrostatic and quantum-chemical features. The GEP method produced a nonlinear, five-descriptor quantitative model with a mean error and a correlation coefficient of 10.52 and 0.94 for the training set, and 22.80 and 0.85 for the test set, respectively. The GEP predictions are shown to be in good agreement with experimental values, and better than those of the heuristic method.

  13. Sustainability and scalability of university spinouts:a business model perspective

    OpenAIRE

    Ziaee Bigdeli, Ali; Li, Feng; Shi, Xiaohui

    2015-01-01

    Most previous studies of university spinouts (USOs) have focused on what determines their formation from the perspectives of the entrepreneurs or of their parent universities. However, few studies have investigated how these entrepreneurial businesses actually grow and how their business models evolve in the process. This paper examines the evolution of USOs' business models over their different development phases. Using empirical evidence gathered from three comprehensive case studies, we ex...

  14. Improved quantitative 90Y bremsstrahlung SPECT/CT reconstruction with Monte Carlo scatter modeling.

    Science.gov (United States)

    Dewaraja, Yuni K; Chun, Se Young; Srinivasa, Ravi N; Kaza, Ravi K; Cuneo, Kyle C; Majdalany, Bill S; Novelli, Paula M; Ljungberg, Michael; Fessler, Jeffrey A

    2017-12-01

    In 90Y microsphere radioembolization (RE), accurate post-therapy imaging-based dosimetry is important for establishing absorbed dose versus outcome relationships and for developing future treatment planning strategies. Additionally, accurately assessing microsphere distributions is important because of concerns about unexpected activity deposition outside the liver. Quantitative 90Y imaging by either SPECT or PET is challenging. In 90Y SPECT, model-based methods are necessary for scatter correction because energy-window-based methods are not feasible with the continuous bremsstrahlung energy spectrum. The objective of this work was to implement and evaluate a scatter estimation method for accurate 90Y bremsstrahlung SPECT/CT imaging. Since a fully Monte Carlo (MC) approach to 90Y SPECT reconstruction is computationally very demanding, in the present study the scatter estimate generated by an MC simulator was combined with an analytical projector in the 3D OS-EM reconstruction model. A single window (105-195 keV) was used for both the acquisition and the projector modeling. A liver/lung torso phantom with intrahepatic lesions and low-uptake extrahepatic objects was imaged to evaluate SPECT/CT reconstruction without and with scatter correction. Clinical application was demonstrated by applying the reconstruction approach to five patients treated with RE to determine lesion and normal liver activity concentrations using a (liver) relative calibration. There was convergence of the scatter estimate after just two updates, greatly reducing computational requirements. In the phantom study, compared with reconstruction without scatter correction, MC scatter modeling gave substantial improvement in activity recovery in intrahepatic lesions (from > 55% to > 86%), normal liver (from 113% to 104%), and lungs (from 227% to 104%) with only a small degradation in noise (13% vs. 17%). Similarly, with scatter modeling, contrast improved substantially both visually and in
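    The reconstruction update described above (analytical projector plus an additive MC scatter term) follows the standard EM form; here is a minimal MLEM sketch with a scatter estimate in the forward model. The clinical OS-EM applies the same update over projection subsets. The tiny synthetic system below is an assumption used only to exercise the routine.

    ```python
    import numpy as np

    def mlem_with_scatter(A, y, scatter, n_iter=20):
        """MLEM with an additive scatter estimate: y ~ Poisson(A x + s).
        A: system matrix (n_bins x n_voxels), y: measured projections."""
        x = np.ones(A.shape[1])
        sens = A.T @ np.ones(A.shape[0])          # sensitivity image
        for _ in range(n_iter):
            fwd = A @ x + scatter                 # projector + scatter estimate
            x *= (A.T @ (y / fwd)) / sens         # multiplicative EM update
        return x

    # Tiny synthetic check: 6 projection bins, 4 voxels
    rng = np.random.default_rng(0)
    A = rng.uniform(0.1, 1.0, (6, 4))
    x_true = np.array([2.0, 0.5, 1.0, 3.0])
    s = np.full(6, 0.2)
    y = rng.poisson(A @ x_true + s).astype(float)
    print(np.round(mlem_with_scatter(A, y, s, 200), 2))
    ```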

  15. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion and thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminium coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg-system are calculated with the Sauer-Freise method for the first time. To solve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
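    As a flavor of the forward model such work compares against measured Al/Mg profiles, here is a minimal explicit finite-difference solution of Fick's second law with a concentration-dependent interdiffusion coefficient. This is not the author's temporally varied discretization scheme, and the exponential D(c) is a made-up placeholder.

    ```python
    import numpy as np

    def interdiffusion_profile(D_of_c, L=1e-4, nx=200, t_end=3600.0):
        """Explicit conservative finite-difference scheme for a diffusion
        couple: Al-rich half (c=1) against Mg-rich half (c=0)."""
        dx = L / nx
        c = np.where(np.arange(nx) < nx // 2, 1.0, 0.0)   # step initial profile
        dt = 0.2 * dx**2 / D_of_c(c).max()                # explicit stability limit
        t = 0.0
        while t < t_end:
            D_half = 0.5 * (D_of_c(c[1:]) + D_of_c(c[:-1]))  # interface values
            flux = -D_half * np.diff(c) / dx
            c[1:-1] -= dt / dx * np.diff(flux)            # conservative update
            t += dt
        return c

    # Hypothetical exponential composition dependence (m^2/s), for illustration
    D = lambda c: 1e-14 * np.exp(2.0 * c)
    profile = interdiffusion_profile(D)
    print(profile[95:105].round(3))   # smoothed step around the interface
    ```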

  16. Study of primitive universe in the Bianchi IX model

    International Nuclear Information System (INIS)

    Matsas, G.E.A.

    1988-03-01

    The theory of general relativity is used to study the homogeneous cosmological Bianchi IX model with isometry group SO(3) near the cosmological singularity. The Bogoyavlenskii-Novikov formalism is introduced to explain the unusual behaviour of the Liapunov exponent associated with this chaotic system. (author) [pt

  17. A quantitative exposure model simulating human norovirus transmission during preparation of deli sandwiches.

    Science.gov (United States)

    Stals, Ambroos; Jacxsens, Liesbeth; Baert, Leen; Van Coillie, Els; Uyttendaele, Mieke

    2015-03-02

    Human noroviruses (HuNoVs) are a major cause of foodborne gastroenteritis worldwide. They are often transmitted via infected and shedding food handlers manipulating foods such as deli sandwiches. The presented study aimed to simulate HuNoV transmission during the preparation of deli sandwiches in a sandwich bar. A quantitative exposure model was developed by combining the GoldSim® and @Risk® software packages. Input data were collected from the scientific literature and from a two-week observational study performed at two sandwich bars. The model included three food handlers working during a three-hour shift on a shared working surface where deli sandwiches are prepared. The model consisted of three components. The first component simulated the preparation of the deli sandwiches and contained the HuNoV reservoirs, locations within the model allowing the accumulation of HuNoV, and the working of intervention measures. The second component covered the contamination sources, being (1) the initially HuNoV-contaminated lettuce used on the sandwiches and (2) HuNoV originating from a shedding food handler. The third component included four possible intervention measures to reduce HuNoV transmission: hand and surface disinfection during preparation of the sandwiches, hand gloving, and hand washing after a restroom visit. A single HuNoV-shedding food handler could cause mean levels of 43±18, 81±37 and 18±7 HuNoV particles on the deli sandwiches, hands and working surfaces, respectively. Introduction of contaminated lettuce as the only source of HuNoV resulted in the presence of 6.4±0.8 and 4.3±0.4 HuNoV particles on the food and hand reservoirs. The inclusion of hand and surface disinfection and hand gloving as single intervention measures was not effective in the model, as only marginal reductions of HuNoV levels were noticeable in the different reservoirs. High compliance with hand washing after a restroom visit did reduce HuNoV presence substantially on all reservoirs.

  18. Model developments for quantitative estimates of the benefits of the signals on nuclear power plant availability and economics

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    1993-01-01

    A novel framework for quantitative estimates of the benefits of signals on nuclear power plant availability and economics has been developed in this work. The models developed in this work quantify how the perfect signals affect the human operator's success in restoring the power plant to the desired state when it enters undesirable transients. Also, the models quantify the economic benefits of these perfect signals. The models have been applied to the condensate feedwater system of the nuclear power plant for demonstration. (Author)

  19. New model of universal gas-filled neutron tube

    International Nuclear Information System (INIS)

    Bespalov, D.F.; Bessarabskii, I.G.; Voitsik, L.R.; Mints, A.Z.

    1985-01-01

    The UNG-1 gas-filled neutron tube is serially produced. In type UNG neutron generators, the tube operates in the pulsed mode in the high-voltage doubling circuit arrangement. During extended operation, its advantages were discovered: long operating time, fairly stable neutron yield, and simplicity of use and operation. However, the mean neutron yield (approx. 10^7 s^-1) generated by the tube in the optimal mode has proved inadequate for solving numerous geophysical problems. So a new model of the neutron tube, model UNG-2, was designed, ensuring an enhanced neutron yield of 10^8 s^-1 in the continuous-operating mode. When the tube is connected to the high-voltage doubling circuit, the mean neutron yield is only somewhat in excess of the neutron yield from the UNG-1 tube.

  20. A universal model for languages and cities, and their lifetimes

    OpenAIRE

    Tuncay, Caglar

    2007-01-01

    Present human languages display a slightly asymmetric log-normal (Gaussian) size distribution [1-3], whereas present cities follow a power law (Pareto-Zipf law) [4]. Our model considers the competition between languages and that between cities in terms of growth (a multiplicative noise process) [5] and fragmentation [6], where the relevant parameters are (naturally) different for languages and cities. We consider the lifetime distribution for old and living languages and that for old and living cities. We...

  1. A MODEL OF STUDENTS’ UNIVERSITY DECISION-MAKING BEHAVIOR

    OpenAIRE

    Ionela MANIU; George C. MANIU

    2014-01-01

    Over the last decade the higher education institutional framework has undergone a major transformation: the increasing influence of market competition on academic life, or "marketization". Consequently, HEI attention is increasingly focused on attracting high-quality (human) resources and students. Such a context demands a deeper understanding of students' decision-making processes regarding HEIs. The literature on higher education management provides a large number of models, which attempt to provide a

  2. Exploring the common molecular basis for the universal DNA mutation bias: Revival of Loewdin mutation model

    International Nuclear Information System (INIS)

    Fu, Liang-Yu; Wang, Guang-Zhong; Ma, Bin-Guang; Zhang, Hong-Yu

    2011-01-01

    Highlights: • There exists a universal G:C → A:T mutation bias in the three domains of life. • This universal mutation bias has not been sufficiently explained. • A DNA mutation model proposed by Loewdin 40 years ago offers a common explanation. Abstract: Recently, numerous genome analyses revealed the existence of a universal G:C → A:T mutation bias in bacteria, fungi, plants and animals. To explore the molecular basis for this mutation bias, we examined the three well-known DNA mutation models, i.e., the oxidative damage model, the UV-radiation damage model and the CpG hypermutation model. It was revealed that these models cannot provide a sufficient explanation of the universal mutation bias. We therefore resorted to a DNA mutation model proposed by Loewdin 40 years ago, which is based on inter-base double proton transfers (DPT). Since DPT is a fundamental and spontaneous chemical process and occurs much more frequently within GC pairs than AT pairs, the Loewdin model offers a common explanation for the observed universal mutation bias and thus has broad biological implications.

  3. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    Science.gov (United States)

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
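    The multinomial step mentioned above can be sketched in a few lines: the bacterial cells in a serving are partitioned among strain classes, only some of which are enterotoxigenic. The strain prevalences below are illustrative placeholders, not the study's fitted distributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed strain classes: [enterotoxin A producers, other toxigenic,
    # non-toxigenic], with placeholder prevalences summing to 1.
    strain_prev = np.array([0.15, 0.25, 0.60])

    def enterotoxigenic_cells(total_cells, n_servings):
        """Multinomial allocation of cells to strain classes per serving;
        returns the count of enterotoxin-A-producing cells in each."""
        draws = rng.multinomial(total_cells, strain_prev, size=n_servings)
        return draws[:, 0]

    print(enterotoxigenic_cells(total_cells=10_000, n_servings=5))
    ```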

  4. Non-universal spreading exponents in a catalytic reaction model

    International Nuclear Information System (INIS)

    De Andrade, Marcelo F; Figueiredo, W

    2011-01-01

    We investigated the dependence of the spreading critical exponents and the ultimate survival probability exponent on the initial configuration of a nonequilibrium catalytic reaction model. The model considers the competitive reactions between two different monomers, A and B, where we take into account the energy couplings between nearest-neighbor monomers, and the adsorption energies, as well as the temperature T of the catalyst. For each value of T the model shows distinct absorbing states, with different concentrations of the two monomers. Employing an epidemic analysis, we established the behavior of the spreading exponents as we started the Monte Carlo simulations with different concentrations of the monomers. The exponents were determined as a function of the initial concentration ρ_A,ini of A monomers. We have also considered initial configurations with correlations for a fixed concentration of A monomers. From the determination of three spreading exponents, and the ultimate survival probability exponent, we checked the validity of the generalized hyperscaling relation for a continuous set of initial states, random and correlated, which are dependent on the temperature of the catalyst.

  5. Environmental determinants of tropical forest and savanna distribution: A quantitative model evaluation and its implication

    Science.gov (United States)

    Zeng, Zhenzhong; Chen, Anping; Piao, Shilong; Rabin, Sam; Shen, Zehao

    2014-07-01

    The distributions of tropical ecosystems are rapidly being altered by climate change and anthropogenic activities. One possible trend—the loss of tropical forests and replacement by savannas—could result in significant shifts in ecosystem services and biodiversity loss. However, the influence and the relative importance of environmental factors in regulating the distribution of tropical forest and savanna biomes are still poorly understood, which makes it difficult to predict future tropical forest and savanna distributions in the context of climate change. Here we use boosted regression trees to quantitatively evaluate the importance of environmental predictors—mainly climatic, edaphic, and fire factors—for the tropical forest-savanna distribution at a mesoscale across the tropics (between 15°N and 35°S). Our results demonstrate that climate alone can explain most of the distribution of tropical forest and savanna at the scale considered; dry season average precipitation is the single most important determinant across tropical Asia-Australia, Africa, and South America. Given the strong tendency of increased seasonality and decreased dry season precipitation predicted by global climate models, we estimate that about 28% of what is now tropical forest would likely be lost to savanna by the late 21st century under the future scenario considered. This study highlights the importance of climate seasonality and interannual variability in predicting the distribution of tropical forest and savanna, supporting the climate as the primary driver in the savanna biogeography.
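    For readers unfamiliar with boosted regression trees, the sketch below shows the basic workflow of fitting one and reading off relative predictor importance. The data are synthetic stand-ins generated so that dry-season precipitation carries most of the signal, mirroring the paper's headline result; none of the numbers come from the study.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(2)

    # Synthetic grid cells with three placeholder predictors
    n = 2000
    dry_precip = rng.uniform(0, 300, n)        # dry-season precipitation (mm)
    fire_freq  = rng.uniform(0, 1, n)          # fire frequency index
    soil_sand  = rng.uniform(0, 100, n)        # edaphic variable (noise here)
    p_forest = 1 / (1 + np.exp(-(dry_precip - 120) / 25 - 0.5 * fire_freq))
    is_forest = rng.random(n) < p_forest       # forest (True) vs savanna (False)

    X = np.column_stack([dry_precip, fire_freq, soil_sand])
    brt = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                     learning_rate=0.05).fit(X, is_forest)
    for name, imp in zip(["dry_precip", "fire_freq", "soil_sand"],
                         brt.feature_importances_):
        print(f"{name:10s} relative importance = {imp:.2f}")
    ```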

  6. Model-independent quantitative measurement of nanomechanical oscillator vibrations using electron-microscope linescans

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Huan; Fenton, J. C.; Chiatti, O. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Warburton, P. A. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Department of Electronic and Electrical Engineering, University College London, Torrington Place, London WC1E 7JE (United Kingdom)

    2013-07-15

    Nanoscale mechanical resonators are highly sensitive devices and, therefore, for application as highly sensitive mass balances, they are potentially superior to micromachined cantilevers. The absolute measurement of nanoscale displacements of such resonators remains a challenge, however, since the optical signal reflected from a cantilever whose dimensions are sub-wavelength is at best very weak. We describe a technique for quantitative analysis and fitting of scanning-electron microscope (SEM) linescans across a cantilever resonator, involving deconvolution from the vibrating resonator profile using the stationary resonator profile. This enables determination of the absolute amplitude of nanomechanical cantilever oscillations even when the oscillation amplitude is much smaller than the cantilever width. This technique is independent of any model of secondary-electron emission from the resonator and is, therefore, applicable to resonators with arbitrary geometry and material inhomogeneity. We demonstrate the technique using focussed-ion-beam–deposited tungsten cantilevers of radius ∼60–170 nm inside a field-emission SEM, with excitation of the cantilever by a piezoelectric actuator allowing measurement of the full frequency response. Oscillation amplitudes approaching the size of the primary electron-beam can be resolved. We further show that the optimum electron-beam scan speed is determined by a compromise between deflection of the cantilever at low scan speeds and limited spatial resolution at high scan speeds. Our technique will be an important tool for use in precise characterization of nanomechanical resonator devices.
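    The deconvolution idea can be illustrated compactly: for sinusoidal motion, the vibrating-cantilever linescan is the stationary linescan convolved with the oscillator's time-averaged position density, so the amplitude can be recovered by a one-parameter fit. The sketch below assumes sinusoidal motion and a Gaussian stand-in for the stationary profile; it is a simplification of the paper's procedure, which works directly from measured profiles.

    ```python
    import numpy as np

    def sinusoid_kernel(amp, dx, n=201):
        """Time-averaged position density of sinusoidal motion of amplitude
        amp: p(x) = 1 / (pi * sqrt(amp^2 - x^2)), sampled on the scan grid."""
        x = (np.arange(n) - n // 2) * dx
        p = np.zeros(n)
        inside = np.abs(x) < amp
        p[inside] = 1.0 / (np.pi * np.sqrt(amp**2 - x[inside] ** 2))
        return p / p.sum()

    def fit_amplitude(static_scan, vibrating_scan, dx, trial_amps):
        """Pick the amplitude whose convolution with the stationary linescan
        best reproduces the vibrating linescan (least squares)."""
        errs = [np.sum((np.convolve(static_scan, sinusoid_kernel(a, dx),
                                    mode="same") - vibrating_scan) ** 2)
                for a in trial_amps]
        return trial_amps[int(np.argmin(errs))]

    # Synthetic demonstration: ~100 nm wide profile, 30 nm oscillation
    dx = 2.0                                          # nm per pixel
    x = np.arange(-400, 400, dx)
    static = np.exp(-x**2 / (2 * 50.0**2))            # stand-in stationary scan
    vibrating = np.convolve(static, sinusoid_kernel(30.0, dx), mode="same")
    amps = np.arange(5.0, 60.0, 1.0)
    print(fit_amplitude(static, vibrating, dx, amps))  # recovers ~30 nm
    ```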

  7. Quantitative analysis of aqueous phase composition of model dentin adhesives experiencing phase separation

    Science.gov (United States)

    Ye, Qiang; Park, Jonggu; Parthasarathy, Ranganathan; Pamatmat, Francis; Misra, Anil; Laurence, Jennifer S.; Marangos, Orestes; Spencer, Paulette

    2013-01-01

    There have been reports of the sensitivity of our current dentin adhesives to excess moisture, for example, water-blisters in adhesives placed on over-wet surfaces, and phase separation with concomitant limited infiltration of the critical dimethacrylate component into the demineralized dentin matrix. To determine quantitatively the hydrophobic/hydrophilic components in the aqueous phase when exposed to over-wet environments, model adhesives were mixed with 16, 33, and 50 wt % water to yield well-separated phases. Based upon high-performance liquid chromatography coupled with photodiode array detection, it was found that the amounts of hydrophobic BisGMA and hydrophobic initiators are less than 0.1 wt % in the aqueous phase. The amount of these compounds decreased with an increase in the initial water content. The major components of the aqueous phase were hydroxyethyl methacrylate (HEMA) and water, and the HEMA content ranged from 18.3 to 14.7 wt %. Different BisGMA homologues and the relative content of these homologues in the aqueous phase have been identified; however, the amount of crosslinkable BisGMA was minimal and, thus, could not help in the formation of a crosslinked polymer network in the aqueous phase. Without the protection afforded by a strong crosslinked network, the poorly photoreactive compounds of this aqueous phase could be leached easily. These results suggest that adhesive formulations should be designed to include hydrophilic multimethacrylate monomers and water compatible initiators. PMID:22331596

  8. Microsegregation in multicomponent alloy analysed by quantitative phase-field model

    International Nuclear Information System (INIS)

    Ohno, M; Takaki, T; Shibuta, Y

    2015-01-01

    Microsegregation behaviour in a ternary alloy system has been analysed by means of quantitative phase-field (Q-PF) simulations, with particular attention to the influence of tie-line shift stemming from the different liquid diffusivities of the solute elements. The Q-PF model developed for non-isothermal solidification in multicomponent alloys with non-zero solid diffusivities was applied to the analysis of microsegregation in a ternary alloy consisting of fast and slow diffusing solute elements. The accuracy of the Q-PF simulation was first verified by performing a convergence test of the segregation ratio with respect to the interface thickness. From one-dimensional analysis, it was found that the microsegregation of the slow diffusing element is reduced due to the tie-line shift. In two-dimensional simulations, a refinement of the microstructure, viz., a decrease of the secondary arm spacing, occurs at low cooling rates due to the formation of a diffusion layer of the slow diffusing element. This yields reductions in the degree of microsegregation for both the fast and slow diffusing elements. Importantly, over a wide range of cooling rates, the degree of microsegregation of the slow diffusing element is always lower than that of the fast diffusing element, which is entirely ascribable to the influence of tie-line shift. (paper)

  9. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    International Nuclear Information System (INIS)

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-01-01

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  10. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    International Nuclear Information System (INIS)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel; Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael; Hakimi, Ahmad R.

    2012-01-01

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany). The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)
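
    For concreteness, the HU-threshold bookkeeping described above can be sketched in a few lines; the synthetic volume, voxel size, and random values below are placeholders, with only the -100/500/3,000 HU thresholds taken from the text:

        import numpy as np

        rng = np.random.default_rng(1)
        hu = rng.uniform(-200, 1200, size=(64, 64, 64))  # toy HU volume of the defect
        voxel_ml = 0.5 * 0.5 * 1.0 / 1000.0              # assumed voxel volume in mL

        defect = (hu >= -100) & (hu <= 3000)             # entire defect
        consolidated = (hu >= 500) & (hu <= 3000)        # osseous consolidation

        total_ml = defect.sum() * voxel_ml
        consol_ml = consolidated.sum() * voxel_ml
        print(f"defect volume: {total_ml:.1f} mL, "
              f"consolidation: {100 * consol_ml / total_ml:.0f}%")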

  11. Quantitative assessment of bone defect healing by multidetector CT in a pig model

    Energy Technology Data Exchange (ETDEWEB)

    Riegger, Carolin; Kroepil, Patric; Lanzman, Rotem S.; Miese, Falk R.; Antoch, Gerald; Scherer, Axel [University Duesseldorf, Medical Faculty, Department of Diagnostic and Interventional Radiology, Duesseldorf (Germany); Jungbluth, Pascal; Hakimi, Mohssen; Wild, Michael [University Duesseldorf, Medical Faculty, Department of Traumatology and Hand Surgery, Duesseldorf (Germany); Hakimi, Ahmad R. [University Duesseldorf, Medical Faculty, Department of Oral Surgery, Duesseldorf (Germany)

    2012-05-15

    To evaluate multidetector CT volumetry in the assessment of bone defect healing in comparison to histopathological findings in an animal model. In 16 mini-pigs, a circumscribed tibial bone defect was created. Multidetector CT (MDCT) of the tibia was performed on a 64-row scanner 42 days after the operation. The extent of bone healing was estimated quantitatively by MDCT volumetry using a commercially available software programme (syngo Volume, Siemens, Germany). The volume of the entire defect (including all pixels from -100 to 3,000 HU), the nonconsolidated areas (-100 to 500 HU), and areas of osseous consolidation (500 to 3,000 HU) were assessed and the extent of consolidation was calculated. Histomorphometry served as the reference standard. The extent of osseous consolidation in MDCT volumetry ranged from 19 to 92% (mean 65.4 ± 18.5%). There was a significant correlation between histologically visible newly formed bone and the extent of osseous consolidation on MDCT volumetry (r = 0.82, P < 0.0001). A significant negative correlation was detected between osseous consolidation on MDCT and histological areas of persisting defect (r = -0.9, P < 0.0001). MDCT volumetry is a promising tool for noninvasive monitoring of bone healing, showing excellent correlation with histomorphometry. (orig.)

  12. A two-locus model of spatially varying stabilizing or directional selection on a quantitative trait.

    Science.gov (United States)

    Geroldinger, Ludwig; Bürger, Reinhard

    2014-06-01

    The consequences of spatially varying, stabilizing or directional selection on a quantitative trait in a subdivided population are studied. A deterministic two-locus two-deme model is employed to explore the effects of migration, the degree of divergent selection, and the genetic architecture, i.e., the recombination rate and ratio of locus effects, on the maintenance of genetic variation. The possible equilibrium configurations are determined as functions of the migration rate. They depend crucially on the strength of divergent selection and the genetic architecture. The maximum migration rates below which a stable fully polymorphic equilibrium or a stable single-locus polymorphism can exist are investigated. Under stabilizing selection, but with different optima in the demes, strong recombination may facilitate the maintenance of polymorphism. Usually, however, and in particular with directional selection in opposite directions, the critical migration rates are maximized by a concentrated genetic architecture, i.e., by a major locus and a tightly linked minor one. Thus, complementing previous work on the evolution of genetic architectures in subdivided populations subject to diversifying selection, it is shown that concentrated architectures may aid the maintenance of polymorphism. Conditions are obtained for when this is the case. Finally, the dependence of the phenotypic variance, linkage disequilibrium, and various measures of local adaptation and differentiation on the parameters is elaborated. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  13. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    Science.gov (United States)

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through the importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan’s current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk that a rabid animal would penetrate current border control measures and enter Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a substantial risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
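
    A stochastic import-risk model of this general type is easy to sketch; the prevalence distribution, import volume, and quarantine failure probability below are invented placeholders, not the study's estimates:

        import numpy as np

        rng = np.random.default_rng(2)
        n_iter = 100_000                          # Monte Carlo iterations
        imports_per_year = 10_000                 # hypothetical imported cats and dogs
        p_infected = rng.beta(1, 50_000, n_iter)  # uncertain prevalence at origin
        p_slip_quarantine = 0.01                  # assumed chance quarantine misses a case

        # Rabid animals entering despite controls, per simulated year.
        entered = rng.binomial(imports_per_year, p_infected * p_slip_quarantine)
        print(f"median introductions/year: {np.median(entered):.0f}; "
              f"P(at least one) = {(entered > 0).mean():.4f}")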

  14. Analysis of genetic effects of nuclear-cytoplasmic interaction on quantitative traits: genetic model for diploid plants.

    Science.gov (United States)

    Han, Lide; Yang, Jian; Zhu, Jun

    2007-06-01

    A genetic model was proposed for simultaneously analyzing genetic effects of nuclear, cytoplasm, and nuclear-cytoplasmic interaction (NCI) as well as their genotype by environment (GE) interaction for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.

  15. ONLINE MODEL OF EDUCATION QUALITY ASSURANCE EQUASP IMPLEMENTATION: EXPERIENCE OF VYATKA STATE UNIVERSITY

    Directory of Open Access Journals (Sweden)

    Valentin Pugach

    2015-10-01

    The article is devoted to the problem of assessing the quality of higher education. In the Russian Federation, the quality of educational services provided by state-accredited universities has recently been assessed by the state, represented by the Ministry of Education and Science. State universities have developed internal systems of education quality assessment in accordance with the methodology proposed by the Ministry of Education and Science. Currently, more attention is paid to independent assessment of education quality, which is the basis of professional public accreditation. The EQUASP project, financed within the framework of the TEMPUS programme, addresses the implementation of an online model of independent higher education quality assessment in the practice of Russian universities. The proposed model for assessing the quality of education is based on the use of five standards. The authors carried out a comparative analysis of the model of higher education quality assessment existing in Vyatka State University and the model of education quality assessment offered by the European universities participating in the EQUASP project. The authors present the main results of their investigation of this problem and some suggestions for improving the model of education quality assessment used by Vyatka State University.

  16. Generalized cardassian expansion: a model in which the universe is flat, matter dominated, and accelerating

    International Nuclear Information System (INIS)

    Freese, Katherine

    2003-01-01

    The Cardassian universe is a proposed modification to the Friedmann-Robertson-Walker (FRW) equation in which the universe is flat, matter dominated, and accelerating. In this presentation, we generalize the original Cardassian proposal to include additional variants on the FRW equation, and specific examples are presented. In the ordinary FRW equation, the right hand side is a linear function of the energy density, H² ∼ ρ. Here, instead, the right hand side of the FRW equation is a different function of the energy density, H² ∼ g(ρ). This function returns to ordinary FRW at early times, but modifies the expansion at a late epoch of the universe. The only ingredients in this universe are matter and radiation: in particular, there is NO vacuum contribution. Currently the modification of the FRW equation is such that the universe accelerates; we call this period of acceleration the Cardassian era. The universe can be flat and yet consist of only matter and radiation, and still be compatible with observations. The energy density required to close the universe is much smaller than in a standard cosmology, so that matter can be sufficient to provide a flat geometry. The new term required may arise, e.g., as a consequence of our observable universe living as a 3-dimensional brane in a higher dimensional universe. The Cardassian model survives several observational tests, including the cosmic background radiation, the age of the universe, and structure formation. As will be shown in future work, the predictions for observational tests of the generalized Cardassian models can be very different from generic quintessence models, whether the equation of state is constant or time dependent
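
    A back-of-envelope check of the claim (not from the paper): take the simplest Cardassian form H² = Aρ + Bρⁿ with matter-only ρ ∝ a⁻³. The extra term behaves like a fluid with effective equation of state w = n − 1, so for n < 2/3 the expansion decelerates early and accelerates late; the constants below are arbitrary:

        import numpy as np

        A, B, n = 1.0, 0.2, 0.3        # illustrative constants, n < 2/3
        a = np.logspace(-1, 1, 400)    # scale factor
        rho = a ** -3                  # matter-only density (arbitrary units)

        # For H^2 = sum_i C_i a^{-3(1+w_i)}, the acceleration equation gives
        # addot/a = -0.5 * sum_i (1 + 3 w_i) C_i a^{-3(1+w_i)}; here w = 0 for
        # matter and w = n - 1 for the Cardassian term.
        accel = -0.5 * (A * rho + (3 * n - 2) * B * rho ** n)
        print("acceleration begins near a =", round(a[np.argmax(accel > 0)], 2))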

  17. Animal Models for Influenza Viruses: Implications for Universal Vaccine Development

    Directory of Open Access Journals (Sweden)

    Irina Margine

    2014-10-01

    Influenza virus infections are a significant cause of morbidity and mortality in the human population. Depending on the virulence of the influenza virus strain, as well as the immunological status of the infected individual, the severity of the respiratory disease may range from sub-clinical or mild symptoms to severe pneumonia that can sometimes lead to death. Vaccines remain the primary public health measure in reducing the influenza burden. Though the first influenza vaccine preparation was licensed more than 60 years ago, current research efforts seek to develop novel vaccination strategies with improved immunogenicity, effectiveness, and breadth of protection. Animal models of influenza have been essential in facilitating studies aimed at understanding viral factors that affect pathogenesis and contribute to disease or transmission. Among others, mice, ferrets, pigs, and nonhuman primates have been used to study influenza virus infection in vivo, as well as to do pre-clinical testing of novel vaccine approaches. Here we discuss and compare the unique advantages and limitations of each model.

  18. Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions

    Science.gov (United States)

    Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. In an effort to improve the power of detecting differences in this validation study, the total over each program, ISS and STS, will serve as the main quantitative comparison objective, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: 1) where each point reflects a mission and 2) where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R²) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between IMM and the real-world system (RWS). In an effort to identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false positive signals resulting from random variation. The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for
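
    The agreement metric described, regression through the origin with an uncentred R², takes only a few lines; the observed/predicted pairs here are placeholders, not IMM output:

        import numpy as np

        predicted = np.array([12.0, 30.0, 7.0, 19.0, 44.0])  # hypothetical medians
        observed = np.array([10.0, 33.0, 9.0, 17.0, 40.0])   # hypothetical counts

        beta = (predicted @ observed) / (predicted @ predicted)  # zero-intercept slope
        resid = observed - beta * predicted
        r2 = 1.0 - (resid @ resid) / (observed @ observed)  # uncentred R^2 for a
                                                            # no-intercept model
        print(f"slope = {beta:.2f}, R^2 = {r2:.2f}")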

  19. Foundations for quantitative microstructural models to track evolution of the metallurgical state during high purity Nb cavity fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, Thomas R [Michigan State University; Wright, Neil T [Michigan State University; Compton, Chris C [Facility for Rare Isotope Beams

    2014-03-15

    The goal of the Materials Science SRF Cavity Group of Michigan State University and the National Superconducting Cyclotron Laboratory has been (and continues to be) to understand quantitatively the effects of process history on functional properties. These relationships were assessed via studies on Nb samples and cavity parts, which had various combinations of forming processes, welding, heat treatments, and surface preparation. A primary focus was on large-grain cavity building strategies. Effects of processing operations and exposure to hydrogen on the thermal conductivity have been identified in single and bi-crystal samples, showing that the thermal conductivity can be altered by a factor of 5 depending on process history. Characterization of single crystal tensile samples shows a strong effect of crystal orientation on deformation resistance and shape changes. Large grain half cells were examined to characterize defect content and surface damage effects, which provided quantitative information about the depth of damage layers introduced by forming.

  20. MODIS volcanic ash retrievals vs FALL3D transport model: a quantitative comparison

    Science.gov (United States)

    Corradini, S.; Merucci, L.; Folch, A.

    2010-12-01

    Satellite retrievals and transport models represent the key tools to monitor the evolution of volcanic clouds. Because of the harmful effects of fine ash particles on aircraft, real-time tracking and forecasting of volcanic clouds is key for aviation safety. Alongside these safety reasons, the economic consequences of airport disruptions must also be taken into account. The airport closures due to the recent Icelandic Eyjafjöll eruption caused millions of passengers to be stranded not only in Europe, but across the world. IATA (the International Air Transport Association) estimates that the worldwide airline industry lost a total of about 2.5 billion euros during the disruption. Both safety and economic issues require reliable and robust ash cloud retrievals and trajectory forecasting. Intercomparison between remote sensing and modeling is required to assure precise and reliable volcanic ash products. In this work we perform a quantitative comparison between Moderate Resolution Imaging Spectroradiometer (MODIS) retrievals of volcanic ash cloud mass and Aerosol Optical Depth (AOD) and the FALL3D ash dispersal model. MODIS, aboard the NASA-Terra and NASA-Aqua polar satellites, is a multispectral instrument with 36 spectral bands operating in the VIS-TIR spectral range and spatial resolution varying between 250 and 1000 m at nadir. The MODIS channels centered around 11 and 12 microns have been used for the ash retrievals through the Brightness Temperature Difference algorithm and MODTRAN simulations. FALL3D is a 3-D time-dependent Eulerian model for the transport and deposition of volcanic particles that outputs, among other variables, cloud column mass and AOD. Three MODIS images collected on 28, 29, and 30 October over Mt. Etna during the 2002 eruption have been considered as test cases. The results show a general good agreement between the retrieved and the modeled volcanic clouds in the first 300 km from the vents. Even if the
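
    The core of the Brightness Temperature Difference test is simple enough to sketch: silicate ash makes BT(11 μm) − BT(12 μm) negative, the opposite of meteorological cloud. The scene values and threshold below are illustrative only:

        import numpy as np

        bt11 = np.array([[265.0, 270.0], [255.0, 268.0]])  # toy 11-um scene (K)
        bt12 = np.array([[268.0, 269.0], [259.0, 267.0]])  # toy 12-um scene (K)

        btd = bt11 - bt12
        ash_mask = btd < -0.5      # assumed detection threshold (K)
        print("BTD:", btd.ravel(), "ash pixels:", ash_mask.ravel())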

  1. A Quantitative Human Spacecraft Design Evaluation Model for Assessing Crew Accommodation and Utilization

    Science.gov (United States)

    Fanchiang, Christine

    Crew performance, including both accommodation and utilization factors, is an integral part of every human spaceflight mission, from commercial space tourism to the demanding journey to Mars and beyond. Spacecraft were historically built by engineers and technologists trying to adapt the vehicle into cutting-edge rocketry, with the assumption that the astronauts could be trained and would adapt to the design. By and large, that is still the current state of the art. It is recognized, however, that poor human-machine design integration can lead to catastrophic and deadly mishaps. The premise of this work relies on the idea that if an accurate predictive model exists to forecast crew performance issues as a result of spacecraft design and operations, it can help designers and managers make better decisions throughout the design process, and ensure that the crewmembers are well integrated with the system from the very start. The result should be a high-quality, user-friendly spacecraft that optimizes the utilization of the crew while keeping them alive, healthy, and happy during the course of the mission. Therefore, the goal of this work was to develop an integrative framework to quantitatively evaluate a spacecraft design from the crew performance perspective. The approach presented here is done at a very fundamental level, starting with identifying and defining basic terminology, and then builds up important axioms of human spaceflight that lay the foundation for how such a framework can be developed. With the framework established, a methodology for characterizing the outcome using a mathematical model was developed by pulling from existing metrics and data collected on human performance in space. Representative test scenarios were run to show what information could be garnered and how it could be applied as a useful, understandable metric for future spacecraft design. While the model is the primary tangible product from this research, the more interesting outcome of

  2. Quantitative acid-base physiology using the Stewart model. Does it improve our understanding of what is really wrong?

    NARCIS (Netherlands)

    Derksen, R.; Scheffer, G.J.; Hoeven, J.G. van der

    2006-01-01

    Traditional theories of acid-base balance are based on the Henderson-Hasselbalch equation to calculate proton concentration. The recent revival of quantitative acid-base physiology using the Stewart model has increased our understanding of complicated acid-base disorders, but has also led to several
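
    One Stewart-model quantity, the apparent strong ion difference (SIDa), illustrates the quantitative flavour of the approach; the formula and reference values below are standard textbook numbers, not taken from the article:

        def sid_apparent(na, k, ca, mg, cl, lactate):
            """All concentrations in mEq/L; SIDa is roughly 40 mEq/L in health."""
            return (na + k + ca + mg) - (cl + lactate)

        print(sid_apparent(na=140, k=4.0, ca=2.5, mg=1.5, cl=104, lactate=1.0))
        # -> 43.0; a low SIDa points to a metabolic acidosis in Stewart terms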

  3. Physically based dynamic run-out modelling for quantitative debris flow risk assessment: a case study in Tresenda, northern Italy

    Czech Academy of Sciences Publication Activity Database

    Quan Luna, B.; Blahůt, Jan; Camera, C.; Van Westen, C.; Apuani, T.; Jetten, V.; Sterlacchini, S.

    2014-01-01

    Vol. 72, No. 3 (2014), pp. 645-661 ISSN 1866-6280 Institutional support: RVO:67985891 Keywords: debris flow * FLO-2D * run-out * quantitative hazard and risk assessment * vulnerability * numerical modelling Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.765, year: 2014

  4. A Quantitative Study of Faculty Perceptions and Attitudes on Asynchronous Virtual Teamwork Using the Technology Acceptance Model

    Science.gov (United States)

    Wolusky, G. Anthony

    2016-01-01

    This quantitative study used a web-based questionnaire to assess the attitudes and perceptions of online and hybrid faculty towards student-centered asynchronous virtual teamwork (AVT) using the technology acceptance model (TAM) of Davis (1989). AVT is online student participation in a team approach to problem-solving culminating in a written…

  5. Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships (MCBIOS)

    Science.gov (United States)

    Comparative Analysis of Predictive Models for Liver Toxicity Using ToxCast Assays and Quantitative Structure-Activity Relationships Jie Liu1,2, Richard Judson1, Matthew T. Martin1, Huixiao Hong3, Imran Shah1 1National Center for Computational Toxicology (NCCT), US EPA, RTP, NC...

  6. Quantitative predictions from competition theory with incomplete information on model parameters tested against experiments across diverse taxa

    OpenAIRE

    Fort, Hugo

    2017-01-01

    We derive an analytical approximation for making quantitative predictions for ecological communities as a function of the mean intensity of the inter-specific competition and the species richness. This method, with only a fraction of the model parameters (carrying capacities and competition coefficients), is able to accurately predict empirical measurements covering a wide variety of taxa (algae, plants, protozoa).
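
    The flavour of such predictions can be sketched with the classical Lotka-Volterra competition model, where the interior equilibrium solves K_i = sum_j alpha_ij N_j; the carrying capacities and competition coefficients below are invented, not the paper's data:

        import numpy as np

        K = np.array([100.0, 80.0, 60.0])      # carrying capacities
        alpha = np.array([[1.00, 0.30, 0.20],  # alpha[i][j]: effect of j on i
                          [0.30, 1.00, 0.25],
                          [0.20, 0.25, 1.00]])

        # Interior equilibrium: alpha @ N* = K  =>  N* = alpha^-1 K.
        N_star = np.linalg.solve(alpha, K)
        print("predicted equilibrium abundances:", np.round(N_star, 1))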

  7. Using ISOS consensus test protocols for development of quantitative life test models in ageing of organic solar cells

    DEFF Research Database (Denmark)

    Kettle, J.; Stoichkov, V.; Kumar, D.

    2017-01-01

    As Organic Photovoltaic (OPV) development matures, the demand grows for rapid characterisation of degradation and application of Quantitative Accelerated Life Tests (QALT) models to predict and improve reliability. To date, most accelerated testing on OPVs has been conducted using ISOS consensus...

  8. Quantitative trait loci affecting phenotypic variation in the vacuolated lens mouse mutant, a multigenic mouse model of neural tube defects

    NARCIS (Netherlands)

    Korstanje, Ron; Desai, Jigar; Lazar, Gloria; King, Benjamin; Rollins, Jarod; Spurr, Melissa; Joseph, Jamie; Kadambi, Sindhuja; Li, Yang; Cherry, Allison; Matteson, Paul G.; Paigen, Beverly; Millonig, James H.

    Korstanje R, Desai J, Lazar G, King B, Rollins J, Spurr M, Joseph J, Kadambi S, Li Y, Cherry A, Matteson PG, Paigen B, Millonig JH. Quantitative trait loci affecting phenotypic variation in the vacuolated lens mouse mutant, a multigenic mouse model of neural tube defects. Physiol Genomics 35:

  9. Model development for quantitative evaluation of nuclear fuel cycle alternatives and its application

    International Nuclear Information System (INIS)

    Ko, Won Il

    2000-02-01

    This study addresses the quantitative evaluation of proliferation resistance and economics, which are important factors of alternative nuclear fuel cycle systems. A model was developed to quantitatively evaluate the proliferation resistance of nuclear fuel cycles, and a fuel cycle cost analysis model was suggested to incorporate various uncertainties in the fuel cycle cost calculation. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of the future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, a proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. In this model, the proliferation resistance is described as the relative size of the barrier that must be overcome in order to acquire nuclear weapons. A larger barrier therefore means that the risk of failure is greater, the expenditure of resources larger, and the time scale for implementation longer. The electromotive force represents the political motivation of potential proliferators, such as an unauthorized party or a national group, to acquire nuclear weapons. The electrical current is then defined as the proliferation resistance index. Two electrical circuit models are used in the evaluation of the proliferation resistance: the series and the parallel circuits. In the series circuit model, a potential proliferator has to overcome all resistance barriers to manufacture a nuclear weapon. This corresponds to the fact that the IAEA (International Atomic Energy Agency) safeguards philosophy relies on the defense-in-depth principle against nuclear proliferation at a specific facility. The parallel circuit model was also used to imitate the risk of proliferation for
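
    The circuit analogy reduces to Ohm's law; a toy rendering of the series and parallel cases (values illustrative, not the study's):

        def index_series(emf, barriers):
            """All barriers must be overcome in turn (defense in depth)."""
            return emf / sum(barriers)

        def index_parallel(emf, barriers):
            """Alternative acquisition paths in parallel; conductances add."""
            return emf * sum(1.0 / r for r in barriers)

        barriers = [4.0, 2.5, 6.0]   # e.g. safeguards, material quality, access
        print(index_series(1.0, barriers), index_parallel(1.0, barriers))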

  10. Quantitative modeling of clinical, cellular, and extracellular matrix variables suggest prognostic indicators in cancer: a model in neuroblastoma.

    Science.gov (United States)

    Tadeo, Irene; Piqueras, Marta; Montaner, David; Villamón, Eva; Berbegall, Ana P; Cañete, Adela; Navarro, Samuel; Noguera, Rosa

    2014-02-01

    Risk classification and treatment stratification for cancer patients is restricted by our incomplete picture of the complex and unknown interactions between the patient's organism and tumor tissues (transformed cells supported by tumor stroma). Moreover, all clinical factors and laboratory studies used to indicate treatment effectiveness and outcomes are by their nature a simplification of the biological system of cancer, and cannot yet incorporate all possible prognostic indicators. A multiparametric analysis on 184 tumor cylinders was performed. To highlight the benefit of integrating digitized medical imaging into this field, we present the results of computational studies carried out on quantitative measurements, taken from stromal and cancer cells and various extracellular matrix fibers interpenetrated by glycosaminoglycans, and eight current approaches to risk stratification systems in patients with primary and nonprimary neuroblastoma. New tumor tissue indicators from both fields, the cellular and the extracellular elements, emerge as reliable prognostic markers for risk stratification and could be used as molecular targets of specific therapies. The key to dealing with personalized therapy lies in the mathematical modeling. The use of bioinformatics in patient-tumor-microenvironment data management allows a predictive model in neuroblastoma.

  11. Toward University Modeling Instruction—Biology: Adapting Curricular Frameworks from Physics to Biology

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-01-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence. PMID:23737628

  12. Toward university modeling instruction--biology: adapting curricular frameworks from physics to biology.

    Science.gov (United States)

    Manthey, Seth; Brewe, Eric

    2013-06-01

    University Modeling Instruction (UMI) is an approach to curriculum and pedagogy that focuses instruction on engaging students in building, validating, and deploying scientific models. Modeling Instruction has been successfully implemented in both high school and university physics courses. Studies within the physics education research (PER) community have identified UMI's positive impacts on learning gains, equity, attitudinal shifts, and self-efficacy. While the success of this pedagogical approach has been recognized within the physics community, the use of models and modeling practices is still being developed for biology. Drawing from the existing research on UMI in physics, we describe the theoretical foundations of UMI and how UMI can be adapted to include an emphasis on models and modeling for undergraduate introductory biology courses. In particular, we discuss our ongoing work to develop a framework for the first semester of a two-semester introductory biology course sequence by identifying the essential basic models for an introductory biology course sequence.

  13. Evaluating the impact of strategic personnel policies using a MILP model: The public university case

    Energy Technology Data Exchange (ETDEWEB)

    Torre, R. de la; Lusa, A.; Mateo, M.

    2016-07-01

    Purpose: The main purpose of the paper is to evaluate the impact of diverse personnel policies around personnel promotion in the design of the strategic staff plan for a public university. Strategic staff planning consists in the determination of the size and composition of the workforce for an organization. Design/methodology/approach: The staff planning is solved using a Mixed Integer Linear Programming (MILP) model. The MILP model represents the organizational structure of the university, the personnel categories and capacity decisions, the demand requirements, the required service level, and budget restrictions. All these aspects are translated into a set of data, as well as the parameters and constraints building up the mathematical model for optimization. The required data for the model are adopted from a Spanish public university. Findings: The development of appropriate policies for personnel promotion can effectively reduce the number of dismissals while proposing a transition towards different preferable workforce structures in the university. Research limitations/implications: The long-term staff plan for the university is solved by the MILP model considering a time horizon of 8 years. For this time horizon, the required input data are derived from current data of the university. Different scenarios are proposed considering different temporal trends for the input data, such as in demand and admissible promotion ratios for workers. Originality/value: The literature review reports a lack of formalized procedures for staff planning in universities that take into account, at the same time, the regulations on hiring, dismissals, promotions, and workforce heterogeneity, all considered to optimize workforce size and composition addressing not only economic criteria, but also the required workforce expertise and the quality of the service offered. This paper adopts a formalized procedure developed by the authors in previous works, and exploits it to assess the
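
    A deliberately tiny MILP in the same spirit (not the paper's model) can be written with the PuLP library: two personnel categories over three years, with hires, promotions from junior to senior, demand coverage, a promotion quota, and a budget cap; all coefficients are invented:

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum

        years = range(3)
        prob = LpProblem("staff_plan", LpMinimize)
        jr = {t: LpVariable(f"junior_{t}", lowBound=0, cat="Integer") for t in years}
        sr = {t: LpVariable(f"senior_{t}", lowBound=0, cat="Integer") for t in years}
        hire = {t: LpVariable(f"hire_{t}", lowBound=0, cat="Integer") for t in years}
        promo = {t: LpVariable(f"promo_{t}", lowBound=0, cat="Integer") for t in years}

        demand = [100, 105, 110]                # hypothetical headcount demand
        prob += jr[0] == 90                     # initial juniors
        prob += sr[0] == 10                     # initial seniors
        for t in years:
            prob += jr[t] + sr[t] >= demand[t]          # cover demand
            prob += 30 * jr[t] + 60 * sr[t] <= 6000     # budget cap (kEUR, assumed)
            if t > 0:
                prob += jr[t] == jr[t - 1] + hire[t] - promo[t]  # junior flow
                prob += sr[t] == sr[t - 1] + promo[t]            # senior flow
                prob += promo[t] <= 0.1 * jr[t - 1]              # promotion quota
        prob += lpSum(30 * jr[t] + 60 * sr[t] for t in years)    # minimise payroll
        prob.solve()
        for t in years:
            print(t, jr[t].value(), sr[t].value(), promo[t].value())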

  14. Evaluating the impact of strategic personnel policies using a MILP model: The public university case

    International Nuclear Information System (INIS)

    Torre, R. de la; Lusa, A.; Mateo, M.

    2016-01-01

    Purpose: The main purpose of the paper is to evaluate the impact of diverse personnel policies around personnel promotion in the design of the strategic staff plan for a public university. Strategic staff planning consists in the determination of the size and composition of the workforce for an organization. Design/methodology/approach: The staff planning is solved using a Mixed Integer Linear Programming (MILP) model. The MILP model represents the organizational structure of the university, the personnel categories and capacity decisions, the demand requirements, the required service level, and budget restrictions. All these aspects are translated into a set of data, as well as the parameters and constraints building up the mathematical model for optimization. The required data for the model are adopted from a Spanish public university. Findings: The development of appropriate policies for personnel promotion can effectively reduce the number of dismissals while proposing a transition towards different preferable workforce structures in the university. Research limitations/implications: The long-term staff plan for the university is solved by the MILP model considering a time horizon of 8 years. For this time horizon, the required input data are derived from current data of the university. Different scenarios are proposed considering different temporal trends for the input data, such as in demand and admissible promotion ratios for workers. Originality/value: The literature review reports a lack of formalized procedures for staff planning in universities that take into account, at the same time, the regulations on hiring, dismissals, promotions, and workforce heterogeneity, all considered to optimize workforce size and composition addressing not only economic criteria, but also the required workforce expertise and the quality of the service offered. This paper adopts a formalized procedure developed by the authors in previous works, and exploits it to assess the

  15. Dynamic Universe Model Predicts the Trajectory of New Horizons Satellite Going to Pluto.......

    Science.gov (United States)

    Naga Parameswara Gupta, Satyavarapu

    2012-07-01

    New Horizons is NASA's spacecraft now travelling towards the dwarf planet Pluto. It has crossed Jupiter. It is expected to be the first spacecraft to go near and study Pluto and its moons Charon, Nix, and Hydra. These are the predictions for the New Horizons (NH) spacecraft as of A.D. 2009-Aug-09 00:00:00.0000 hrs. The behaviour of NH is similar to that of the Pioneer spacecraft, as NH's trajectory is alike. NH is supposed to reach Pluto in 2015 AD. A gravity assist was taken at Jupiter about a year earlier. As the Dynamic Universe Model explains the Pioneer anomaly and the higher gravitational attraction experienced towards the Sun, it can explain NH in a similar fashion. Predictions for NH by the Dynamic Universe Model are given in Table 4, where the first two rows give Dynamic Universe Model predictions based on 02-01-2009 00:00 hrs data with daily and hourly time steps, and the third row gives the ephemeris from the Jet Propulsion Lab. The Dynamic Universe Model can predict further to 9-Aug-2009. These ephemeris data are from their web site as of 28 June 2009; any new data can be calculated. For finding the trajectories of the Pioneer satellite (anomaly) and the New Horizons satellite going to Pluto, the calculations of the Dynamic Universe Model can be successfully applied. No dark matter is assumed within the solar system radius. The effect on the masses around the Sun shows as though there were an extra gravitational pull towards the Sun. The model solves the dynamics of extra-solar planets like Planet X and satellites like Pioneer and NH for 3-position, 3-velocity, and 3-acceleration of their masses, considering the complex situation of multiple planets, stars, galaxy parts, galaxy centres, and other galaxies, using simple Newtonian physics. It has already successfully solved the problem of missing mass in galaxies observed in galaxy circular-velocity curves. The 'SITA Simulations' software was developed about 18 years back for the Dynamic Universe Model of cosmology. It is based on Newtonian physics. It is Classical singularity

  16. THE C.A.N.O.A. MODEL - A POSSIBLE IMPLEMENTATION IN ROMANIAN UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    Elena HLACIUC

    2017-06-01

    Globalisation, in addition to its many effects in all areas, creates, as far as higher education is concerned, fierce competition between universities worldwide. This competition requires, as an essential element, that in addition to the services they offer, universities also develop tools to reveal their costs. Academic and financial performance are the two measures of the management of a university. Accounting supports the management of a university through its three facets, which together form the institution's accounting information system: budget implementation accounting, financial accounting, and management accounting. However, while budget implementation and financial accounting are well represented in Romania, the same cannot be said about management accounting. In this paper we analyse a possible application of management accounting in Romanian universities, using the C.A.N.O.A. model, a method that is currently used in Spain.

  17. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    …based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  18. Quantitative Modeling of Membrane Transport and Anisogamy by Small Groups Within a Large-Enrollment Organismal Biology Course

    Directory of Open Access Journals (Sweden)

    Eric S. Haag

    2016-12-01

    Quantitative modeling is not a standard part of undergraduate biology education, yet is routine in the physical sciences. Because of the obvious biophysical aspects, classes in anatomy and physiology offer an opportunity to introduce modeling approaches to the introductory curriculum. Here, we describe two in-class exercises for small groups working within a large-enrollment introductory course in organismal biology. Both build and derive biological insights from quantitative models, implemented using spreadsheets. One exercise models the evolution of anisogamy (i.e., small sperm and large eggs) from an initial state of isogamy. Groups of four students work on Excel spreadsheets (from one to four laptops per group). The other exercise uses an online simulator to generate data related to membrane transport of a solute, and a cloud-based spreadsheet to analyze them. We provide tips for implementing these exercises gleaned from two years of experience.
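
    The membrane-transport exercise translates directly from a spreadsheet into code: passive diffusion obeys dC_in/dt = (P·A/V)(C_out − C_in). A minimal sketch with arbitrary teaching numbers, not the course's simulator values:

        P = 1e-4                 # membrane permeability (cm/s), assumed
        A = 3e-5                 # membrane area (cm^2), assumed
        V = 1e-8                 # cell volume (cm^3), assumed
        c_in, c_out = 0.0, 10.0  # solute concentrations (mM)
        dt = 0.01                # time step (s)

        for _ in range(1000):    # simulate 10 s of influx
            c_in += dt * (P * A / V) * (c_out - c_in)
        print(f"intracellular concentration after 10 s: {c_in:.2f} mM")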

  19. A bibliography of terrain modeling (geomorphometry), the quantitative representation of topography: supplement 4.0

    Science.gov (United States)

    Pike, Richard J.

    2002-01-01

    Terrain modeling, the practice of ground-surface quantification, is an amalgam of Earth science, mathematics, engineering, and computer science. The discipline is known variously as geomorphometry (or simply morphometry), terrain analysis, and quantitative geomorphology. It continues to grow through myriad applications to hydrology, geohazards mapping, tectonics, sea-floor and planetary exploration, and other fields. Dating nominally to the co-founders of academic geography, Alexander von Humboldt (1808, 1817) and Carl Ritter (1826, 1828), the field was revolutionized late in the 20th Century by the computer manipulation of spatial arrays of terrain heights, or digital elevation models (DEMs), which can quantify and portray ground-surface form over large areas (Maune, 2001). Morphometric procedures are implemented routinely by commercial geographic information systems (GIS) as well as specialized software (Harvey and Eash, 1996; Köthe and others, 1996; ESRI, 1997; Drzewiecki et al., 1999; Dikau and Saurer, 1999; Djokic and Maidment, 2000; Wilson and Gallant, 2000; Breuer, 2001; Guth, 2001; Eastman, 2002). The new Earth Surface edition of the Journal of Geophysical Research, specializing in surficial processes, is the latest of many publication venues for terrain modeling. This is the fourth update of a bibliography and introduction to terrain modeling (Pike, 1993, 1995, 1996, 1999) designed to collect the diverse, scattered literature on surface measurement as a resource for the research community. The use of DEMs in science and technology continues to accelerate and diversify (Pike, 2000a). New work appears so frequently that a sampling must suffice to represent the vast literature. This report adds 1636 entries to the 4374 in the four earlier publications. Forty-eight additional entries correct dead Internet links and other errors found in the prior listings. Chronicling the history of terrain modeling, many entries in this report predate the 1999 supplement

  20. Statistical Modeling Approach to Quantitative Analysis of Interobserver Variability in Breast Contouring

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jinzhong, E-mail: jyang4@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Woodward, Wendy A.; Reed, Valerie K.; Strom, Eric A.; Perkins, George H.; Tereffe, Welela; Buchholz, Thomas A. [Department of Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Zhang, Lifei; Balter, Peter; Court, Laurence E. [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin (United States); Dong, Lei [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Scripps Proton Therapy Center, San Diego, California (United States)

    2014-05-01

    Purpose: To develop a new approach for interobserver variability analysis. Methods and Materials: Eight radiation oncologists specializing in breast cancer radiation therapy delineated a patient's left breast “from scratch” and from a template that was generated using deformable image registration. Three of the radiation oncologists had previously received training in the Radiation Therapy Oncology Group (RTOG) consensus contouring atlas for breast cancer. The simultaneous truth and performance level estimation algorithm was applied to the 8 contours delineated “from scratch” to produce a group consensus contour. Individual Jaccard scores were fitted to a beta distribution model. We also applied this analysis to 2 more patients, who were contoured by 9 breast radiation oncologists from 8 institutions. Results: The beta distribution model had a mean of 86.2%, standard deviation (SD) of ±5.9%, a skewness of −0.7, and excess kurtosis of 0.55, exemplifying broad interobserver variability. The 3 RTOG-trained physicians had higher agreement scores than average, indicating that their contours were close to the group consensus contour. One physician had high sensitivity but lower specificity than the others, which implies that this physician tended to contour a structure larger than those of the others. Two other physicians had low sensitivity but specificity similar to the others, which implies that they tended to contour a structure smaller than the others. With this information, they could adjust their contouring practice to be more consistent with others if desired. When contouring from the template, the beta distribution model had a mean of 92.3%, SD ± 3.4%, skewness of −0.79, and excess kurtosis of 0.83, which indicated a much better consistency among individual contours. Similar results were obtained for the analysis of 2 additional patients. Conclusions: The proposed statistical approach was able to measure interobserver variability quantitatively.
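
    Fitting the beta distribution itself is a one-liner by the method of moments; the mean and SD are taken from the text above, but the fit shown is only illustrative, not the study's estimation procedure:

        m, s = 0.862, 0.059          # mean and SD of the Jaccard scores
        common = m * (1 - m) / s ** 2 - 1
        alpha, beta = m * common, (1 - m) * common
        print(f"beta({alpha:.1f}, {beta:.1f})")   # roughly beta(28.6, 4.6)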